Abstract: Disclosed is a system and method (400) for executing video tests over a network connection. The method comprises controlling an application interface to display a work order window including options for creating work orders, and obtaining a set of inputs corresponding to the options in the work order window. Based on the obtained set of inputs, the work orders related to the video test are created and stored in a database. The method further comprises retrieving the work orders from the database at the scheduled start time, and transmitting, over the network connection, the retrieved work orders to client devices (110) to execute the video test. In response to the transmitted work orders, data including execution results of the video test is received from the client devices, and thereafter the method comprises generating a performance report including performance metrics and indicators related to the execution of the video test. FIG. 3
FORM 2
THE PATENTS ACT, 1970 (39 OF 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
SYSTEM AND METHOD FOR REMOTELY EXECUTING VIDEO TESTS OVER USER NETWORK CONNECTIONS
Jio Platforms Limited, an Indian company, having registered address at Office -102, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[0001] The embodiments of the present disclosure generally relate to the field of wireless communications and network performance testing. More particularly, the present disclosure relates to a system and a method for remotely executing video tests on electronic device(s) over a network connection using work orders.
BACKGROUND OF THE INVENTION
[0002] The subject matter disclosed in the background section should not be assumed or construed to be prior art merely because of its mention in the background section. Similarly, any problem statement mentioned in the background section or its association with the subject matter of the background section should not be assumed or construed to have been previously recognized in the prior art.
[0003] In the realm of wireless communication networks, with increased usage of internet and popularity of Over-The-Top (OTT) media content, there is an increase in demand for high-speed internet connectivity. Conventional protocols and methods for evaluating the performance of communication networks faced various challenges and limitations while performing a recipe test such as a video test over a user network connection to assess a Quality of Service (QoS) during peak usage hours. The challenges are more prevalent in regions where users frequently encounter inconsistent network performance and slow internet speeds during the peak usage hours. A poor QoS of network affects user experience and leads to frustration and disappointment towards a service being provided by a network and impacts an overall experience of users in the communication networks.
[0004] Heretofore, conventional methods for evaluating the performance of the communication networks relied on manual or localized recipe tests conducted by individual users. The manual recipe tests are limited in scope and effectiveness, especially when attempting to assess conditions of the communication network across a broad geographic area. Thus, the conventional methods have not proven to be successful in providing comprehensive insights into root causes of degradation in the performance of the communication networks during the peak usage hours or at specific locations.
[0005] Therefore, to overcome aforementioned challenges and limitations associated with the conventional methods, there lies a need for a system and a method that can remotely execute the video test on the user network connection, and address network performance issues effectively.
SUMMARY
[0006] The following embodiments present a simplified summary to provide a basic understanding of some aspects of the disclosed invention. This summary is not an extensive overview, and it is not intended to identify key/critical elements or to delineate the scope thereof. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
[0007] According to an embodiment, a method for executing a video test over a network connection is described. The method comprises controlling, by a display engine, an application interface to display a work order window including a plurality of options for creating at least one work order. Further, the method comprises obtaining, by a reception engine, a set of inputs corresponding to the plurality of options in the work order window. The set of inputs includes information of a scheduled start time for executing the video test and a test duration for executing the video test. Thereafter, the method comprises creating, by a work order management engine based on the obtained set of inputs, the at least one work order related to the video test, and storing, by the work order management engine, the at least one work order in a database. Further, the method comprises retrieving, by the work order management engine, the at least one work order from the database at the scheduled start time, and transmitting, by a transmitting engine over the network connection, the retrieved at least one work order to one or more client devices to execute the video test. Furthermore, the method comprises receiving, by the reception engine, data including execution results of the video test from the one or more client devices, and generating, by a report generation engine based on the received data, a performance report including performance metrics and indicators related to the execution of the video test.
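The work-order lifecycle summarized above (create, store, retrieve at the scheduled start time, transmit) can be illustrated with a minimal sketch. All class and field names here are hypothetical illustrations and are not part of the disclosed system; an in-memory store stands in for the database.

```python
from dataclasses import dataclass, field

@dataclass
class WorkOrder:
    """Illustrative work order carrying the scheduling inputs described above."""
    order_id: str
    scheduled_start: int            # scheduled start time, epoch seconds
    test_duration_s: int            # test duration for executing the video test
    device_ids: list = field(default_factory=list)

class WorkOrderStore:
    """In-memory stand-in for the database that holds created work orders."""
    def __init__(self):
        self._orders = {}

    def store(self, order: WorkOrder):
        self._orders[order.order_id] = order

    def due_orders(self, now: int):
        """Retrieve work orders whose scheduled start time has arrived."""
        return [o for o in self._orders.values() if o.scheduled_start <= now]

store = WorkOrderStore()
store.store(WorkOrder("wo-1", scheduled_start=100, test_duration_s=300,
                      device_ids=["stb-01", "stb-02"]))
store.store(WorkOrder("wo-2", scheduled_start=900, test_duration_s=300))

# At t=120, only wo-1 is due and would be transmitted to its client devices.
due = store.due_orders(now=120)
```

In the disclosed system, the transmission step would then send each due work order to the client devices identified in it over the network connection.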
[0008] In one or more aspects, the set of inputs further includes a unique identifier associated with each client device among the one or more client devices. The retrieved at least one work order is transmitted to the one or more client devices based on the unique identifier associated with the corresponding client device. The video test is executed at the one or more client devices over the network connection until completion of the test duration for each iteration.
[0009] In an aspect, a quality of video content being streamed on the one or more client devices is determined based on an analysis of the data including the execution results of the video test. The method further comprises controlling, by an execution engine based on a current network status of the one or more client devices, one or more settings of the one or more client devices by sending a control signal to the one or more client devices to adjust the quality of the video content being streamed.
[0010] In one or more aspects, the plurality of options includes at least an option for selecting one or more video streaming applications, an option for selecting a number of iterations for executing the video test, an option for inputting Uniform Resource Locators (URLs) of one or more videos available in the one or more video streaming applications, an option for selecting a streaming resolution of the one or more videos, an option for setting the start time of the execution of the video test, and an option for setting the test duration for executing the video test.
[0011] In an aspect, based on the data including the execution results of the video test, a performance of the network connection is determined by the report generation engine with respect to one or more video parameters. The one or more video parameters include at least a buffering time, a count of one or more video stalling events, a video freezing ratio, and a video load time associated with the one or more videos on which the video test is executed.
[0012] In an aspect, the execution results of the executed video test are synchronized in the database by the report generation engine, and the video test is executed in the background of the one or more client devices using a video streaming application.
[0013] In one or more aspects, the video test is re-executed at the one or more client devices if network fluctuations exceed a predefined threshold during the test duration of the video test.
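The re-execution condition above can be sketched as a simple threshold check on throughput samples collected during the test window. The fluctuation metric (peak-to-trough swing relative to the peak) and the threshold value are illustrative assumptions, not taken from the disclosure.

```python
def needs_reexecution(throughput_samples_mbps, threshold_ratio=0.5):
    """Return True if throughput fluctuated beyond the allowed ratio.

    Fluctuation is measured here as (max - min) / max over the samples
    collected during the test duration; the exact metric is an assumption.
    """
    if not throughput_samples_mbps:
        return False
    hi, lo = max(throughput_samples_mbps), min(throughput_samples_mbps)
    if hi == 0:
        return False
    return (hi - lo) / hi > threshold_ratio

# A small swing does not trigger re-execution; a large swing does.
stable = needs_reexecution([48.0, 50.0, 47.5])
unstable = needs_reexecution([50.0, 50.0, 10.0])
```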
[0014] According to another embodiment, a system for executing a video test over a network connection is described. The system comprises a communication interface, a display engine, a reception engine, a work order management engine, a transmitting engine, and a report generation engine. The communication interface is configured to establish a connection with a network management device and a client device. The display engine is configured to control an application interface of the network management device to display a work order window including a plurality of options for creating at least one work order. The reception engine is configured to obtain a set of inputs corresponding to the plurality of options in the work order window. The set of inputs includes information of a scheduled start time for executing the video test and a test duration for executing the video test. The work order management engine is configured to create the at least one work order related to the video test based on the obtained set of inputs, store the at least one work order in a database, and retrieve the at least one work order from the database at the scheduled start time. The transmitting engine is configured to transmit, over the network connection, the retrieved at least one work order to one or more client devices to execute the video test. The reception engine is further configured to receive data including execution results of the video test from the one or more client devices, and the report generation engine is configured to generate, based on the received data, a performance report including performance metrics and indicators related to the execution of the video test.
[0015] In an aspect, a quality of video content being streamed on the one or more client devices is determined based on an analysis of the data including the execution results of the video test, and the system further comprises an execution engine configured to control, based on a current network status of the one or more client devices, one or more settings of the one or more client devices by sending a control signal to the one or more client devices to adjust the quality of the video content being streamed.
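The quality-adjustment control signal described in the aspect above can be sketched as a mapping from a client's current network status to a target streaming resolution. The bandwidth-to-resolution ladder and the control-signal fields are assumptions for illustration only.

```python
# Illustrative ladder: (minimum Mbps, resolution label), highest tier first.
RESOLUTION_LADDER = [
    (25.0, "2160p"),
    (8.0, "1080p"),
    (4.0, "720p"),
    (1.5, "480p"),
    (0.0, "240p"),
]

def control_signal(device_id: str, current_mbps: float) -> dict:
    """Build a control signal instructing the client to adjust stream quality."""
    for min_mbps, label in RESOLUTION_LADDER:
        if current_mbps >= min_mbps:
            return {"device_id": device_id, "set_resolution": label}
    return {"device_id": device_id, "set_resolution": "240p"}

# A 5.2 Mbps connection falls in the 720p band of this illustrative ladder.
sig = control_signal("stb-01", current_mbps=5.2)
```

In the disclosed system, the execution engine would send such a signal to the client device to adjust the quality of the video content being streamed.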
[0016] In an aspect, the report generation engine is further configured to determine, based on the execution results of the video test, a performance of the network connection with respect to one or more video parameters. The one or more video parameters include at least a buffering time, a count of one or more video stalling events, a video freezing ratio, and a video load time associated with the one or more videos on which the video test is executed.
[0017] In an aspect, the report generation engine is further configured to synchronize the execution results of the video test in the database, and the video test is executed in the background of the one or more client devices using a video streaming application.
[0018] According to yet another embodiment, disclosed is an electronic device comprising one or more primary processors communicatively coupled to one or more processors, and the one or more primary processors are coupled with a memory. The memory stores instructions which, when executed by the one or more primary processors, cause the electronic device to: receive, from a server, at least one work order to execute a video test, wherein the at least one work order includes test parameters specifying a video streaming application, Uniform Resource Locators (URLs) of one or more videos available in the video streaming application, a streaming resolution of the one or more videos, a scheduled start time for executing the video test, a number of iterations, and a test duration for executing the video test; parse the received at least one work order to extract the test parameters for executing the video test; schedule the video test at the scheduled start time using an alarm manager; retrieve at least one video file using the URLs of the one or more videos included in the extracted test parameters; execute, using the video streaming application, the video test at the scheduled start time by performing a plurality of execution operations that include streaming the at least one video file at a plurality of resolutions, detecting one or more video stalling events during streaming of the at least one video file, calculating a video freezing ratio after streaming the at least one video file using total video freeze time and video duration, and measuring a buffering time and a video load time during streaming of the at least one video file; aggregate execution results of the video test including the one or more video stalling events, the video freezing ratio, the buffering time, and the video load time; and transmit data including the aggregated execution results of the video test to the server for generating a performance report.
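The client-side parse-and-aggregate flow above can be sketched as follows: extract the test parameters from a received work order, then combine per-iteration results into the payload returned to the server. The JSON encoding, field names, and aggregation choices (summed stall events and buffering time, worst-case load time, pooled freezing ratio) are illustrative assumptions.

```python
import json

def parse_work_order(raw: str) -> dict:
    """Extract the test parameters from a JSON-encoded work order."""
    order = json.loads(raw)
    required = ("app", "urls", "resolution", "start_time",
                "iterations", "duration_s")
    return {key: order[key] for key in required}

def aggregate_results(iterations: list) -> dict:
    """Combine per-iteration metrics into one result record for the server."""
    return {
        "stall_events": sum(it["stall_events"] for it in iterations),
        "buffering_time_s": sum(it["buffering_time_s"] for it in iterations),
        "video_load_time_s": max(it["video_load_time_s"] for it in iterations),
        # Freezing ratio pooled across iterations: total freeze time over
        # total streamed video duration.
        "freezing_ratio": (
            sum(it["freeze_time_s"] for it in iterations)
            / sum(it["video_duration_s"] for it in iterations)
        ),
    }

raw_order = json.dumps({"app": "demo-player",
                        "urls": ["https://example.com/v1"],
                        "resolution": "1080p", "start_time": 0,
                        "iterations": 2, "duration_s": 300})
params = parse_work_order(raw_order)
report = aggregate_results([
    {"stall_events": 1, "buffering_time_s": 2.0, "video_load_time_s": 1.2,
     "freeze_time_s": 3.0, "video_duration_s": 300.0},
    {"stall_events": 0, "buffering_time_s": 0.5, "video_load_time_s": 0.9,
     "freeze_time_s": 0.0, "video_duration_s": 300.0},
])
```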
BRIEF DESCRIPTION OF DRAWINGS
[0019] Various embodiments disclosed herein will become better understood from the following detailed description when read with the accompanying drawings. The accompanying drawings constitute a part of the present disclosure and illustrate certain non-limiting embodiments of inventive concepts. Further, components and elements shown in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. For consistency and ease of understanding, similar components and elements are annotated by reference numerals in the exemplary drawings.
[0020] FIG. 1 illustrates a block diagram depicting an example system for executing video tests over network connections, in accordance with an example embodiment of the present disclosure.
[0021] FIG. 2 illustrates an example system architecture of an electronic device, in accordance with an embodiment of the present disclosure.
[0022] FIG. 3 illustrates a block diagram depicting a system architecture of a server, in accordance with an example embodiment of the present disclosure.
[0023] FIG. 4 illustrates a flowchart depicting a method for creating work orders for execution of the video tests over the network connections of the electronic device, in accordance with an embodiment of the present disclosure.
[0024] FIG. 5 illustrates an example UI for navigating to a work order module, in accordance with an embodiment of the present disclosure.
[0025] FIG. 6 illustrates an example UI for navigating to an application work order, in accordance with an embodiment of the present disclosure.
[0026] FIG. 7 illustrates an example UI including an option for creating a work order for execution of video tests at the electronic device, in accordance with an embodiment of the present disclosure.
[0027] FIG. 8 illustrates an example UI including options for receiving user inputs to create the work order for execution of the video tests, in accordance with an embodiment of the present disclosure.
[0028] FIG. 9 illustrates an example UI including options for receiving user inputs to create the work order for one or more electronic devices using device Identifiers (IDs) of the electronic devices, in accordance with an embodiment of the present disclosure.
[0029] FIG. 10 illustrates an example UI for downloading performance report corresponding to the executed video tests, in accordance with an embodiment of the present disclosure.
[0030] FIG. 11 illustrates a flowchart depicting a method for executing video tests on the electronic device over a network connection of the electronic device, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE INVENTION
[0031] Inventive concepts of the present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which examples of one or more embodiments of inventive concepts are shown. Inventive concepts may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Further, the one or more embodiments disclosed herein are provided to describe the inventive concept thoroughly and completely, and to fully convey the scope of each of the present inventive concepts to those skilled in the art. Furthermore, it should be noted that the embodiments disclosed herein are not mutually exclusive concepts. Accordingly, one or more components from one embodiment may be tacitly assumed to be present or used in any other embodiment.
[0032] The following description presents various embodiments of the present disclosure. The embodiments disclosed herein are presented as teaching examples and are not to be construed as limiting the scope of the present disclosure. The present disclosure should in no way be limited to the illustrative implementations, drawings, and techniques illustrated below, including the exemplary design and implementation illustrated and described herein, but may be modified, omitted, or expanded upon without departing from the scope of the present disclosure.
[0033] The following description contains specific information pertaining to embodiments in the present disclosure. The detailed description uses phrases such as “in some embodiments,” each of which may refer to one or more or all of the same or different embodiments. The term “some” as used herein is defined as “one, or more than one, or all.” Accordingly, the terms “one,” “more than one,” “more than one, but not all,” and “all” would all fall under the definition of “some.” In view of the same, the term “in an embodiment” refers to one embodiment, and the term “in one or more embodiments” refers to “at least one embodiment, or more than one embodiment, or all embodiments.”
[0034] The term “comprising,” when utilized, means “including, but not necessarily limited to;” it specifically indicates open-ended inclusion of the one or more listed features or elements in a combination, unless otherwise stated with limiting language. Furthermore, to the extent that the terms “includes,” “has,” “have,” “contains,” and other similar words are used in the detailed description, such terms are intended to be inclusive in a manner similar to the term “comprising.”
[0035] In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features.
[0036] The description provided herein discloses exemplary embodiments only and is not intended to limit the scope, applicability, or configuration of the present disclosure. Rather, the foregoing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing any of the exemplary embodiments. Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it may be understood by one of ordinary skill in the art that the embodiments disclosed herein may be practiced without these specific details.
[0037] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein the description, the singular forms "a", "an", and "the" include plural forms unless the context of the invention indicates otherwise.
[0038] The terminology and structure employed herein are for describing, teaching, and illuminating some embodiments and their specific features and elements and do not limit, restrict, or reduce the scope of the present disclosure. Accordingly, unless otherwise defined, all terms, and especially any technical and/or scientific terms, used herein may be taken to have the same meaning as commonly understood by one having ordinary skill in the art.
[0039] Various aspects of the present disclosure illustrate a method for creating work orders for scheduling video tests, and remotely executing the video tests on user network connections in a background of an electronic device based on the created work orders. The following description provides specific details of certain aspects of the disclosure illustrated in the drawings to provide a thorough understanding of those aspects. It should be recognized, however, that the present disclosure can be reflected in additional aspects and the disclosure may be practiced without some of the details in the following description.
[0040] The various aspects including the example aspects are now described more fully with reference to the accompanying drawings, in which the various aspects of the disclosure are shown. The disclosure may, however, be embodied in different forms and should not be construed as limited to the aspects set forth herein. Rather, these aspects are provided so that this disclosure is thorough and complete, and fully conveys the scope of the disclosure to those skilled in the art. In the drawings, the sizes of components may be exaggerated for clarity.
[0041] Various aspects of the present disclosure provide a system and a method that is capable of automating and remotely executing video tests on a user network to enable organizations and individuals to continuously monitor and assess performance of communication networks remotely. In another aspect of the present disclosure, the system and the method describe operations for creating the work orders for remotely executing the video tests over the user network connections to facilitate the organizations to proactively manage a Quality of Service (QoS) of the communication networks and enhance user satisfaction by ensuring a consistent and reliable internet connectivity.
[0042] In the disclosure, various embodiments are described using terms used in some communication standards (e.g., 3rd Generation Partnership Project (3GPP)), but these are merely examples for description. Various embodiments of the disclosure may also be easily modified and applied to other communication systems.
[0043] In order to facilitate an understanding of the disclosed invention, a number of terms are defined below.
[0044] The term “stalling” refers to a video playback interruption caused by insufficient buffering.
[0045] The term “freezing ratio” refers to a measurement of the ratio of the time the video remains frozen to the total video duration.
[0046] A “buffering time” refers to a measurement of the total time spent loading the video before playback and during the playback.
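The three metrics defined above can be computed from a simple playback event log. The sketch below assumes events of the form (timestamp in seconds, player state), where the state is "loading", "playing", or "frozen"; this log format and the treatment of each frozen interval as one stalling event are illustrative assumptions.

```python
def video_metrics(events, video_duration_s):
    """Compute buffering time, stall count, and freezing ratio from events."""
    buffering_time = 0.0
    freeze_time = 0.0
    stall_count = 0
    # Walk consecutive event pairs; each pair bounds one interval in a state.
    for (t0, state), (t1, _next_state) in zip(events, events[1:]):
        span = t1 - t0
        if state == "loading":
            buffering_time += span    # time spent loading before/during playback
        elif state == "frozen":
            freeze_time += span
            stall_count += 1          # each frozen interval is one stalling event
    return {
        "buffering_time_s": buffering_time,
        "stall_events": stall_count,
        # Freezing ratio: frozen time over total video duration.
        "freezing_ratio": freeze_time / video_duration_s,
    }

# 1.5 s of initial loading, one 2 s freeze, over a 30 s video.
timeline = [(0.0, "loading"), (1.5, "playing"), (10.0, "frozen"),
            (12.0, "playing"), (30.0, "end")]
m = video_metrics(timeline, video_duration_s=30.0)
```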
[0048] Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings. FIG. 1 through FIG. 11, discussed below, and the one or more embodiments used to describe the principles of the present disclosure are by way of illustration only and should not be construed in any way to limit the scope of the present disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged system or device.
[0049] FIG. 1 illustrates a block diagram depicting an example system 100 for executing video tests over network connections, in accordance with an example embodiment of the present disclosure. The embodiment of the system 100 shown in FIG. 1 is for illustration only. Other embodiments of the system 100 may be used without departing from the scope of this disclosure.
[0050] As shown in FIG. 1, the system 100 includes electronic device(s) 110 (interchangeably referred to as “client device 110” or “one or more client devices 110”), a network 120, a load balancer 130, a server 140 (interchangeably referred to as “remote server 140”), a database 150, a network management device 160, and a plurality of test servers 1 through N.
[0051] The electronic device 110 communicates with the server 140 via the network 120. In one or more embodiments, one or more applications are installed on the electronic device 110 to communicate with the server 140 and the test servers 1 through N. Examples of the electronic device 110 may include, but are not limited to, a Set Top Box (STB) equipment and a User Equipment (UE) such as, but not limited to, smartphones, tablets, laptops, desktop computers, and the like. For instance, the term “electronic device” or the “client device” used herein refers to the STB equipment that wirelessly accesses the server 140 for streaming media content. In a non-limiting example, the STB equipment may be connected to 4G/5G/beyond 5G/Wi-Fi/Ethernet through a Customer Premises Equipment (CPE) on a Fixed Wireless Access (FWA) solution.
[0052] The network 120 enables transmission of messages and acts as a communication medium between components of the system 100. The network 120 may correspond to one of an Internet, a proprietary Internet Protocol (IP) network, or other data network. The network 120 may include suitable logic, circuitry, and interfaces that may be configured to provide several network ports and several communication channels for transmission and reception of data related to operations of various entities of the system 100. Each network port may correspond to a virtual address (or a physical machine address) for transmission and reception of the communication data. For example, the virtual address may be an Internet Protocol version 4 (IPv4) address (or an IPv6 address) and the physical address may be a Media Access Control (MAC) address. The network 120 may be associated with an application layer for implementation of communication protocols based on one or more communication requests from the various entities of the system 100. The communication data may be transmitted or received via the communication protocols. Examples of the communication protocols may include, but are not limited to, Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Mail Transfer Protocol (SMTP), Domain Name System (DNS) protocol, Common Management Interface Protocol (CMIP), Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Long Term Evolution (LTE) communication protocols, or any combination thereof.
[0053] In some aspects of the present disclosure, the communication data may be transmitted or received via at least one communication channel of several communication channels in the network 120. Examples of the communication channels may include, but are not limited to, a wireless channel, a wired channel, or a combination thereof. The wireless or wired channel may be associated with a data standard which may be defined by one of a Local Area Network (LAN), a Personal Area Network (PAN), a Wireless Local Area Network (WLAN), a Wireless Sensor Network (WSN), a Wide Area Network (WAN), a Wireless Wide Area Network (WWAN), a Metropolitan Area Network (MAN), a satellite network, the Internet, an optical fiber network, a coaxial cable network, an Infrared (IR) network, a Radio Frequency (RF) network, and a combination thereof. Aspects of the present disclosure are intended to include or otherwise cover any type of communication channel, including known, related art, and/or later developed technologies.
[0054] The load balancer 130 is an intermediary between the network 120 and the server 140. The load balancer 130 is configured to distribute, to the server 140, incoming requests from the network management device 160 for creating the work orders for scheduling the video tests on multiple electronic devices (not shown in FIG. 1). In one or more embodiments, the incoming requests for creating the work orders may be received from a plurality of network management devices (not shown in FIG. 1).
[0055] The server 140 may be a network of computers, a software framework, or a combination thereof, that may provide a generalized approach to create a server implementation. Examples of the server 140 may include, but are not limited to, personal computers, laptops, mini-computers, mainframe computers, any non-transient and tangible machine that can execute a machine-readable code, cloud-based servers, distributed server networks, or a network of computer systems. The server 140 may be realized through various web-based technologies such as, but not limited to, a Java web-framework, a .NET framework, a personal home page (PHP) framework, or any web-application framework. In other aspects of the present disclosure, the server 140 may be configured to perform one or more operations for creating work orders for scheduling video tests over the user network connections in the background of the electronic device 110, and determining overall network performance and video quality metrics based on execution of the video tests at the electronic device 110.
[0056] The database 150 may be configured to store data including, but not limited to, data collected as the results of the execution of the video tests, user profile data associated with the electronic device 110, and backend data associated with the network management device 160.
[0057] The network management device 160 corresponds to an electronic device used by network engineers or a network operation team for managing and optimizing network resources. The network management device 160 may include an application interface to facilitate display of a plurality of options for receiving inputs for creation of the work orders and inputs related to the configurability of parameters for remotely executing the video tests. The application interface may be further configured to facilitate communications between the network management device 160 and the server 140 for creating the work orders for scheduling the video tests at the electronic device 110. The application interface may be further configured to send one or more requests to the server 140 and receive data communications from the server 140 via the network 120. The application interface may be further configured to cause the network management device 160 to output a signal instructing the server 140 to initiate a process for creating the work orders for scheduling the video tests at the electronic device 110.
[0058] Although FIG. 1 illustrates one example of the system 100 for remotely scheduling the video tests on the user network, various changes may be made to FIG. 1. For example, the system may include any number of electronic devices and servers in any suitable arrangement. Further, in another example, the system 100 may include any number of components in addition to the components shown in FIG. 1. Further, various components in FIG. 1 may be combined, further subdivided, or omitted and additional components may be added according to particular needs.
[0059] FIG. 2 illustrates an example system architecture of the electronic device 110, in accordance with an embodiment of the present disclosure. The embodiment of the system architecture of the electronic device 110 as shown in FIG. 2 is for illustration only. However, the electronic device 110 may come in a wide variety of configurations, and FIG. 2 does not limit the scope of the present disclosure to any particular system architecture of the electronic device 110.
[0060] As shown in FIG. 2, the electronic device 110 includes one or more processors 210 (hereinafter also referred to as “processor 210”), a memory 215, a transceiver module 220, an interface(s) 225, a processing Engine(s)/module(s) 230, and an adaptive bitrate video player 235. These components may be in electronic communication via one or more buses (e.g., communication bus 250). Depending on the network type, the term “electronic device” may refer to any electronic device such as an “STB,” “mobile station,” “subscriber station,” “remote terminal,” “wireless terminal,” or “receive point.” For the sake of convenience, the term “electronic device” used herein refers to an electronic device such as the UE or the STB that wirelessly accesses the server 140 via the network 120.
[0061] The one or more components of the electronic device 110 are communicatively coupled with the processor 210 (described below) to perform operations for executing the video tests over the network connection. The processor 210 may include various processing circuitry and may be configured to execute programs or computer readable instructions stored in the memory 215. The processor 210 may also include an intelligent hardware device including a general-purpose processor, such as, for example, and without limitation, a Central Processing Unit (CPU), an Application Processor (AP), a dedicated processor, or the like, a microcontroller, a Field-Programmable Gate Array (FPGA), a programmable logic device, a discrete hardware component, or any combination thereof. In some cases, the processor 210 may be configured to operate a memory array using a memory controller. In some cases, a memory controller may be integrated into the processor 210. The processor 210 may be configured to execute computer-readable instructions stored in a memory (e.g., the memory 215) to cause the electronic device 110 to perform various functions (e.g., schedule an execution of the video tests based on the work orders received from the remote server 140 and execute the video tests over the network connection at the scheduled time of execution of the video tests).
[0062] The memory 215 is communicatively coupled to the processor 210. A part of the memory 215 may include a Random-Access Memory (RAM), and another part of the memory 215 may include a flash memory or other Read-Only Memory (ROM). The memory 215 is configured to store a set of instructions required by the processor 210 for controlling overall operations of the electronic device 110. The memory 215 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory 215 may, in some examples, be considered a non-transitory storage medium. The "non-transitory" storage medium is not embodied in a carrier wave or a propagated signal. However, the term "non-transitory" should not be interpreted that the memory 215 is non-movable. In some examples, the memory 215 can be configured to store larger amounts of information. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in the RAM or cache). The memory 215 can be an internal storage unit or it can be an external storage unit of the electronic device 110, cloud storage, or any other type of external storage.
[0063] More specifically, the memory 215 may store computer-readable instructions including instructions that, when executed by a processor (e.g., the processor 210) cause the electronic device 110 to perform various functions described herein. In some cases, the memory 215 may contain, among other things, a BIOS which may control basic hardware or software operation such as the interaction with peripheral components or devices.
[0064] The transceiver module 220 may include one or more antennas, one or more Radio Frequency (RF) transceivers, a transmit processing circuitry, and a receive processing circuitry. The transceiver module 220 may be configured to receive incoming signals, such as signals transmitted by the test servers 1 through N, the server 140, and the electronic device 110. The transceiver module 220 may down-convert the incoming signals to generate baseband signals which may be sent to the receiver processing circuitry. The receiver processing circuitry may transmit the processed baseband signals to the processor 210 for further processing. The transmit processing circuitry may receive analog or digital data from the processor 210 and may encode, multiplex, and/or digitize the outgoing baseband data to generate processed baseband signals. The transceiver module 220 may further receive the outgoing processed baseband signals from the transmit processing circuitry and up-convert the baseband signals to Radio Frequency (RF) signals that may be transmitted to the server 140 and the network management device 160.
[0065] The interface 225 may include suitable logic, circuitry, a variety of interfaces, and/or codes that may be configured to receive input(s) and present output(s) on the application interface of the electronic device 110. The variety of interfaces may include interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. For example, the I/O interface may have an input interface and an output interface. The interface 225 may facilitate communication of the electronic device 110 with various devices and systems connected to it. The interface 225 may also provide a communication pathway for one or more components of the electronic device 110. Examples of such components include, but are not limited to, the processing Engine(s)/module(s) 230.
[0066] In one or more embodiments, processing Engine(s)/module(s) 230 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the electronic device 110. In non-limiting examples, described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing Engine(s)/module(s) 230 may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processor 210 may comprise a processing resource (for example, one or more primary processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing Engine(s)/module(s) 230. In such examples, the electronic device 110 may also comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the electronic device 110 and the processing resource. In other examples, the processing Engine(s)/module(s) 230 may be implemented using an electronic circuitry.
[0067] In one or more embodiments, processing Engine(s)/module(s) 230 may include one or more units/modules such as an execution engine, a data aggregation engine, and an alarm manager (not shown).
[0068] In an embodiment, the processor 210 is configured to receive, from the server 140, one or more work orders (interchangeably referred to as “work orders”) to execute one or more video tests. In a non-limiting example, the work orders may include test parameters specifying a video-streaming application, Uniform Resource Locators (URLs) of one or more videos available in the video-streaming application, a streaming resolution of the one or more videos, a scheduled start time for executing the video test, a number of iterations, and a test duration for executing the video test.
[0069] The processor 210 may further be configured to parse the work orders received from the server 140 to extract the test parameters for executing the video tests, and thereafter schedule the video tests at the scheduled start time using the alarm manager. For instance, the processor 210 may trigger the execution of the video tests by sending an Application Programming Interface (API) request to the alarm manager of the electronic device 110.
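The parsing and scheduling step described above can be sketched as follows. This is a minimal illustration only; the payload field names and JSON format are assumptions for the sake of the example, not the actual work-order format used by the server 140.

```python
import json
from datetime import datetime

# Hypothetical work-order payload; all field names are illustrative assumptions.
WORK_ORDER_JSON = """
{
  "app": "ExampleVideoApp",
  "video_urls": ["https://videos.example.com/clip1.mp4"],
  "resolution": "1080p",
  "start_time": "2025-01-15T20:00:00",
  "iterations": 3,
  "duration_sec": 120
}
"""

def parse_work_order(raw: str) -> dict:
    """Extract the test parameters needed to schedule a video test."""
    order = json.loads(raw)
    return {
        "app": order["app"],
        "urls": order["video_urls"],
        "resolution": order["resolution"],
        # Scheduled start time, as used when registering the alarm.
        "start_time": datetime.fromisoformat(order["start_time"]),
        "iterations": order["iterations"],
        "duration_sec": order["duration_sec"],
    }

params = parse_work_order(WORK_ORDER_JSON)
```

In a real implementation, `params["start_time"]` would then be passed to the alarm manager via the API request mentioned above to fire the test at the scheduled instant.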
[0070] In one or more embodiments, scheduling the video tests at the scheduled start time helps in ensuring that video streaming and downloading services perform reliably on the user network at all times. Specifically, the video tests may be scheduled during peak hours to assess how users experience the network under peak load and to enable the organization to determine whether performance issues are time dependent. This helps in generating insights for the end user or the network engineers as to when the most severe problems occur during streaming of video content. In some embodiments, the processor 210 may be configured to reschedule the execution of the video tests in case the one or more work orders include an instruction to execute the video tests for more than one iteration.
[0071] Further, the processor 210 may be configured to retrieve video files from one or more application servers using the URLs of the one or more videos included in the extracted test parameters. For instance, the video files may be retrieved from the one or more application servers based on host names indicated by the URLs.
[0072] In an embodiment, the processor 210, using the execution engine, may be configured to execute the video tests over the network connection at the scheduled start time using the adaptive bitrate video player 235. The adaptive bitrate video player 235 may be a video streaming application that dynamically adjusts the video quality based on the network conditions of the electronic device 110.
[0073] For executing the video test at the scheduled start time, the execution engine of the electronic device 110 may perform a plurality of execution operations such as, but not limited to, streaming of the video files at a plurality of resolutions, detecting one or more video stalling events during streaming of the video files, calculating the video freezing ratio after streaming the video files based on total video freeze time and the video duration, and measuring a buffering time and a video load time during streaming of the video files. The video test may be performed based on a number of iterations specified in the work orders received from the server 140.
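The video freezing ratio described above (total video freeze time divided by video duration) can be sketched as a small calculation. The sample stall durations below are illustrative values, not measured data.

```python
def video_freezing_ratio(freeze_events_sec, video_duration_sec):
    """Freezing ratio = total video freeze time / video duration."""
    return sum(freeze_events_sec) / video_duration_sec

def stall_count(freeze_events_sec):
    """Count of detected video stalling events."""
    return len(freeze_events_sec)

# Illustrative example: three stalls totalling 6 s during a 120 s video.
stalls_sec = [1.5, 2.5, 2.0]
ratio = video_freezing_ratio(stalls_sec, 120.0)  # 6 / 120 = 0.05
```

The buffering time and video load time would be measured analogously, as elapsed-time deltas recorded by the execution engine during playback.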
[0074] The video tests may be executed via the test servers 1 through N. The processor 210 may trigger the execution engine to select a test server from the test servers 1 through N that is nearest to the electronic device 110, and execute the video test via the test server selected by the execution engine.
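The nearest-server selection can be sketched as picking the test server with the lowest probed round-trip latency. The probe values and server names below are hypothetical; the disclosure does not specify the selection criterion, so latency is used here as an assumed proxy for proximity.

```python
def select_nearest_server(latencies_ms: dict) -> str:
    """Pick the test server with the lowest measured round-trip latency."""
    return min(latencies_ms, key=latencies_ms.get)

# Hypothetical probe results for test servers 1 through N.
probes = {"test-server-1": 42.0, "test-server-2": 17.5, "test-server-3": 88.3}
nearest = select_nearest_server(probes)
```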
[0075] Further, the video tests are performed by the execution engine in the background of the electronic device 110. Executing the video test in the background of the electronic device 110 refers to performing the video test as a non-intrusive process that does not interfere with primary functions of the electronic device 110. For example, if the STB is loading a web page while executing the video tests, the video tests run without disrupting the loading of the web page, ensuring that the user experience remains unaffected.
[0076] In some embodiments, the processor 210, using the execution engine, may re-execute the video tests if network fluctuations exceed a predefined threshold during the test duration of the video tests.
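The re-execution check can be sketched as comparing the fluctuation observed during the test duration against the predefined threshold. Here the fluctuation is modeled as the swing in throughput samples, and the threshold value is an illustrative assumption; the disclosure does not fix either the metric or the threshold.

```python
def should_reexecute(throughput_samples_mbps, threshold_mbps=5.0):
    """Flag a re-run when the throughput swing during the test exceeds the threshold."""
    swing = max(throughput_samples_mbps) - min(throughput_samples_mbps)
    return swing > threshold_mbps

stable = should_reexecute([20.1, 19.8, 20.5])   # swing 0.7 Mbps -> no re-run
unstable = should_reexecute([25.0, 9.0, 18.0])  # swing 16.0 Mbps -> re-run
```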
[0077] Furthermore, the processor 210, using the data aggregation engine, is configured to aggregate execution results of the video tests including one or more of the video stalling events, the video freezing ratio, the buffering time, and the video load time. For instance, once the video tests are executed by the execution engine of the electronic device, the data aggregation engine of the electronic device aggregates the obtained video execution results to generate a consolidated data set that reflects video quality metrics over multiple test iterations. The aggregation process may comprise collecting individual video quality metrics such as the video stalling events, the video freezing ratio, the buffering time, and the video load time from each video test execution cycle.
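The aggregation across iterations can be sketched as follows. The per-iteration dictionary keys and the choice of totals versus averages are illustrative assumptions; the disclosure only states that the metrics are consolidated over multiple test iterations.

```python
from statistics import mean

def aggregate_results(iterations):
    """Consolidate per-iteration video quality metrics into one data set."""
    return {
        "stall_events_total": sum(it["stall_events"] for it in iterations),
        "freezing_ratio_avg": mean(it["freezing_ratio"] for it in iterations),
        "buffering_avg_sec": mean(it["buffering_sec"] for it in iterations),
        "load_time_avg_sec": mean(it["load_sec"] for it in iterations),
    }

# Two illustrative test iterations.
runs = [
    {"stall_events": 2, "freezing_ratio": 0.05, "buffering_sec": 1.2, "load_sec": 0.8},
    {"stall_events": 0, "freezing_ratio": 0.00, "buffering_sec": 0.4, "load_sec": 0.6},
]
summary = aggregate_results(runs)
```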
[0078] In an embodiment, the processor 210, using the transceiver module 220, is configured to transmit the data including the aggregated execution results of the video test to the server 140 for generating a performance report indicating the video quality metrics.
[0079] Although FIG. 2 illustrates one example of the system architecture of the electronic device 110, various changes may be made to FIG. 2. Further, the electronic device 110 may include any number of components in addition to those shown in FIG. 2, without deviating from the scope of the present disclosure. For example, the electronic device 110 may further include circuitry, programing, applications, or a combination thereof. Further, various components in FIG. 2 may be combined, further subdivided, or omitted, and additional components may be added according to particular needs.
[0080] In an alternate embodiment, each engine/module of the processing Engine(s)/module(s) 230 may be configured to independently perform various operations of the processor 210, as described herein, without deviating from the scope of the present disclosure.
[0081] FIG. 3 illustrates a block diagram depicting a system architecture of the server 140, in accordance with an example embodiment of the present disclosure. The embodiment of the server 140 shown in FIG. 3 is for illustration only. Other embodiments of the server 140 may be used without departing from the scope of this disclosure.
[0082] The server 140 may include an Input-Output (I/O) interface 310, a memory 320, a data processing circuitry 330, a communication unit 350, a console host 360, and a database 370 coupled to each other via a first communication bus 380.
[0083] The I/O interface 310 may include suitable logic, circuitry, interfaces, and/or codes that may be configured to receive input(s) and present (or display) output(s) of the server 140. For example, the I/O interface 310 may have an input interface (not shown) and an output interface (not shown). The input interface may be configured to enable a user to provide input(s) to trigger (or configure) the server 140 to create the work orders for remotely scheduling the video tests at the electronic device 110. Aspects of the present disclosure are intended to include or otherwise cover any type of the input interface including known, related art, and/or later developed technologies without deviating from the scope of the present disclosure. The output interface may be configured to display (or present) output(s) to the user by the server 140. In some aspects of the present disclosure, the output interface may provide the output(s) based on instruction(s) provided via the input interface. Examples of the output interface of the I/O interface 310 may include, but are not limited to, a digital display, an analog display, a touch screen display, an appearance of a desktop, and/or illuminated characters.
[0084] The memory 320 may be configured to store logic, instructions 320A, circuitry, interfaces, and/or codes of the data processing circuitry 330 for executing various operations. The memory 320 may further be configured to store data associated with the work orders, that may be utilized by various data processing engines (or processor(s)) of the data processing circuitry 330 to create the work orders for remotely scheduling the video tests over user network connection of the electronic device 110. Aspects of the present disclosure are intended to include and/or otherwise cover any type of the data associated with the work orders, user profile, and performance data including the video quality metrics without deviating from the scope of the present disclosure. Examples of the memory 320 may include but are not limited to, a ROM, RAM, a flash memory, a removable storage drive, a Hard Disc Drive (HDD), a solid-state memory, a magnetic storage drive, a Programmable Read-Only Memory (PROM), the EPROM, and/or the EEPROM.
[0085] The data processing circuitry 330 may include processor(s) (such as data processing engines) configured with suitable logic, instructions, circuitry, interfaces, and/or codes for executing one or more operations of various operations performed by the server 140. For example, the data processing circuitry 330 is configured to execute programs and other processes stored in the memory 320. The data processing circuitry 330 is further configured to move data into or out of the memory 320 as required by the execution process. Examples of the data processing circuitry 330 may include, but are not limited to, an Application-Specific Integrated Circuit (ASIC) processor, a Reduced Instruction Set Architecture (RISC) processor, a Complex Instruction Set Architecture (CISC) processor, an FPGA, and the like.
[0086] The data processing circuitry 330 may include data processor(s) (e.g., data processing engines) as shown in FIG. 3. According to an exemplary embodiment, the data processing circuitry 330 may include a display engine 330-1, a reception engine 330-2, a work order management engine 330-3, a report generation engine 330-4, a transmitting engine 330-5, and an execution engine 330-6 coupled to each other by way of a second communication bus 340.
[0087] In an embodiment, the display engine 330-1 is configured to control the application interface of the network management device 160 to display a work order window including the options for creating the one or more work orders. The reception engine 330-2 is configured to obtain inputs corresponding to the options in the work order window. The set of inputs may include, but is not limited to, information of a scheduled start time for executing the video test, a test duration for executing the video test, a unique identifier associated with each electronic device among the one or more electronic devices 110, Uniform Resource Locators (URLs) of one or more videos available in the one or more video-streaming applications, and a streaming resolution of the one or more videos.
[0088] In an embodiment, the work order management engine 330-3 is configured to create the one or more work orders related to the video tests based on the inputs obtained by the reception engine 330-2. Further, the work order management engine 330-3 may be configured to store the created one or more work orders in the database 370. Furthermore, the work order management engine 330-3 may be configured to retrieve the one or more work orders from the database 370 at the scheduled start time specified in the obtained inputs for creating the work order.
[0089] Further, the transmitting engine 330-5 may be configured to transmit the retrieved one or more work orders to each electronic device 110 over the network connection to execute the video test. Furthermore, the reception engine 330-2 may be configured to receive data including the execution results of the video test from each electronic device 110.
[0090] In an embodiment, the report generation engine 330-4 is configured to generate, based on the received data including the execution results of the video test, a performance report including the video quality metrics related to the execution of the video test. For instance, based on the data including the execution results of the video test, the report generation engine 330-4 generates the performance report including actionable insights into the video quality metrics during streaming of the video and into the quality and stability of the network connection during video playback at the electronic device 110.
[0091] In a non-limiting example, a table within the performance report can represent these performance and video quality metrics in a structured format, allowing the end users or the network engineers to evaluate video playback quality and the network performance during the video tests. The table may include multiple columns representing different test execution parameters, such as the network type (example, 4G, 5G, Wi-Fi, etc.), execution timestamp, and specific video quality indicators. An exemplary table is shown below depicting an example of the performance report of the video tests:
Table 1
[0092] It should be understood that the above table is provided merely as a non-limiting example, and the performance report of the video tests may include additional or alternative metrics, indicators, or representations as required for specific implementations. The structure and contents of the table may be modified based on the network conditions, test parameters, or reporting preferences without departing from the scope of the present disclosure.
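A tabular performance report of the kind described above can be sketched as a simple CSV rendering. The column schema below follows the columns named in the description (network type, execution timestamp, video quality indicators), but the exact column names and the single data row are illustrative assumptions.

```python
import csv
import io

# Assumed schema for illustration; the actual report columns may differ.
COLUMNS = ["device_id", "network_type", "execution_timestamp",
           "stall_events", "freezing_ratio", "buffering_sec", "load_sec"]

def render_report(rows):
    """Render aggregated test results as a CSV performance report."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

report = render_report([{
    "device_id": "STB-2025-XYZ123456789", "network_type": "Wi-Fi",
    "execution_timestamp": "2025-01-15T20:00:00", "stall_events": 2,
    "freezing_ratio": 0.05, "buffering_sec": 1.2, "load_sec": 0.8,
}])
```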
[0093] In some embodiments, the execution engine 330-6 may be configured to detect bottlenecks in the user network based on a continuous analysis of video testing data and to assess how the server 140 can handle errors, such as network interruptions or server failures. This helps in optimizing the user network to avoid any bottlenecks. This also helps in assessing an ability of the user network to recover and continue content playback without disruptions.
[0094] In an embodiment, the execution engine 330-6 may determine a quality of the video content being streamed on the electronic device(s) 110 based on an analysis of the data including the execution results of the video tests. Further, the execution engine 330-6 may control one or more settings of the electronic device(s) 110 by sending a control signal to the electronic device(s) 110 to adjust the quality of the video content being streamed. The control may be performed based on a current network status of the electronic device(s) 110.
[0095] In an embodiment, the report generation engine 330-4 may be further configured to determine a performance of the network connection with respect to one or more video parameters based on the execution results of the video tests. The one or more video parameters include the buffering time, a count of the video stalling events, the video freezing ratio, and the video load time associated with the videos on which the video tests are executed. The report generation engine 330-4 may also be configured to synchronize the execution results of the video tests in the database 370.
[0096] The communication unit 350 includes an electronic circuit specific to a standard that enables wired or wireless communication. The communication unit 350 is configured to communicate internally between internal hardware components and with external devices via one or more networks. The communication unit 350 may include a communication interface which may be configured to enable the server 140 to communicate with various entities of the system 100 (such as the electronic device 110, the network management device 160, and in some scenarios external network devices) through backhaul connection (e.g., wired backhaul or wireless backhaul) or a network. Examples of the communication unit 350 may include, but are not limited to, a modem, a network interface such as an Ethernet card, a communication port, and/or a Personal Computer Memory Card International Association (PCMCIA) slot and card, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, and a local buffer circuit. It will be apparent to a person of ordinary skill in the art that the communication unit 350 may include any device and/or apparatus capable of providing wireless or wired communications between the network management device 160 and the server 140 and other components of the system 100.
[0097] The console host 360 may include suitable logic, circuitry, interfaces, and/or codes that may be configured to enable the I/O interface 310 to receive input(s) and/or present output(s). In some aspects of the present disclosure, the console host 360 may include suitable logic, instructions, and/or codes for executing various operations of one or more computer executable applications to host the console on the network management device 160, by way of which the network engineer can trigger the server 140 to create the work orders and schedule the video tests on the electronic device 110. In some other aspects of the present disclosure, the console host 360 may provide a Graphical User Interface (GUI) for the server 140 for user interaction.
[0098] Various engines of the data processing circuitry 330 are presented to illustrate the functionality driven by the server 140. It will be apparent to a person having ordinary skill in the art that various engines of the data processing circuitry 330 are for illustrative purposes and not limited to any specific combination of hardware circuitry and/or software.
[0099] Although FIG. 3 illustrates one example of the system architecture of the server 140, various changes may be made to FIG. 3. Further, the server 140 may include any number of components in addition to those shown in FIG. 3, without deviating from the scope of the present disclosure. Further, various components in FIG. 3 may be combined, further subdivided, or omitted, and additional components may be added according to particular needs.
[0100] FIG. 4 illustrates a flowchart depicting a method 400 for creating the work orders for execution of the video tests over the network connections of the electronic device 110, in accordance with an exemplary embodiment of the present disclosure. The method 400 comprises a series of operation steps indicated by blocks 402 through 416.
[0101] Example blocks 402 through 416 of the method 400 are performed by one or more components of the server 140 as disclosed in FIG. 3, for creating the work orders and generating the performance report including the video quality metrics related to the execution of the video tests over the network connections of the electronic device 110. Although the method 400 shows the example blocks of operation steps 402 through 416, in some embodiments, the method 400 may include additional steps, fewer steps or steps in different order than those depicted in FIG. 4. In other embodiments, the steps 402 through 416 may be combined or may be performed in parallel. The method 400 starts at block 402.
[0102] At block 402, the display engine 330-1 controls the application interface of the network management device 160 to display the work order window including the options for creating the work orders for execution of the video tests over the network connection of the electronic device 110.
[0103] At block 404, the reception engine 330-2 obtains the inputs corresponding to the options displayed in the work order window. Some example UIs depicting the process of navigating to the work order window and obtaining the inputs corresponding to the options in the work order window are described below with reference to FIG. 5 through FIG. 9.
[0104] FIG. 5 illustrates an example UI 500 for navigating to a work order module, in accordance with an embodiment of the present disclosure. The UI 500 may include a cognitive platform having a plurality of options provided to an end user and/or the network engineer. Once the end user or the network engineer selects the option corresponding to the work order module, the display engine 330-1 controls the application interface of the network management device 160 to navigate to another page depicted by an example UI 600 (as shown in FIG. 6 described below).
[0105] FIG. 6 illustrates the example UI 600 for navigating to an application work order, in accordance with an embodiment of the present disclosure. Once the end user or the network engineer selects the option corresponding to the application work order shown in the UI 600, the display engine 330-1 controls the application interface of the network management device 160 to navigate to another page depicted by an example UI 700 (as shown in FIG. 7 described below).
[0106] FIG. 7 illustrates the example UI 700 including an option for creating the work orders for execution of the video tests at the electronic device 110, in accordance with an embodiment of the present disclosure. As shown in FIG. 7, the UI 700 provides an option 702 for creating the work orders for execution of the video tests at the electronic device 110. Once the end user or the network engineer selects the option 702 corresponding to the create work order shown in the example UI 700, the display engine 330-1 controls the application interface of the network management device 160 to navigate to another page depicted by an example UI 800 (as shown in FIG. 8 described below).
[0107] FIG. 8 illustrates the example UI 800 including options for receiving the user inputs to create the work orders for the execution of the video tests, in accordance with an embodiment of the present disclosure. The UI 800 provides multiple options to the end user or the network engineer for configuring a test script for executing the video tests. As shown in FIG. 8, the options may include, but are not limited to, an option for selecting one or more video streaming applications, an option for selecting a number of iterations for executing the video test, an option for inputting the URLs of the one or more videos available in the one or more video-streaming applications, an option for selecting a streaming resolution of the one or more videos, an option for setting the start time of the execution of the video test, and an option for setting the test duration for executing the video test.
[0108] In an embodiment, FIG. 9 illustrates an example UI 900 including options for receiving the user inputs to create the work order for one or more electronic devices using device Identifiers (IDs) of the electronic device 110, in accordance with an embodiment of the present disclosure. As shown in FIG. 9, the UI 900 provides multiple options to the end user or the network engineer for configuring the test script for executing the video tests. The options include, but are not limited to, an option for selecting a recipe test (i.e., the video test), the option for selecting the date and time period for executing the video test, the option for selecting the number of iterations for the video test, the option for setting the start time of the video test, the option for setting the test duration of the video test, and an option for adding the electronic devices (i.e., STBs) manually by entering a unique device ID of each electronic device 110 or by uploading a file including a list of unique device IDs of the electronic devices 110. Once the end user or the network engineer selects all of these options and enters the device IDs of the electronic devices, the work order management engine 330-3 creates the test script (i.e., the work order) for executing the video tests and assigns the created test script to the electronic devices 110 in accordance with their unique device IDs uploaded or entered by the network engineer for executing the video test. The unique device IDs may correspond to, but are not limited to, a Media Access Control (MAC) address based identifier, a universally unique identifier, a serial number based identifier, and the like. For example, the MAC address based identifier may be represented as AA:BB:CC:DD:EE:FF and the serial number based identifier may be represented as STB-2025-XYZ123456789.
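The device-ID formats mentioned above can be distinguished with a simple classification sketch. The regular expressions below are illustrative assumptions modeled on the example identifiers given (a colon-separated MAC address and an STB serial number), not a definitive specification of the ID formats.

```python
import re

# Assumed formats, modeled on the examples AA:BB:CC:DD:EE:FF and STB-2025-XYZ123456789.
MAC_RE = re.compile(r"^([0-9A-F]{2}:){5}[0-9A-F]{2}$", re.IGNORECASE)
SERIAL_RE = re.compile(r"^STB-\d{4}-[A-Z0-9]+$")

def classify_device_id(device_id: str) -> str:
    """Classify a unique device ID as MAC-based, serial-based, or unknown."""
    if MAC_RE.match(device_id):
        return "mac"
    if SERIAL_RE.match(device_id):
        return "serial"
    return "unknown"
```

Such a check could run when the network engineer enters or uploads the list of device IDs, before the work order is assigned.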
[0109] In an embodiment, the work order management engine 330-3 identifies each electronic device 110 based on the respective unique device IDs of the electronic devices 110, and creates a user profile of the one or more users associated with each electronic device 110 based on the identification and backend data associated with the identified electronic devices.
[0110] Once the set of inputs is obtained by the reception engine 330-2, the flow of the method 400 proceeds to block 406.
[0111] At block 406, the work order management engine 330-3 creates the work orders related to the video test based on the obtained set of inputs.
[0112] Further, the work order management engine 330-3 stores the created work orders in the database 370 (at block 408), and retrieves the created work orders from the database 370 at the scheduled start time (at block 410).
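The store-then-retrieve-at-start-time behaviour of blocks 408 and 410 can be sketched as follows. This is a hypothetical in-memory stand-in for the database 370; the field names and types of the work-order record are illustrative assumptions, as the actual schema is not specified at this level of detail:

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class WorkOrder:
    # Illustrative fields drawn from the set of inputs described above.
    device_ids: list
    start_time: datetime
    iterations: int = 1
    duration_s: int = 60


class WorkOrderStore:
    """Hypothetical in-memory stand-in for the database 370."""

    def __init__(self) -> None:
        self._orders: list = []

    def store(self, order: WorkOrder) -> None:
        """Block 408: persist a newly created work order."""
        self._orders.append(order)

    def retrieve_due(self, now: datetime) -> list:
        """Block 410: return (and remove) work orders whose scheduled
        start time has arrived."""
        due = [o for o in self._orders if o.start_time <= now]
        self._orders = [o for o in self._orders if o.start_time > now]
        return due
```

A real deployment would use a persistent database and a scheduler polling or waking at the scheduled start times.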
[0113] Furthermore, at block 412, the transmitting engine 330-5 transmits, over the network connection, the retrieved work orders to each electronic device 110 to execute the video test. The retrieved work orders are transmitted to each electronic device 110 based on the respective unique identifier associated with the corresponding electronic device 110 in the created work orders.
[0114] Thereafter, at block 414, the reception engine 330-2 receives the data including the execution results of the video test from each electronic device 110 in response to the transmitted work orders.
[0115] Once the execution results of the video test are received by the reception engine 330-2 of the server 140 from each electronic device 110, at block 416, the report generation engine 330-4 of the server 140 generates the performance report, including the video quality metrics and indicators related to the execution of the video test, using the video test data received from each electronic device 110.
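As a sketch only, block 416 can be thought of as reducing the per-device results into report-level metrics. The result-dictionary keys and the averaging choice below are hypothetical; the actual metric names and report format are not specified at this level of detail:

```python
from statistics import mean


def generate_performance_report(results: list) -> dict:
    """Aggregate per-device execution results into report-level metrics.

    Each item in `results` is assumed, for illustration, to look like:
    {"device_id": "...", "stall_events": 2, "freeze_ratio": 0.01,
     "buffering_time_s": 1.4, "load_time_s": 0.9}
    """
    metrics = ("stall_events", "freeze_ratio", "buffering_time_s", "load_time_s")
    report = {"devices_tested": len(results)}
    for m in metrics:
        # Average each metric across all tested devices.
        report[f"avg_{m}"] = mean(r[m] for r in results)
    return report
```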
[0116] In an embodiment, the display engine 330-1 may further be configured to provide an interface on the network management device 160 for downloading the generated performance report. FIG. 10 illustrates an example UI 1000 for downloading the performance report corresponding to the executed video tests, in accordance with an example embodiment of the present disclosure. For each of the executed video tests, the report generation engine 330-4 may collect the test results and generate the performance report. The UI 1000 as shown in FIG. 10 depicts an example dashboard including a list of generated performance reports (for example, reports 1 through 8) along with an option using which the end user or the network engineer can download the performance report including all the measurement results and the video quality metrics generated based on the execution of the video tests.
[0117] FIG. 11 illustrates a flowchart depicting a method 1100 for executing the video tests over the network connection of the electronic device 110, in accordance with an exemplary embodiment of the present disclosure. The method 1100 comprises a series of operation steps indicated by blocks 1102 through 1114.
[0118] Example blocks 1102 through 1112 of the method 1100 are performed by one or more components of the electronic device 110 as disclosed in FIG. 2, for executing the video tests over the network connections of the electronic device 110 based on the one or more work orders received from the server 140. Although the method 1100 shows the example blocks of operation steps 1102 through 1114, in some embodiments, the method 1100 may include additional steps, fewer steps, or steps in a different order than those depicted in FIG. 11. In other embodiments, the steps 1102 through 1112 may be combined or may be performed in parallel. The method 1100 starts at block 1102.
[0119] At block 1102, the transceiver module 220 receives the one or more work orders from the server 140 to execute the video tests.
[0120] At block 1104, the processor 210 parses the work orders received from the server 140 to extract the test parameters for executing the video tests.
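Block 1104 can be sketched as below, assuming for illustration that a work order is delivered as a JSON payload; the wire format and the parameter field names are assumptions, not part of the disclosure:

```python
import json

# Illustrative parameter names corresponding to the test parameters
# described in the disclosure (URLs, resolution, start time, iterations,
# duration); the actual field names are an assumption.
REQUIRED_PARAMETERS = (
    "video_urls", "resolution", "start_time", "iterations", "duration_s",
)


def parse_work_order(payload: str) -> dict:
    """Parse a work-order payload and extract the test parameters,
    rejecting orders with missing parameters."""
    order = json.loads(payload)
    missing = [k for k in REQUIRED_PARAMETERS if k not in order]
    if missing:
        raise ValueError(f"work order missing parameters: {missing}")
    return {k: order[k] for k in REQUIRED_PARAMETERS}
```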
[0121] At block 1106, the processor 210 schedules the video tests at the scheduled start time using the alarm manager. In particular, the processor 210 may trigger the execution of the video tests by sending the API request to the alarm manager of the electronic device 110.
[0122] At block 1108, the processor 210 retrieves the video files from the one or more application servers using the URLs of the one or more videos included in the extracted test parameters.
[0123] At block 1110, the processor 210 executes the video tests over the network connection at the scheduled start time using the adaptive bitrate video player 235. The processor 210 executes the video tests iteratively based on the number of iterations specified in the work orders, and measures video quality metrics such as, but not limited to, the video stalling events during streaming of the video files, the video freezing ratio during the video playback, the buffering time, the video load time during streaming of the video, average bitrate, tested resolution, video playback success rate, and the like.
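One of the metrics above, the video freezing ratio, is described elsewhere in this disclosure as being calculated from the total video freeze time and the video duration. A minimal sketch of that calculation follows; the function and argument names are illustrative:

```python
def video_freezing_ratio(freeze_intervals_s: list, video_duration_s: float) -> float:
    """Freezing ratio = total freeze time / video duration.

    `freeze_intervals_s` holds the duration, in seconds, of each freeze
    event detected during playback of the video file.
    """
    if video_duration_s <= 0:
        raise ValueError("video duration must be positive")
    return sum(freeze_intervals_s) / video_duration_s
```

For example, two freezes totalling 1.5 seconds during a 30-second playback would yield a freezing ratio of 0.05.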
[0124] At block 1112, the processor 210 aggregates the execution results of the video tests, including one or more of the video stalling events, the video freezing ratio, the buffering time, the video load time, the average bitrate, the tested resolution, and the video playback success rate. In particular, the processor 210 consolidates all the execution results of the video tests.
[0125] At block 1114, the processor 210 transmits the data including the aggregated execution results of the video test to the server 140 for generating the performance report indicating the video quality metrics.
[0126] Embodiments of the present technology may be described herein with reference to flowchart illustrations of methods and systems according to embodiments of the technology, and/or procedures, algorithms, steps, operations, formulae, or other computational depictions, which may also be implemented as computer program products. In this regard, each block or step of the flowchart, and combinations of blocks (and/or steps) in the flowchart, as well as any procedure, algorithm, step, operation, formula, or computational depiction can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions embodied in computer-readable program code. As will be appreciated, any such computer program instructions may be executed by one or more computer processors, including without limitation a general-purpose computer or special purpose computer, or other programmable processing apparatus to perform a group of operations comprising the operations or blocks described in connection with the disclosed methods.
[0127] Further, these computer program instructions, such as embodied in computer-readable program code, may also be stored in one or more computer-readable memory or memory devices (for example, the memory 215 or 320) that can direct a computer processor or other programmable processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory or memory devices produce an article of manufacture including instruction means which implement the function specified in the block(s) of the flowchart(s).
[0128] It will further be appreciated that the term “computer program instructions” as used herein refers to one or more instructions that can be executed by the one or more processors (for example, the processor 210 or data processing circuitry 330) to perform one or more functions as described herein. The instructions may also be stored remotely such as on a server, or all or a portion of the instructions can be stored locally and remotely.
[0129] One or more embodiments disclosed herein may provide one or more technical advantages and other advantages. The embodiments disclosed herein provide an efficient mechanism for automating and remotely executing the video tests on the user network, thereby enabling organizations and individuals to continuously monitor and remotely assess the performance of the communication networks. Further, according to the one or more embodiments disclosed herein, the video test is executed on the user network connection remotely and without any manual intervention; thus, the disclosed method and system can facilitate the organizations to proactively manage the QoS of the communication networks and enhance user satisfaction by ensuring consistent and reliable internet connectivity.
[0130] Further, certain embodiments disclosed herein can facilitate adaptive bitrate streaming to enable an adjustment in video quality based on a viewer's network conditions. Further, the video test running in the background ensures that the adaptive mechanism functions as intended.
[0131] Furthermore, in some embodiments, the system and method enables the execution of video tests to enhance the user experience and to deliver high-quality content that meets user expectations, by continuously monitoring and optimizing video streaming service.
[0132] Moreover, the disclosed system and method measures network performance to enable analysis of the data collected as a result of the execution of the video tests, and logging and reporting of metrics related to the network speed, video quality, and any performance issues. This information helps in identifying and addressing network performance issues. Also, the disclosed system and method can simulate realistic usage scenarios, such as multiple users streaming content simultaneously or variations in network conditions, by scheduling background tests to mimic the unpredictability of real-world situations.
[0133] Additionally, the disclosed system and method collects data associated with the performed video tests and analyzes the collected data to identify metrics related to the video quality. Based on the analysis, the system and method can efficiently identify any performance issues that may be logged and reported to the end user or the network engineer. This information may also help the network engineers in identifying and addressing problems associated with the network connection.
[0134] Certain embodiments of the present disclosure may facilitate a field test engineer in real time identification and analysis of the performance of the user network without carrying any system, facilitating real time troubleshooting of network issues, facilitating monitoring of geography wise network performance, and enabling consumers to select a correct network operator based on the network coverage.
[0135] Those skilled in the art will appreciate that the methodology described herein in the present disclosure may be carried out in other specific ways than those set forth herein in the above disclosed embodiments without departing from essential characteristics and features of the present invention. The above-described embodiments are therefore to be construed in all aspects as illustrative and not restrictive.
[0136] The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Any combination of the above features and functionalities may be used in accordance with one or more embodiments.
[0137] In the present disclosure, each of the embodiments has been described with reference to numerous specific details which may vary from embodiment to embodiment. The foregoing description of the specific embodiments disclosed herein may reveal the general nature of the embodiments herein that others may, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications are intended to be comprehended within the meaning of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and is not limited in scope.
LIST OF REFERENCE NUMERALS
[0138] The following list is provided for convenience and in support of the drawing figures and as part of the text of the specification, which describe innovations by reference to multiple items. Items not listed here may nonetheless be part of a given embodiment. For better legibility of the text, a given reference number is recited near some, but not all, recitations of the referenced item in the text. The same reference number may be used with reference to different examples or different instances of a given item. The list of reference numerals is:
100 - System
110 - Electronic device
120 - Network
130 - Load balancer
140 - Server
150 - Database
160 - Network Management Device
1 through N - Test servers
210 - Processor
215 - Memory
220 - Transceiver module
225 - Interface(s)
230 - Processing Engine(s)/module(s)
235 – Adaptive bitrate video player
250 - Communication bus
310 - Input-Output (I/O) interface
320 - Memory
320A - Instructions
330 - Data processing circuitry
330-1 - Display engine
330-2 - Reception engine
330-3 - Work order management engine
330-4 - Report generation engine
330-5 - Transmitting engine
330-6 – Execution Engine
340 - Second communication bus
350 - Communication Unit
360 - Console host
370 - Database
380 - First communication bus
400 - Method for creating the work orders for execution of the video tests
500 - Example UI for navigating to a work order module
600 - Example UI for navigating to an application work order
700 - Example UI including option for creating the work orders
702 - Option for creating the work order
800 - Example UI including options for receiving user inputs
900 – Another example UI including options for receiving user inputs
1000 - Example UI for downloading the performance report
1100 - Method for executing the video tests over the network connection at the electronic device
CLAIMS:
We claim:
1. A method (400) for executing a video test over a network connection, the method comprising:
controlling, by a display engine (330-1), an application interface to display a work order window including a plurality of options for creating at least one work order;
obtaining, by a reception engine (330-2), a set of inputs corresponding to the plurality of options in the work order window, wherein the set of inputs includes information of a scheduled start time for executing the video test and a test duration for executing the video test;
creating, by a work order management engine (330-3) based on the obtained set of inputs, the at least one work order related to the video test;
storing, by the work order management engine (330-3), the at least one work order in a database (370);
retrieving, by the work order management engine (330-3), the at least one work order from the database (370) at the scheduled start time;
transmitting, by a transmitting engine (330-5) over the network connection, the retrieved at least one work order to one or more client devices (110) to execute the video test;
receiving, by the reception engine (330-2), data including execution results of the video test from the one or more client devices (110); and
generating, by a report generation engine (330-4) based on the received data, a performance report including performance metrics and indicators related to the execution of the video test.
2. The method (400) as claimed in claim 1, wherein
the set of inputs further includes a unique identifier associated with each client device among the one or more client devices (110),
the retrieved at least one work order is transmitted to the one or more client devices (110) based on the unique identifier associated with corresponding client device (110), and
the video test is executed at the one or more client devices (110) over the network connection till completion of the test duration for each iteration.
3. The method (400) as claimed in claim 1, wherein
a quality of video content being streamed on the one or more client devices (110) is determined based on an analysis of the data including execution results of the video test, and
the method comprises controlling, by an execution engine (330-6) based on a current network status of the one or more client devices (110), one or more settings of the one or more client devices (110) by sending a control signal to the one or more client devices (110) to adjust the quality of the video content being streamed.
4. The method (400) as claimed in claim 1, wherein the plurality of options includes at least an option for selecting one or more video streaming applications, an option for selecting a number of iterations for executing the video test, an option for inputting Uniform Resource Locators (URLs) of one or more videos available in the one or more video streaming applications, an option for selecting a streaming resolution of the one or more videos, an option for setting the start time of the execution of the video test, and an option for setting the test duration for executing the video test.
5. The method (400) as claimed in claim 1,
wherein, based on the data including the execution results of the video test, a performance of the network connection is determined by the report generation engine (330-4) with respect to one or more video parameters, and
wherein the one or more video parameters include at least a buffering time, a count of one or more video stalling events, a video freezing ratio, and a video load time associated with one or more videos on which the video test is executed.
6. The method (400) as claimed in claim 1, wherein
the execution results of the executed video test are synchronized in the database (370) by the report generation engine (330-4), and
the video test is executed in background of the one or more client devices (110) using a video streaming application.
7. The method (400) as claimed in claim 1, wherein the video test is re-executed at the client device (110) if network fluctuations exceed a predefined threshold during the test duration of the video test.
8. A system (140) for executing a video test over a network connection, the system (140) comprising:
a communication interface (350) configured to establish a connection with a network management device (160) and a client device (110); and
a display engine (330-1) configured to control an application interface of the network management device (160) to display a work order window including a plurality of options for creating at least one work order;
a reception engine (330-2) configured to obtain a set of inputs corresponding to the plurality of options in the work order window, wherein the set of inputs includes information of a scheduled start time for executing the video test and a test duration for executing the video test;
a work order management engine (330-3) configured to:
create the at least one work order related to the video test based on the obtained set of inputs;
store the at least one work order in a database (370); and
retrieve the at least one work order from the database (370) at the scheduled start time;
a transmitting engine (330-5) configured to transmit, over the network connection, the retrieved at least one work order to one or more client devices (110) to execute the video test, wherein the reception engine (330-2) is further configured to receive data including execution results of the video test from the one or more client devices (110); and
a report generation engine (330-4) configured to generate, based on the received data, a performance report including performance metrics and indicators related to the execution of the video test.
9. The system (140) as claimed in claim 8, wherein
the set of inputs further includes a unique identifier associated with each client device among the one or more client devices,
the retrieved at least one work order is transmitted to the one or more client devices (110) based on the unique identifier associated with the corresponding client device (110), and
the video test is executed at the one or more client devices (110) over the network connection till the completion of the test duration for each iteration.
10. The system (140) as claimed in claim 8,
wherein a quality of video content being streamed on the one or more client devices is determined based on an analysis of the data including the execution results of the video test, and
the system further comprises an execution engine (330-6) configured to control, based on a current network status of the one or more client devices, one or more settings of the one or more client devices (110) by sending a control signal to the one or more client devices (110) to adjust the quality of the video content being streamed.
11. The system (140) as claimed in claim 8, wherein the plurality of options includes an option for selecting one or more video streaming applications, an option for selecting a number of iterations for executing the video test, an option for inputting Uniform Resource Locators (URLs) of one or more videos available in the one or more video streaming applications, an option for selecting a streaming resolution of the one or more videos, an option for setting the start time of the execution of the video test, and an option for setting the test duration for executing the video test.
12. The system (140) as claimed in claim 8,
wherein the report generation engine (330-4) is further configured to determine, based on the execution results of the video test, a performance of the network connection with respect to one or more video parameters, and
wherein the one or more video parameters include at least a buffering time, a count of one or more video stalling events, a video freezing ratio, and a video load time associated with one or more videos on which the video test is executed.
13. The system (140) as claimed in claim 8, wherein the report generation engine (330-4) is further configured to synchronize the execution results of the video test in the database (370), wherein the video test is executed in background of the one or more client devices (110) using a video streaming application.
14. The system (140) as claimed in claim 8, wherein the video test is re-executed at the client device (110) if network fluctuations exceed a predefined threshold during the test duration of the video test.
15. An electronic device (110), comprising:
one or more primary processors (210) communicatively coupled to one or more processors, the one or more primary processors (210) are coupled with a memory (215), wherein the memory (215) stores instructions which when executed by the one or more primary processors (210) causes the electronic device (110) to:
receive, from a server (140), at least one work order to execute a video test, wherein the work order includes test parameters specifying a video streaming application (235), Uniform Resource Locators (URLs) of one or more videos available in the video streaming application, a streaming resolution of the one or more videos, a scheduled start time for executing the video test, a number of iterations, and a test duration for executing the video test;
parse the received at least one work order to extract the test parameters for executing the video test;
schedule the video test at the scheduled start time using an alarm manager;
retrieve at least one video file using the URLs of the one or more videos included in the extracted test parameters;
execute, using the video streaming application (235), the video test at the scheduled start time by performing a plurality of execution operations that includes:
stream the at least one video file at a plurality of resolutions;
detect one or more video stalling events during streaming of the at least one video file;
calculate a video freezing ratio after streaming the at least one video file using total video freeze time and video duration; and
measure a buffering time and a video load time during streaming of the at least one video file;
aggregate execution results of the video test including the one or more video stalling events, the video freezing ratio, the buffering time, and the video load time; and
transmit data including the aggregated execution results of the video test to the server (140) for generating a performance report.