Abstract: A method for delivering an update to connected devices (102A-N) includes selecting test cases (404A-E) from a test repository (118) for testing stability of an updated version (106) of a product. A corresponding priority (402) to be assigned to each of the test cases (404A-E) is determined based on a corresponding variable score (302) assigned to each of the features associated with an older version (112) and/or the updated version (106). A sample set of connected devices (110A-N) including the updated version (106) is subjected to a stability test by executing the test cases (404A-E) on the sample set of connected devices (110A-N) for a designated time. A stability value associated with the updated version (106) is computed based on stability values of the sample set of connected devices (110A-N) and the test cases (404A-E) for automatically delivering the updated version (106) to at least one of the connected devices (102A-N).
Claims:
1. A method for delivering an update to a plurality of connected devices (102A-N), the method comprising:
selecting a plurality of test cases (404A-E) to be executed from a test repository (118) for testing stability of an updated version (106) of a product to be installed in at least one of the plurality of connected devices (102A-N), wherein the plurality of test cases (404A-E) are selected based on historical usage information of one or more features associated with an older version (112) of the product, one or more features associated with the updated version (106) of the product, or a combination thereof;
determining a corresponding priority (402) to be assigned to each of the selected test cases (404A-E) that are related to testing one or more of the features associated with the older version (112) and the features associated with the updated version (106) based on a corresponding variable score (302) assigned to each of the features associated with at least one of the older version (112) and the updated version (106);
subjecting a sample set of connected devices (110A-N) comprising the updated version (106) of the product to a stability test by executing the selected test cases (404A-E) at least once on the sample set of connected devices (110A-N) for a designated time;
determining a corresponding stability value (602) associated with each of the sample set of connected devices (110A-N) upon executing each of the selected test cases (404A-E) on the sample set of connected devices (110A-N);
determining a corresponding stability value (604) associated with each of the selected test cases (404A-E) based on the corresponding stability value (602) associated with each of the sample set of connected devices (110A-N), the corresponding priority assigned to each of the selected test cases (404A-E), and a total number of the sample set of connected devices (110A-N) that are being subjected to the stability test;
computing a stability value associated with the updated version (106) of the product based on the corresponding stability value (604) associated with each of the selected test cases (404A-E); and
automatically delivering the updated version (106) of the product to one or more of the plurality of the connected devices (102A-N) upon determining that the computed stability value of the updated version (106) exceeds a stability value associated with the older version (112) of the product.
2. The method as claimed in claim 1, further comprising:
identifying one or more defects that occur upon executing the selected test cases (404A-E) on the sample set of connected devices (110A-N) and a corresponding time between the start of the stability test and corresponding occurrences of each of the identified defects; and
determining a corresponding variable severity to be assigned to each of the identified defects depending upon corresponding types of the identified defects based on one or more designated rules.
3. The method as claimed in claim 2, wherein the corresponding variable severity is assigned for a defect selected from the identified defects based on one or more of a number of times the defect was reported by a plurality of users in the past when using a feature associated with the older version (112) of the product, reporting of the same defect from at least two test cases selected from the selected test cases (404A-E) during the stability test, repeated occurrences of the same defect when executing a test case selected from the selected test cases (404A-E) more than once during the stability test, and reporting of the defect from one or more features that are newly added in the updated version (106) and the features associated with the older version (112) that have been modified in the updated version (106).
4. The method as claimed in claim 3, further comprising determining the corresponding stability value (602) associated with a designated device selected from the sample set of connected devices (110A-N) when a designated test case from the selected test cases (404A-E) fails on the designated device based on the corresponding priority (402) associated with the designated test case, the corresponding time between the start of the stability test and an occurrence of a defect that is reported following an execution of the designated test case, the corresponding variable severity associated with the reported defect, and a total duration of the stability test.
5. The method as claimed in claim 1, wherein the historical usage information of the one or more features associated with the older version (112) of the product comprises historical information related to most frequently used features by a plurality of users in the past and most frequently used features identified in stability issues previously reported by the plurality of users in the past.
6. The method as claimed in claim 5, wherein the one or more features associated with the updated version (106) of the product comprise one or more features that are newly added in the updated version (106) and one or more features associated with the older version (112) that have been modified in the updated version (106).
7. The method as claimed in claim 6, further comprising selecting the plurality of test cases (404A-E) to be executed for testing the stability of the updated version (106) of the product based on one or more features that are available in both the older version (112) and the updated version (106).
8. The method as claimed in claim 7, further comprising assigning the corresponding variable score (302) to each of the most frequently used features, the most frequently used features identified in previously reported stability issues, the features that are newly added in the updated version (106), the features associated with the older version (112) that have been modified in the updated version (106), and the features that are available in both the older version (112) and the updated version (106) based on one or more designated rules.
9. A delivery system (100), comprising:
a sample set of connected devices (110A-N) that are representative of a plurality of connected devices (102A-N) and having an updated version (106) of a product to be subjected to a stability test before releasing the updated version (106) to at least one device selected from the plurality of connected devices (102A-N); and
a test automation system (108) that is communicatively coupled to the sample set of connected devices (110A-N) via a communication network (114) to execute the stability test, wherein the test automation system (108) is configured to:
select a plurality of test cases (404A-E) to be executed from a test repository (118) for testing stability of the updated version (106) of the product to be installed in at least one of the plurality of connected devices (102A-N), wherein the plurality of test cases (404A-E) are selected based on historical usage information of one or more features associated with an older version (112) of the product, one or more features associated with the updated version (106) of the product, or a combination thereof;
determine a corresponding priority (402) to be assigned to each of the selected test cases (404A-E) that are related to testing one or more of the features associated with the older version (112) and the features associated with the updated version (106) based on a corresponding variable score (302) assigned to each of the features associated with at least one of the older version (112) and the updated version (106);
execute the selected test cases (404A-E) at least once on the sample set of connected devices (110A-N) for a designated time;
determine a corresponding stability value (602) associated with each of the sample set of connected devices (110A-N) upon executing each of the selected test cases (404A-E) on the sample set of connected devices (110A-N);
determine a corresponding stability value (604) associated with each of the selected test cases (404A-E) based on the corresponding stability value (602) associated with each of the sample set of connected devices (110A-N), the corresponding priority assigned to each of the selected test cases (404A-E), and a total number of the sample set of connected devices (110A-N) that are being subjected to the stability test; and
compute a stability value associated with the updated version (106) of the product based on the corresponding stability value (604) associated with each of the selected test cases (404A-E), wherein the updated version (106) of the product is delivered to one or more of the plurality of the connected devices (102A-N) upon determining that the computed stability value of the updated version (106) exceeds a stability value associated with the older version (112) of the product.
10. The delivery system (100) as claimed in claim 9, wherein the delivery system (100) further comprises a defect analyzing system (119) that is configured to identify one or more defects that occur upon executing the selected test cases (404A-E) on the sample set of connected devices (110A-N) and a corresponding time between the start of the stability test and corresponding occurrences of each of the identified defects.
11. The delivery system (100) as claimed in claim 9, wherein the updated version (106) of the product is stored in a server (104) that is communicatively coupled to the test automation system (108) and the plurality of the connected devices (102A-N) via the communication network (114) and is configured to deliver the updated version (106) to one or more of the plurality of the connected devices (102A-N).
12. The delivery system (100) as claimed in claim 11, wherein the server (104) corresponds to an over-the-air server that is configured to deliver the updated version of the product to at least one of the plurality of connected devices (102A-N), the plurality of connected devices (102A-N) comprising one or more of mobile phones, set-top-boxes, portable computing devices, imaging devices, communication devices, and equipment control units.
Description:
BACKGROUND
[0001] Embodiments of the present specification relate generally to delivering updates to connected devices. More particularly, the present specification relates to a system and method for automatically delivering updates to connected devices based on a corresponding stability index.
[0002] A product development process requires maintaining the right balance between quality of a product and the time to bring the product to market. For example, one of the goals of a software development process may include improving functionalities of an existing software product by introducing new functionalities through release of a newer version of the software product.
[0003] Continuous integration and continuous delivery are two key processes that are typically used during development of the software product to reduce the time to market of the software product. Continuous integration focuses on integrating work from individual developers into a main repository multiple times in a selected period to identify integration bugs promptly and to accelerate collaborative development. Continuous delivery is an extension of continuous integration and relates to reducing friction in the release process by automating the steps required to deploy a software build so that code associated with the software product can be released safely at any time.
[0004] Though the time to market a product and availability of features associated with the product are competitive aspects by themselves, the success of the product is also influenced by the quality of the product. One way for identifying the quality of the product before the release is by subjecting the product to a stability test for assessing stability of the product. The stability test defines how the product functions over a specific period of time upon operating the product, for example, to its full capacity. The stability test is generally performed to verify the capacity of the product to work beyond a normal capacity and to understand how the product would work in real-life situations. If the product continues to operate normally without any failure for a longer than expected duration, the quality of the product may be considered to be high.
[0005] Typical statistics collected to determine the stability vary according to the nature of the product under test and the test procedures. Additionally, interpretation of test results varies according to the person who interprets them. A subjective measure of the stability may not provide full confidence regarding the product release decision. In addition, the absence of an objective measurement of the stability makes it difficult to compare various versions of the product and to identify a change in the stability across different versions. Hence, there is a need for objectively assessing the stability and automating the product release decision.
[0006] Various stability-testing methods have been developed to objectively determine the stability of the product. Histograms, reliability demonstrations, control charts, and the sequential probability ratio test (SPRT) are a few exemplary techniques that have been conventionally used to assist in a determination of the stability of a product. However, conventional stability tests require a long time, and hence, certain organizations may skip performing the stability tests, leading to uninformed decisions as to when to release the product. Hence, there is a need for an improved system and method to address the aforementioned needs and issues.
BRIEF DESCRIPTION
[0007] It is an objective of the present disclosure to provide a method for delivering an update to a plurality of connected devices. The method includes selecting a plurality of test cases to be executed from a test repository for testing stability of an updated version of a product to be installed in at least one of the plurality of connected devices. The plurality of test cases are selected based on historical usage information of one or more features associated with an older version of the product, one or more features associated with the updated version of the product, or a combination thereof. A corresponding priority to be assigned to each of the selected test cases that are related to testing one or more of the features associated with the older version and the features associated with the updated version is determined. The corresponding priority to be assigned to each of the selected test cases is determined based on a corresponding variable score assigned to each of the features associated with at least one of the older version and the updated version.
[0008] A sample set of connected devices including the updated version of the product is subjected to a stability test by executing the selected test cases at least once on the sample set of connected devices for a designated time. The corresponding stability value associated with each of the sample set of connected devices is determined upon executing each of the selected test cases on the sample set of connected devices. The corresponding stability value associated with each of the selected test cases is determined based on the corresponding stability value associated with each of the sample set of connected devices, the corresponding priority assigned to each of the selected test cases, and a total number of the sample set of connected devices that are being subjected to the stability test.
[0009] A stability value associated with the updated version of the product is computed based on the corresponding stability value associated with each of the selected test cases. The updated version of the product is delivered to one or more of the plurality of the connected devices upon determining that the computed stability value of the updated version exceeds a stability value associated with the older version of the product. The one or more defects that occur upon executing the selected test cases on the sample set of connected devices and a corresponding time between the start of the stability test and corresponding occurrences of each of the identified defects may be identified. A corresponding variable severity to be assigned to each of the identified defects may be determined depending upon corresponding types of the identified defects based on one or more designated rules.
[0010] The corresponding variable severity may be assigned for a defect selected from the identified defects based on a number of times the defect was reported by a plurality of users in the past when using a feature associated with the older version of the product. The corresponding variable severity may be assigned for a defect selected from the identified defects based on reporting of the same defect from at least two test cases selected from the selected test cases during the stability test. Further, the corresponding variable severity may be assigned for a defect selected from the identified defects based on repeated occurrences of the same defect when executing a test case selected from the selected test cases more than once during the stability test. In addition, the corresponding variable severity may be assigned for a defect selected from the identified defects based on reporting of the defect from one or more features that are newly added in the updated version and the features associated with the older version that have been modified in the updated version.
[0011] The corresponding stability value associated with a designated device selected from the sample set of connected devices may be determined when a designated test case from the selected test cases fails on the designated device based on the corresponding priority associated with the designated test case. The corresponding stability value associated with the designated device may be determined also based on the corresponding time between the start of the stability test and an occurrence of a defect that is reported during execution of the designated test case, the corresponding variable severity associated with the reported defect, and a total duration of the stability test. The historical usage information of the one or more features associated with the older version of the product may include historical information related to most frequently used features by a plurality of users in the past and most frequently used features identified in stability issues previously reported by the plurality of users in the past.
[0012] The one or more features associated with the updated version of the product may include one or more features that are newly added in the updated version and one or more features associated with the older version that have been modified in the updated version. The plurality of test cases to be executed for testing the stability of the updated version of the product may be selected based on one or more features that are available in both the older version and the updated version. The corresponding variable score may be assigned to each of the most frequently used features and the most frequently used features identified in previously reported stability issues based on one or more designated rules. The corresponding variable score may also be assigned to each of the features that are newly added in the updated version, the features associated with the older version that have been modified in the updated version, and the features that are available in both the older version and the updated version based on one or more designated rules.
[0013] It is another objective of the present disclosure to provide a delivery system. The delivery system includes a sample set of connected devices and a test automation system. The sample set of connected devices are representative of a plurality of connected devices and have an updated version of a product to be subjected to a stability test before releasing the updated version to at least one device selected from the plurality of connected devices. The test automation system is communicatively coupled to the sample set of connected devices via a communication network to execute the stability test. The test automation system is configured to select a plurality of test cases to be executed from a test repository for testing stability of the updated version of the product to be installed in at least one of the plurality of connected devices. The plurality of test cases are selected based on historical usage information of one or more features associated with an older version of the product, one or more features associated with the updated version of the product, or a combination thereof.
[0014] The test automation system determines a corresponding priority to be assigned to each of the selected test cases that are related to testing one or more of the features associated with the older version and the features associated with the updated version. The corresponding priority to be assigned to each of the selected test cases is determined based on a corresponding variable score assigned to each of the features associated with at least one of the older version and the updated version. Further, the test automation system executes the selected test cases at least once on the sample set of connected devices for a designated time. Additionally, the test automation system determines a corresponding stability value associated with each of the sample set of connected devices upon executing each of the selected test cases on the sample set of connected devices. The test automation system determines a corresponding stability value associated with each of the selected test cases based on the corresponding stability value associated with each of the sample set of connected devices, the corresponding priority assigned to each of the selected test cases, and a total number of the sample set of connected devices subjected to the stability test.
[0015] Moreover, the test automation system computes a stability value associated with the updated version of the product based on the corresponding stability value associated with each of the selected test cases. The updated version of the product is delivered to one or more of the plurality of the connected devices upon determining that the computed stability value of the updated version exceeds a stability value associated with the older version of the product. The delivery system may further include a defect analyzing system that is configured to identify one or more defects that occur upon executing the selected test cases on the sample set of connected devices and a corresponding time between the start of the stability test and corresponding occurrences of each of the identified defects. The updated version of the product is stored in a server that is communicatively coupled to the test automation system and the plurality of the connected devices via the communication network. The server may be configured to deliver the updated version to one or more of the plurality of the connected devices. The server may correspond to one or more of a mobile phone manufacturer’s server and an over-the-air server that are configured to deliver the updated version of the product to at least one of the plurality of connected devices including mobile phones and set-top-boxes, respectively.
DRAWINGS
[0016] These and other features, aspects, and advantages of the claimed subject matter will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0017] FIG. 1 is a system diagram that depicts an exemplary delivery system for automatically delivering a product update to one or more connected devices;
[0018] FIGS. 2A and 2B are flow diagrams that depict an exemplary method for delivering the product update to the one or more connected devices of FIG. 1;
[0019] FIG. 3 is a table view that depicts a corresponding variable score assigned to each of a plurality of features of the product to be tested during a stability test using the exemplary delivery system of FIG. 1;
[0020] FIG. 4 is another table view that depicts a corresponding priority determined for an exemplary set of selected test cases based on the corresponding variable score assigned to the features being tested in the selected test cases;
[0021] FIG. 5 is a table view that depicts an exemplary summary of test results associated with an execution of the set of selected test cases of FIG. 4; and
[0022] FIG. 6 is a table view that depicts a corresponding stability value computed for each of the exemplary set of selected test cases of FIG. 4 using an embodiment of the method depicted in FIGS. 2A and 2B.
DETAILED DESCRIPTION
[0023] The following description presents exemplary systems and methods for automatically delivering product updates to one or more connected devices over one or more communication networks. Particularly, embodiments described herein disclose systems and methods for computing a stability value of an updated version of a product being developed to identify the quality of the updated version and for taking a decision to automatically release the updated version to one or more connected devices.
[0024] In one embodiment, the present systems and methods enable computing the stability value of the updated version by subjecting the updated version of the product to a stability test. A stability test refers to a test that determines if the product functions over a specific period of time without failure upon operating the product to its full capacity. The present systems and methods enable automatic delivery of the updated version of the product to various connected devices over one or more communication networks when the computed stability value of the updated version exceeds a designated threshold value.
[0025] It may be noted that different embodiments of the present systems may be used to compute stability values associated with various types of products, for example, software products, hardware products, and firmware products. For example, the present delivery system may be used to compute a stability value associated with an updated version of a software that is to be delivered to a plurality of printing devices that operate based on an older version of the software. Similarly, in another example, the delivery system may be used to compute a stability value associated with an updated version of an embedded device for automatically taking a decision on releasing the updated version of the embedded device into a market. Though the delivery system is capable of computing stability values associated with various types of products, for clarity of explanation, an embodiment of the delivery system depicted in FIGS. 1-6 will be described herein with reference to computing a stability value of an updated firmware to be installed on set-top-boxes.
[0026] FIG. 1 depicts an exemplary delivery system (100) for delivering an updated version (106) of a software or a firmware to one or more connected devices (102A-N) from a server (104) over a communication network (114). In one embodiment, each of the one or more connected devices (102A-N) includes an older version (112) of the software or the firmware that needs to be replaced with the updated version (106) to mitigate any technical issues associated with the older version (112) and/or to provide new functionalities. However, as the firmware update or the software update may affect the availability and functionality of the connected devices (102A-N), the delivery system (100) is configured to selectively deliver the firmware update from the server (104) to the one or more connected devices (102A-N) only after completion of necessary testing, validation, and verification. To that end, the delivery system (100) is provided with a test automation system (108) and a sample set of connected devices (110A-N).
[0027] In certain embodiments, the one or more connected devices (102A-N) are associated with a plurality of end users (not shown in FIG. 1). Examples of the one or more connected devices (102A-N) include a cellular phone, a laptop, a tablet-computing device, a desktop computer, a set-top-box, customer premises equipment, and/or any other processor enabled devices. As previously noted, each of the one or more connected devices (102A-N) includes the older version (112) of the firmware that needs to be updated. Furthermore, each of these connected devices (102A-N) is communicatively coupled to the server (104) that is configured to store the updated version (106) of the firmware. In an embodiment where the connected devices (102A-N) include mobile phones, the server (104) corresponds to a phone manufacturer’s server that is configured to store and provide an updated version of the operating system to the mobile phones. In another embodiment where the connected devices (102A-N) include set-top-box devices, the server (104) corresponds to an over-the-air server that is configured to store and provide an updated version of the operating system to the set-top-box devices. In one embodiment, the delivery system (100) may be communicatively coupled to the test automation system (108) and the sample set of connected devices (110A-N) via the communication network (114).
[0028] In one embodiment, the sample set of devices (110A-N) are representative of the one or more connected devices (102A-N) having the updated version (106) of the firmware to enable testing one or more functionalities of the updated version (106) before delivering the updated version (106) to the one or more connected devices (102A-N). In certain embodiments, the updated version (106) includes at least one new feature that is not present in the older version (112) and/or at least one existing feature that has been modified in the updated version (106).
[0029] For example, the older version (112) of a firmware running in the connected devices (102A-N) (e.g., set-top-boxes) may include an exemplary feature that allows a user to add a set of programs under a favorite programs list. In this example, the updated version (106) of the firmware may provide a new feature that allows the user to auto record the set of favorite programs based on a preset record time. The updated version (106) may also enable modifications over an existing feature. For example, an existing feature that allows the user to create the favorite programs list may be modified such that the user can prioritize his/her favorite programs in the list. Prioritizing the favorite programs enables the set-top-box to determine which program needs to be recorded on a high-priority basis when a storage device plugged into the set-top-box has a limited memory to store all favorite programs.
[0030] In certain embodiments, the sample set of devices (110A-N) are in communication with the test automation system (108) and the server (104) to conduct a stability test of the updated version (106) prior to delivering the updated version (106) to the connected devices (102A-N). In particular, the test automation system (108) performs the stability test by executing one or more test cases related to the new and/or updated features on the sample set of devices (110A-N) and computes a stability value of the updated version (106) based on test results. Further, in one embodiment, the test automation system (108) enables the server (104) to take a decision regarding the release of the updated version (106) to the one or more connected devices (102A-N) based on the computed stability value. In an embodiment, the test automation system (108) is a processor-enabled device, for example, a laptop, a tablet-computing device, a desktop computer, a cell phone, or a server. The test automation system (108) and the sample set of devices (110A-N) communicate with each other using the communication network (114). Examples of the communication network (114) include, but are not limited to, a Wi-Fi network connection, an Ethernet connection, and a cellular data network connection.
[0031] In one embodiment, the test automation system (108) includes a test controller (116). The test controller (116) selects test cases to be executed on the sample set of connected devices (110A-N) from a test case repository (118) in order to compute a stability value associated with the updated version (106). In certain embodiments, the test controller (116) selects the test cases based on historical usage information corresponding to one or more features associated with the older version (112) of the firmware and one or more features associated with the updated version (106) of the firmware. The test controller (116) also determines a corresponding priority to be assigned to each of the selected test cases based on one or more designated rules, as described in detail with reference to FIGS. 2A and 2B. In addition, the test controller (116) executes the selected test cases repeatedly on each of the sample set of connected devices (110A-N) for a designated time. In certain embodiments, a graphical user interface associated with the test automation system (108) enables a user to input the designated time for which the selected test cases are to be executed repeatedly on the sample set of connected devices (110A-N). Alternatively, the test automation system (108) automatically selects the designated time based on previous test runs and/or preprogrammed instructions.
[0032] In one embodiment, the delivery system (100) further includes a defect analyzing system (119) that identifies one or more defects that occur upon executing the selected test cases repeatedly on the sample set of connected devices (110A-N) for the designated time. In certain embodiments, the defect analyzing system (119) may be implemented by suitable code on a processor-based system. Examples of such processor-based systems include, but are not limited to, a microprocessor, a general-purpose computer, a special-purpose computer, a field programmable gate array, a programmable logic array, and graphics processing units. In one embodiment, the defect analyzing system (119) is communicatively coupled to the sample set of connected devices (110A-N) and the test automation system (108) via the communication network (114). The defect analyzing system (119) identifies different types of defects and communicates the identified defect types to the test automation system (108) via the communication network (114).
[0033] For example, the updated version (106) of the firmware may be configured to enable a new feature that supports streaming of high definition videos, whereas the older version (112) supports only streaming of standard definition videos. In this example, the updated version (106) of the firmware is first installed in the sample set of set-top-boxes (110A-N), and the test controller (116) selects one or more test cases based on historical usage information of features associated with the older version (112) and features associated with the updated version (106) for conducting the stability test. Examples of the historical usage information of the features associated with the older version (112) include the features that are most frequently used by a plurality of users in the past and the features that are most widely used in previously reported stability issues.
[0034] Examples of the most frequently used features in the older version (112) include a content record feature that allows a user to record standard definition videos, a pause feature that allows the user to pause live television programs, a program rewind feature that allows the user to rewind programs, etc. An example of a feature most widely used in previously reported stability issues is a remote control connect feature that allows the user to use his/her smart phone as a remote control to execute channel changing functionalities in the set-top-boxes. An example of a feature associated with the updated version (106) is a high definition video feature that enables streaming of high definition videos having 1080 horizontal lines of vertical resolution.
[0035] Subsequent to the test case selection, the test controller (116) executes the selected test cases on the sample set of set-top-boxes (110A-N) repeatedly for a designated time to test the stability of the features associated with the selected test cases. Further, the defect analyzing system (119) may identify one or more defects that occur upon executing the selected test cases on the sample set of set-top-boxes (110A-N) for the designated time. Examples of the identified defects include, but are not limited to, automatic or unintentional reboot of the set-top-boxes (110A-N), user interface (UI) freezes, blurring, pixelation, and macroblocking.
[0036] In an exemplary embodiment, the defect analyzing system (119) identifies one or more UI freezes that have occurred when the sample set of set-top-boxes (110A-N) present one or more received high definition videos on television screens. The defect analyzing system (119) may identify UI freezes by continuously performing image-processing analysis on frames associated with the high definition videos and by determining that scene information associated with the analyzed frames is static for more than a designated threshold time. Apart from UI freezes, the defect analyzing system (119) may be configured to identify various other types of defects. The methodologies by which the defect analyzing system (119) identifies the defects are not explained here in detail for the sake of simplicity.
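For illustration only, the following is a minimal sketch of one possible frame-difference check of the kind described above, assuming captured frames are available as grayscale numpy arrays; the function name, sampling interface, and threshold values are illustrative assumptions and do not form part of the specification.

import numpy as np

# Illustrative sketch of the UI-freeze check described above: frames captured
# from a set-top-box output are compared, and a freeze is flagged when the
# scene stays static for longer than a designated threshold time.
# The threshold values below are assumed for illustration.

FREEZE_THRESHOLD_SECONDS = 30   # designated threshold time (assumed value)
PIXEL_DIFF_TOLERANCE = 2.0      # mean absolute pixel difference treated as "static"

def detect_ui_freeze(frames, fps):
    """Return True if consecutive frames remain static longer than the threshold.

    `frames` is an iterable of grayscale frames (2-D numpy arrays) sampled at
    `fps` frames per second from the device output (hypothetical interface).
    """
    static_frames = 0
    previous = None
    for frame in frames:
        if previous is not None:
            diff = np.mean(np.abs(frame.astype(float) - previous.astype(float)))
            static_frames = static_frames + 1 if diff < PIXEL_DIFF_TOLERANCE else 0
            if static_frames / fps > FREEZE_THRESHOLD_SECONDS:
                return True
        previous = frame
    return False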
[0037] In certain embodiments, the defect analyzing system (119) identifies a corresponding time instant and duration between the start of the stability test and occurrences of each of the identified defects. In certain embodiments, the defect analyzing system (119) communicates the identified defects and the corresponding time information to the test automation system (108) via the communication network (114). The test automation system (108) receives the identified defects and the corresponding time information from the defect analyzing system (119), and configures the test controller (116) to generate a test execution report based on the received defect and time information. The test execution report may include one or more corresponding test cases that passed, one or more corresponding test cases that failed, and one or more corresponding defects that occurred during execution of the selected test cases on each of the sample set of connected devices (110A-N).
[0038] In one embodiment, the test automation system (108) includes a defect weighting system (120) configured to determine a corresponding severity to be assigned to each of the identified defects depending upon corresponding types of the identified defects based on the one or more designated rules. For example, a defect reported during the stability test may be the same as a defect that was reported more than a designated number of times by a plurality of users in the past when using a feature associated with the older version (112) of the firmware. In this example, the reported defect may be assigned a corresponding variable severity of ‘4’. In another example, if a defect reported during the stability test is associated with a feature that is newly added or that has been modified in the updated version (106), an exemplary corresponding variable severity assigned to the reported defect may be ‘3’. In yet another example, if the same defect is reported from at least two of the selected test cases executed during the stability test, an exemplary corresponding variable severity assigned to the reported defect may be ‘2’. In another instance, if the same defect is reported during the stability test repeatedly when executing a selected test case more than once, an exemplary corresponding variable severity assigned to the reported defect may be ‘1’.
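For illustration, a minimal sketch of such designated rules is shown below; the defect attributes, the rule ordering, and the designated report count are assumptions, while the exemplary severity values 4, 3, 2, and 1 follow the examples in the preceding paragraph.

# Sketch of exemplary designated rules for the defect weighting system (120).
# The attribute names and the designated report count are assumptions made
# for illustration; the severity values follow the text above.

def assign_variable_severity(defect):
    """Return an exemplary variable severity for an identified defect.

    `defect` is assumed to be a dict with attributes such as
    'times_reported_in_past', 'affects_new_or_modified_feature',
    'reported_by_multiple_test_cases', and 'repeated_in_same_test_case'.
    """
    DESIGNATED_REPORT_COUNT = 10  # designated number of past reports (assumed)

    if defect.get("times_reported_in_past", 0) > DESIGNATED_REPORT_COUNT:
        return 4  # same defect widely reported by users on the older version (112)
    if defect.get("affects_new_or_modified_feature", False):
        return 3  # defect in a feature newly added or modified in the updated version (106)
    if defect.get("reported_by_multiple_test_cases", False):
        return 2  # same defect reported from at least two selected test cases
    if defect.get("repeated_in_same_test_case", False):
        return 1  # same defect repeated across runs of one test case
    return 1      # default severity (assumed)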
[0039] In certain embodiments, a stability computing system (122) associated with the test automation system (108) computes an overall stability value of the updated version (106) of the firmware based on corresponding stability values associated with the selected test cases. In certain embodiments, the stability computing system (122) determines the corresponding stability values associated with the selected test cases based on a corresponding test case priority, a total number of the sample set of connected devices (110A-N) subjected to the stability test, and corresponding stability values of the sample set of connected devices (110A-N). The stability computing system (122) computes a corresponding stability value associated with each of the sample set of connected devices (110A-N) using an embodiment of the present method described in detail with reference to FIGS. 2A and 2B.
[0040] In one embodiment, one or more components of the test automation system (108) including the test controller (116), the test case repository (118), the defect weighting system (120), and the stability computing system (122) are integrated as part of the server (104). In such an embodiment, the server (104) directly communicates with the sample set of connected devices (110A-N) over the communication network (114) and executes the stability test to compute the stability value of the updated version (106) using the test automation system (108). The server (104) may be located locally or remotely to the sample set of connected devices (110A-N). The server (104) compares the computed stability value associated with the updated version (106) with the stability value associated with the older version (112) stored in a database (not shown in FIG. 1) associated with the server (104). Subsequently, the server (104) delivers the updated version (106) of the firmware to the one or more connected devices (102A-N) when the stability value of the updated version (106) exceeds the stability value of the older version (112).
[0041] In another embodiment, as shown in FIG. 1, the test automation system (108) may be implemented as a separate system that is independent of the server (104). In this embodiment, the test automation system (108) computes the stability value of the updated version (106) of the firmware and communicates the computed stability value to the server (104) via the communication network (114). The server (104) receives the computed stability value from the test automation system (108) and delivers the updated version (106) of the firmware to one or more of the connected devices (102A-N) upon determining that the computed stability value exceeds a designated threshold value. In one embodiment, the designated threshold value is a stability value associated with the older version (112) of the firmware.
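For illustration, a minimal sketch of the release decision described above is shown below; the callable used to push the update is a hypothetical placeholder, and the threshold is taken as the older version's stability value per the embodiment described above.

# Hedged sketch of the release decision. `deliver_update` is a hypothetical
# placeholder for the server (104) pushing the updated version (106) to the
# connected devices (102A-N).

def release_if_stable(updated_stability, older_stability, deliver_update):
    """Deliver the update only if the new stability value exceeds the threshold.

    The designated threshold here is the stability value of the older
    version (112), per one embodiment described above.
    """
    if updated_stability > older_stability:
        deliver_update()
        return True
    return False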
[0042] It may be noted that the foregoing examples, demonstrations, and process steps that may be performed by certain components, for example, by the test controller (116), the defect weighting system (120), and the stability computing system (122) may be implemented by suitable code on a processor-based system. Examples of such processor-based systems include, but are not limited to, a microprocessor, a general-purpose computer, a special-purpose computer, a field programmable gate array, a programmable logic array, and graphics processing units. A methodology associated with computing the stability value of the updated version (106) using the test automation system (108) for taking a decision on delivering the updated version (106) to the one or more connected devices (102A-N) is described in further detail with reference to FIGS. 2A and 2B.
[0043] FIGS. 2A and 2B are flow diagrams that depict an exemplary method (200) for delivering the firmware update to the one or more connected devices (102A-N) of FIG. 1. The order in which the exemplary method (200) is described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary method disclosed herein, or an equivalent alternative method. Additionally, certain blocks may be deleted from the exemplary method or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein.
[0044] Further, in FIGS. 2A and 2B, the exemplary method is illustrated as a collection of blocks in a logical flow chart, which represents operations that may be implemented in hardware, software, or combinations thereof. The various operations are depicted in the blocks to illustrate the functions that are performed in the exemplary method. In the context of software, the blocks represent computer instructions that, when executed by one or more processing subsystems, perform the recited operations.
[0045] Embodiments of the exemplary method depicted in FIGS. 2A and 2B may be practiced in a distributed computing environment where optimization functions are performed by remote processing devices that are linked through a wired and/or wireless communication network. In the distributed computing environment, the computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0046] In order to test the stability of the updated version (106) of the firmware, at step (202), the server (104) transmits the updated version (106) of the firmware to the sample set of connected devices (110A-N) having characteristics that are substantially similar to the one or more connected devices (102A-N). In one embodiment, the server (104) transmits the updated version (106) of the firmware to the sample set of connected devices (110A-N) via the communication network (114). At step (204), the test controller (116) selects one or more test cases from the test case repository (118) for determining the stability value of the updated version (106) of the firmware. In one embodiment, the test controller (116) selects the one or more test cases to be executed based on at least one of historical usage information of one or more features associated with the older version (112) of the firmware and one or more new features added in the updated version (106) of the firmware.
[0047] In certain embodiments, the historical usage information of the one or more features associated with the older version (112) includes historical information related to most frequently used features by a plurality of users in the past, and most widely used features in previously reported stability issues. Examples of the most frequently used features in the past include a recording feature that allows users to record their favorite programs based on preset time, a voice control feature that allows users to control set-top-boxes based on voice commands, and a text readability feature that allows users to adjust font sizes of subtitles associated with streaming multimedia content.
[0048] An example for the most widely used features in previously reported stability issues includes an ‘auto-play the next episode’ feature that enables set-top-boxes to identify an episode that is to be played next subsequent to a currently streaming episode from an on-demand media content server. The ‘auto-play the next episode’ feature also enables the set-top-boxes to automatically play the identified episode upon completion of streaming of the current episode. This is one such exemplary feature that may have been most widely used when the plurality of users reported stability issues in the past. The stability issues reported by the users may include inability of the set-top-boxes to identify and play appropriate next episodes.
[0049] In addition to the historical usage information of the features associated with the older version (112), the test controller (116) also selects the test cases to be executed based on one or more features associated with the updated version (106) of the firmware. The one or more features associated with the updated version (106) include features that are newly added in the updated version (106) and features that have been modified with respect to the older version (112).
[0050] An exemplary feature that may be newly added in the updated version (106) includes a ‘show-only subscribed channels’ feature that configures the set-top-boxes to enable televisions to display only subscribed TV channels to users, whereas blank screens associated with unsubscribed TV channels may not be displayed while the users change the TV channels. An exemplary feature that may be modified with respect to the older version (112) includes a ‘record priority indication’ feature that allows the set-top-boxes to decide at least one program to be recorded on a high-priority basis when a storage device plugged into the set-top-boxes has a limited memory to store all favorite programs.
[0051] In certain embodiments, the test controller (116) selects the test cases to be executed on the sample set of devices (110A-N) based on one or more features that are commonly available in both the older version (112) and the updated version (106) of the firmware. Examples of the commonly available features include a gaming feature, a parental control feature, a fast forward feature, a rewind feature, a slow motion feature, etc.
[0052] Upon selection of the test cases from the test case repository (118), at step (206), the test controller (116) determines a corresponding priority to be assigned to each of the selected test cases based on one or more designated rules. More specifically, the test controller (116) determines the corresponding priority to be assigned to each designated test case from the selected test cases based on one or more variable scores associated with one or more corresponding features included in the designated test case.
[0053] For example, FIG. 3 is a table view (300) that depicts a corresponding variable score (302) assigned to each of the features (304) to be tested during the stability test using the designated rules. Exemplary corresponding variable scores assigned to the features including the most frequently used features and the most widely used features identified in previously reported stability issues are ‘4’ and ‘3,’ respectively. Similarly, an exemplary corresponding variable score assigned to the features that are newly added and the features that have been modified in the updated version (106) is ‘2,’ and an exemplary variable score assigned to the basic features is ‘1.’ Based on the corresponding variable score (302) associated with each of the features (304), the test controller (116) determines the corresponding priority associated with each of the selected test cases, as explained in detail with reference to FIG. 4.
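For illustration, the exemplary variable scores (302) of FIG. 3 can be represented as a simple lookup, as sketched below; the category labels are assumptions introduced for readability, while the scores 4, 3, 2, and 1 follow the paragraph above.

# Sketch of the exemplary variable scores (302) of FIG. 3. The category labels
# are illustrative placeholders; the score values follow the text above.

EXEMPLARY_VARIABLE_SCORES = {
    "most_frequently_used": 4,           # most frequently used by users in the past
    "most_used_in_stability_issues": 3,  # most widely used in previously reported stability issues
    "new_or_modified": 2,                # newly added or modified in the updated version (106)
    "basic": 1,                          # basic features available in both versions
}

def variable_score(feature_category):
    """Return the exemplary variable score for a feature category."""
    return EXEMPLARY_VARIABLE_SCORES[feature_category]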
[0054] FIG. 4 is another table view (400) that depicts the corresponding priority (402) determined for an exemplary set of selected test cases (404A-E) based on variable scores associated with the features (304). For example, the test case 1 (404A) may be related only to testing the most frequently used features by the plurality of users in the past, and the most widely used features identified in previously reported stability issues. In this example, the test controller (116) determines a corresponding priority to be assigned to the test case 1 (404A), for example, in accordance with equation (1)
PTCi = (ΣVSTCi) / (Σ_(i=1)^n VSTCi)    (1)
where PTCi represents the corresponding priority associated with a selected test case i, ΣVSTCi represents the sum of the variable scores of the features included in the selected test case, and Σ_(i=1)^n VSTCi represents the sum of the variable scores of the features included in all of the selected test cases.
[0055] As depicted in FIG. 4, the exemplary features (304) that are included in the test case 1 (404A) include the features 1 and 2. A summation of the variable score (e.g., 4) associated with the feature 1 and the variable score (e.g., 3) associated with the feature 2 is 7. Further, features 1 and 3 are included in the test case 2 (404B), features 2 and 3 are included in the test case 3 (404C), features 2 and 4 are included in the test case 4 (404D), and features 1, 2, 3, and 4 are included in the test case 5 (404E). A summation of the variable scores associated with the exemplary features (304) that are included in all five test cases (404A-E) is 32. In this example, the test controller (116) determines the corresponding priority associated with the test case 1 (404A) as approximately 0.22 based on the sum of variable scores (i.e., 7) of features included in the test case 1 (404A) and the sum of variable scores (i.e., 32) of features included in all test cases, in accordance with the equation (1). Similarly, it is to be understood that the test controller (116) is configured to determine the corresponding priority (402) for each of the other selected test cases (404B-E) based on variable scores associated with one or more corresponding features (406) included in the selected test cases (404B-E).
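For illustration, a minimal sketch of the priority computation of equation (1) is shown below, using the exemplary feature scores and test case compositions of FIGS. 3 and 4; the dictionary names are illustrative placeholders.

# Sketch of equation (1): each test case's priority is the sum of the variable
# scores of its features divided by the sum of the variable scores over all
# selected test cases. Values follow the FIG. 3/FIG. 4 example.

FEATURE_SCORES = {"feature1": 4, "feature2": 3, "feature3": 2, "feature4": 1}

TEST_CASE_FEATURES = {
    "TC1": ["feature1", "feature2"],
    "TC2": ["feature1", "feature3"],
    "TC3": ["feature2", "feature3"],
    "TC4": ["feature2", "feature4"],
    "TC5": ["feature1", "feature2", "feature3", "feature4"],
}

def test_case_priorities(test_case_features, feature_scores):
    """Return the priority (402) of each test case per equation (1)."""
    per_case_sums = {
        name: sum(feature_scores[f] for f in features)
        for name, features in test_case_features.items()
    }
    total = sum(per_case_sums.values())  # 32 in the FIG. 4 example
    return {name: score / total for name, score in per_case_sums.items()}

print(test_case_priorities(TEST_CASE_FEATURES, FEATURE_SCORES)["TC1"])  # 7/32, approximately 0.22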
[0056] Referring back to the description of FIG. 2A, subsequent to determination of the corresponding priority (402) for each of the selected test cases (404A-E), at step (208), the test controller (116) subjects each of the sample set of connected devices (110A-N) to the stability test by repeatedly executing the selected test cases for a designated time. In certain embodiments, a designated number of the sample set of connected devices (110A-N) that are to be subjected to the stability test is selected manually. Further, the designated time for which the selected test cases are to be executed repeatedly on the sample set of connected devices (110A-N) is provided as an input to the test controller (116) using a graphical user interface associated with the test automation system (108).
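For illustration, one possible execution loop for step (208) is sketched below; the `run_test_case` callable and the result record layout are hypothetical placeholders, and only the repeated execution for the designated time follows the description above.

import time

# Hedged sketch of step (208): the selected test cases are executed repeatedly
# on each device of the sample set for the designated time. `run_test_case`
# is a hypothetical callable returning a pass/fail result and any defect info.

def run_stability_test(devices, test_cases, run_test_case, designated_hours):
    """Repeatedly execute every test case on every sample device for the designated time."""
    results = []
    deadline = time.time() + designated_hours * 3600
    while time.time() < deadline:
        for device in devices:
            for test_case in test_cases:
                outcome = run_test_case(device, test_case)
                # Record the elapsed time so the defect reporting time relative
                # to the start of the stability test can be derived later.
                results.append({
                    "device": device,
                    "test_case": test_case,
                    "elapsed_hours": designated_hours - (deadline - time.time()) / 3600,
                    "outcome": outcome,
                })
    return results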
[0057] At step (210), one or more defects that occur upon executing the selected test cases repeatedly on the sample set of connected devices (110A-N) for the designated time are determined using the defect analyzing system (119), as described previously with reference to FIG. 1. Further, at step (210), a corresponding time between the start of the stability test and corresponding occurrences of each of the identified defects is identified using the defect analyzing system (119).
[0058] At step (212), the identified defects and the corresponding time between the start of the stability test and corresponding occurrences of each of the identified defects are communicated from the defect analyzing system (119) to the defect weighting system (120) of the test automation system (108) via the communication network (114). At step (214), a corresponding variable severity to be assigned for each of the identified defects is determined depending upon types of the identified defects based on the one or more designated rules, as explained previously with reference to FIG. 1.
[0059] At step (216), a corresponding stability value associated with each of the sample set of connected devices (110A-N) is determined upon executing each of the selected test cases on the sample set of connected devices (110A-N).
[0060] FIG. 5 is a table view (500) that depicts an exemplary summary of test results associated with execution of the exemplary test cases (404A-E) of FIG. 4 on a sample set of four set-top-boxes (502A-D) for an exemplary designated time of 24 hours. The test case 1 (404A) is executed on the sample set of set-top-boxes (502A-D) repeatedly for 24 hours, during which the test case 1 (404A) is identified to have been successfully executed on three set-top-boxes (502A-C). However, the test case 1 (404A) is identified to have failed on one (502D) of the four set-top-boxes, the set-top-box (502D) having rebooted automatically once at the 10th hour from the start of the stability test. In this scenario, an exemplary corresponding variable severity assigned to the identified automatic rebooting defect using the defect weighting system (120) is ‘5’.
[0061] In addition, when executing the test case 2 (404B), the test case 2 (404B) is identified to have been successfully executed on only two set-top-boxes (502B and 502D) and is further identified to have failed on the other two set-top-boxes (502A and 502C). User interface (UI) freezes are determined to have occurred twice when executing the test case 2 (404B) on the other two set-top-boxes (502A and 502C), at the 10th and 12th hours, respectively, from the start of the stability test. An exemplary corresponding variable severity assigned to the identified UI freeze defect using the defect weighting system (120) is ‘4’. Further, the other test cases (404C, 404D, and 404E) are determined to have been successfully executed on all of the set-top-boxes (502A-D) that are being subjected to the stability test. For the exemplary test results depicted in FIG. 5, the stability computing system (122) determines the corresponding stability value associated with each of the sample set of set-top-boxes (502A-D) upon executing each of the selected test cases (404A-E), as described in detail with reference to FIG. 6.
[0062] FIG. 6 is a table view (600) that depicts the corresponding stability value (602) determined for each of the four sample set-top-boxes (502A-D) of FIG. 5 upon executing each of the exemplary selected test cases (404A-E) on the four sample set-top-boxes (502A-D), and a corresponding stability value (604) determined for each of the exemplary selected test cases (404A-E). In certain embodiments, the stability computing system (122) is configured to determine the corresponding stability value (602) associated with a designated set-top-box selected from the set-top-boxes (502A-D) as 1 when a selected test case is executed successfully on the designated set-top-box. For example, the stability computing system (122) is configured to determine the corresponding stability value (602) associated with the set-top-box (502A) as 1 when the selected test case (404A) is executed successfully on the set-top-box (502A). Similarly, the stability computing system (122) determines the corresponding stability value (602) associated with the set-top-boxes (502B and 502C) also as 1 because the selected test case (404A) is also executed successfully on the set-top-boxes (502B and 502C).
[0063] In one embodiment, the stability computing system (122) is configured to determine the corresponding stability value (602) associated with a designated set-top-box selected from the set-top-boxes (502A-D), in accordance with an equation (2) when a selected test case has failed on the designated set-top-box.
Stability Value = (PTCi * DRT) / (DS * TD) (2)
where PTCi represents the corresponding priority associated with the selected test case that has failed, DRT corresponds to a defect reporting time that is a time between the start of the stability test and a corresponding occurrence of the identified defect when executing the selected test case, DS represents a corresponding defect severity associated with the identified defect, and TD corresponds to a total duration of the stability test.
[0064] For example, when the test case 1 (404A) is identified to have failed on the set-top-box (502D), the set-top-box (502D) having rebooted automatically once at the 10th hour from the start of the stability test, the corresponding variable severity of the identified defect is assigned a value of 5. In this example, the stability computing system (122) determines the corresponding stability value (602) associated with the set-top-box (502D) as approximately 0.08, in accordance with the equation (2). Exemplary values used in the equation (2) for determining the corresponding stability value (602) associated with the set-top-box (502D) include the corresponding priority of 0.22 associated with the selected test case (404A), the defect reporting time being the 10th hour from the start of the stability test, the defect severity being 5, and the total duration being 24 hours. It is to be understood that the stability computing system (122) is configured to similarly determine the corresponding stability value (602) associated with each of the sample set of set-top-boxes (502A-D) upon executing each of the other selected test cases (404B-E) on the sample set of set-top-boxes (502A-D).
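A minimal Python sketch of this per-device computation is given below. It returns 1 when the selected test case executes successfully and otherwise applies equation (2) exactly as written above; the function and parameter names are illustrative only.

```python
def device_stability_value(passed, p_tci=None, drt=None, ds=None, td=None):
    """Stability value (602) of one connected device for one selected test case.

    passed : True if the selected test case executed successfully on the device.
    p_tci  : priority (PTCi) of the selected test case that failed.
    drt    : defect reporting time (DRT), in hours from the start of the test.
    ds     : variable severity (DS) assigned to the identified defect.
    td     : total duration (TD) of the stability test, in hours.
    """
    if passed:
        return 1.0
    # Equation (2): Stability Value = (PTCi * DRT) / (DS * TD)
    return (p_tci * drt) / (ds * td)
```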
[0065] Referring back to FIG. 2B, at step (218), a corresponding stability value associated with each of the selected test cases is determined. In certain embodiments, the stability computing system (122) determines the corresponding stability value associated with each of the selected test cases, in accordance with an equation (3).
SVTCi = [(Σ(i=1 to n) SVSTBi) * PTCi] / TSCD (3)
where SVTCi represents a corresponding stability value associated with a selected test case, Σ(i=1 to n) SVSTBi corresponds to a sum of the corresponding stability values of the sample set of connected devices (110A-N) determined upon executing the selected test case on the sample set of connected devices (110A-N), PTCi represents the corresponding priority associated with the selected test case, and TSCD corresponds to a total number of the sample set of connected devices (110A-N) that are being subjected to the stability test.
[0066] For example, as depicted in FIG. 6, the stability computing system (122) determines the corresponding stability value (604) associated with the test case 1 (404A) as 0.17 based on the sum of the stability values associated with the set-top-boxes (502A-D) when the test case 1 (404A) is executed on the set-top-boxes (502A-D). Further, the corresponding stability value (604) associated with the test case 1 (404A) is also determined based on the corresponding priority (e.g., 0.22) associated with the test case 1 (404A) and the total number (i.e., 4) of set-top-boxes (502A-D) that are being subjected to the stability test, using the equation (3). Similarly, it is to be understood that the stability computing system (122) determines the corresponding stability value (604) associated with each of the other test cases (404B-E) based on the corresponding stability values (602) of the set-top-boxes (502A-D), the corresponding priority, and the total number of set-top-boxes (502A-D) subjected to the stability test, as depicted in FIG. 6.
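A short Python sketch of the computation of equation (3) follows, using the values of the worked example above; the structure of the inputs is illustrative.

```python
def test_case_stability_value(device_stability_values, p_tci):
    """Stability value (604) of one selected test case, per equation (3).

    device_stability_values : stability values (602) of the sample devices
                              obtained when this test case was executed.
    p_tci                   : priority assigned to this test case.
    """
    tscd = len(device_stability_values)   # total devices under test (TSCD)
    return sum(device_stability_values) * p_tci / tscd

# Test case 1 (404A) on the four set-top-boxes (502A-D): three passes and one
# failure valued at approximately 0.08, with a priority of approximately 0.22.
sv_tc1 = test_case_stability_value([1, 1, 1, 0.08], p_tci=0.22)
print(round(sv_tc1, 2))  # approximately 0.17
```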
[0067] Referring back to FIG. 2B, at step (220), the stability value of the updated version (106) of the firmware is determined based on the corresponding stability value associated with each of the selected test cases. More specifically, in one embodiment, the stability computing system (122) determines the stability value of the updated version (106) of the firmware by summing up the corresponding stability values associated with each of the selected test cases. For instance, for the selected test cases (404A-E) that are executed on the sample set of set-top-boxes (502A-D), the stability computing system (122) determines the stability value as 0.87 by summing up the corresponding stability value (604) associated with each of the selected test cases (404A-E). In addition, in an embodiment, the stability computing system (122) further converts the determined stability value of the updated version (106) of the firmware into a percentage value. For example, the stability computing system (122) converts the determined exemplary stability value of 0.87 associated with the updated version (106) of the firmware into a percentage value of 87 percent.
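A minimal sketch of this final aggregation, assuming the per-test-case stability values (604) have already been computed, is shown below. Only the value 0.17 of the test case 1 (404A) is taken from the example above; the remaining four values are illustrative placeholders chosen merely to sum to the exemplary 0.87.

```python
def updated_version_stability(test_case_stability_values):
    """Stability value of the updated version (106): the sum of the
    per-test-case stability values (604), also returned as a percentage."""
    total = sum(test_case_stability_values)
    return total, total * 100.0

# 0.17 is the stability value of test case 1 from the example; the other
# four values are assumed placeholders.
value, percent = updated_version_stability([0.17, 0.10, 0.16, 0.13, 0.31])
print(round(value, 2), round(percent))  # 0.87 and 87 percent
```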
[0068] Upon determination of the stability value associated with the updated version (106) of the firmware, at step (222), the determined stability value is communicated to the server (104) of FIG. 1. At step (224), the determined stability value associated with the updated version (106) is compared with a designated threshold stability value stored in a database associated with the server (104) to identify if the determined stability value exceeds the designated threshold stability value. In certain embodiments, the designated threshold stability value corresponds to a stability value associated with the older version (112) of the firmware. At step (226), the updated version (106) of the firmware is automatically delivered to at least one device selected from the plurality of connected devices (102A-N) upon identifying that the determined stability value exceeds the designated threshold stability value.
[0069] Alternatively, if the determined stability value is less than the designated threshold stability value, the server (104) does not deliver the updated version (106) of the firmware to the plurality of connected devices (102A-N). Further, the server (104) sends a request to the test automation system (108) via the communication network (114) to receive the test execution report generated by the test controller (116). As previously mentioned, the test execution report may include one or more corresponding test cases that passed, one or more corresponding test cases that failed, and one or more corresponding defects that occurred during execution of the stability test. The server (104) then receives the test execution report from the test automation system (108) and communicates, to a user device associated with a selected user, an alert message regarding stability issues associated with the updated version (106) along with the received test execution report.
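The delivery decision of steps (224) and (226), together with this alternative alerting path, could be sketched as follows; the callable interfaces (deliver, send_alert, get_test_execution_report) are assumptions introduced only for illustration and are not part of the described system.

```python
def deliver_or_alert(updated_stability, threshold_stability,
                     deliver, send_alert, get_test_execution_report):
    """Sketch of steps (224)-(226): deliver the updated version (106) only if
    its stability value exceeds the designated threshold (in certain
    embodiments, the stability value of the older version (112)); otherwise
    alert a selected user and attach the test execution report."""
    if updated_stability > threshold_stability:
        deliver()   # step (226): automatically deliver the updated version
    else:
        report = get_test_execution_report()   # passed/failed cases, defects
        send_alert("Stability issues detected in the updated version", report)
```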
[0070] The delivery system (100), described throughout the various embodiments herein, enables numerically determining the stability value associated with various types of products, including software products, firmware products, and hardware products. An objective measurement of the stability value of a product provides greater confidence in the release decision for that product. The delivery system (100) enables the server (104) to make a more informed decision on whether or not to release the product update, thereby avoiding any loss of functionality or availability of the existing version while ensuring that the new features function as desired.
[0071] Further, the stability computing system (122) of the delivery system (100) differs from conventional software quality monitoring systems that monitor the quality of software during a software development life cycle by subjecting the software to a manual testing procedure or an automated testing procedure. The stability computing system (122) enables identification of the capacity of a product to function properly for a longer duration by first executing the test cases, selected based on the updated version (106) and the historical usage information of the older version (112), on the sample set of connected devices (110A-N) before releasing the updated version (106) to the end user devices (102A-N). Thus, the stability computing system (122) ensures proper functioning of the updated version (106) of the product for the longer duration after the updated version (106) is delivered to the end user devices.
[0072] Further, the delivery system (100) enables a comparison between an objectively measured stability value of the older version (112) and the stability value of the updated version (106), and assists in identifying the stability of the product across different versions. Further, the test controller (116) selects test cases associated with only selected features of the older version (112) and new features in the updated version (106) for executing the stability test. Therefore, execution of the stability test does not require a long time, thereby allowing organizations to quickly perform the stability test and reliably make a release decision.
[0073] Although specific features of various embodiments of the present systems and methods may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments shown in the different figures.
[0074] While only certain features of the present systems and methods have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the claimed invention.