Abstract: Exemplary embodiments of the present disclosure are directed towards an automation method for testing a performance of an outage management system (OMS). The method includes initiating an automation process comprising a step of editing one or more file paths for identifying a network address of the OMS under test and an image name of the OMS under test, whereby the OMS under test comprises a plurality of automation scripts for performing the automation, and generating a plurality of test reports in response to testing the performance of the OMS under test, whereby the plurality of test reports comprises a plurality of configuration parameters to be updated in the OMS under test.
This is a complete specification for the provisional application 1637/CHE/2014 filed on 27-May-2014.
TECHNICAL FIELD
[1] The present disclosure generally relates to the field of performance measurement of outage management systems (OMS). More particularly, the present disclosure relates to a system and method for automating performance tests of outage management systems.
BACKGROUND
[2] Large storms often cause multiple power outages in various portions of the transmission and distribution system. In response, electric utilities typically send maintenance crews into the field to perform the repairs. If the storm is large enough, maintenance crews are often borrowed from neighbouring electric utilities and from external contracting agencies. Dispatching the crews in an efficient manner, therefore, is important for the quick and efficient restoration of power.
[3] Conventional techniques for maintenance crew dispatch include dispatching the crews straight from a central operation centre. Once the storm hits, the electric utility then determines where to send the crews based on telephone calls from consumers. Outage management systems (OMSs) log customer calls and dispatch crews to the site of the disturbance based on those calls. Some of the key goals of outage management are to reduce the time it takes to identify a power outage, reduce the time it takes to restore power after an outage, reduce the number of truck rolls needed to fix an outage, and fix an outage before customers notice and call the utility.
[4] A commercial off-the-shelf (COTS) generic Outage Management System (OMS) is a solution that provides dispatchers and system operators with exceptional workflow capability and a user-friendly interface. Such systems and their associated applications are configurable and can support outage management processes and procedures. With this support, operational efficiencies and communications are improved, thereby helping to reduce dispatching bottlenecks that can occur during high-volume outage conditions. When implementing such an OMS product suite, an electric power utility enterprise can distribute emergency, planned and follow-up work in a manner that streamlines the dispatching of work orders. These OMS systems generally reduce the time necessary to manage and restore customers during large storms by several hours, resulting in crew and vehicle savings of several million dollars.
[5] The OMS has to be checked for its performance so that it is ready to face a storm situation and operate without failure in that situation.
[6] Electric power utility companies currently have a manual process of performing performance tests on generic, commercially available off-the-shelf (COTS) electric power outage management systems (OMS), such as the “GE PowerOn OMS” system. Electric power utility companies have identified dispatcher user actions (and classified them into modelling and non-modelling actions) for which a performance index is to be calculated by recording the average time taken for each of the actions under a given load scenario. As part of a performance test, prior art systems and solutions operated by electric power utility companies also monitor some of the interfaces to the COTS OMS, monitor the performance of the database and application server, and analyse the effect of load on each of the servers.
[7] For the performance test of an OMS in a storm condition, a number of call requests are simulated, and the users who manage the OMS start OMS sessions and, as per the simulated calls, perform the actions that they would perform in a real storm situation. The times taken by the user actions and by the OMS are noted, and the total time is finally used to determine the performance.
[8] This method involves a lot of manpower and uses a lot of computers to perform the tests. There is a need for a system and method that automates most of the actions performed during the test, thereby reducing the use of manpower and improving the accuracy of the tests.
BRIEF SUMMARY
[9] The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
[10] Exemplary embodiments of the present disclosure are directed towards a system and method for automating performance tests of outage management systems.
[11] According to an exemplary aspect, the method includes initiating an automation process comprising a step of editing one or more file paths for identifying a network address of the OMS under test and an image name of the OMS under test, whereby the OMS under test comprises a plurality of automation scripts for performing the automation.
[12] According to an exemplary aspect, the method further includes generating a plurality of test reports in response to testing the performance of the OMS under test, whereby the plurality of test reports comprises a plurality of configuration parameters to be updated in the OMS under test.
BRIEF DESCRIPTION OF DRAWINGS
[13] Other objects and advantages of the present invention will become apparent to those skilled in the art upon reading the following detailed description of the preferred embodiments, in conjunction with the accompanying drawings, wherein like reference numerals have been used to designate like elements, and wherein:
[14] FIG. 1 is a diagram depicting a system for automating the performance test of outage management system (OMS), according to an exemplary embodiment of the present disclosure.
[15] FIG. 2 is a diagram depicting data transfer in a user session between framework managing unit and load simulator, according to an exemplary embodiment of the present disclosure.
[16] FIG. 3 is a flow diagram depicting a method for automating performance tests of outage management systems, according to an exemplary embodiment of the present disclosure.
[17] FIG. 4 is a flow diagram depicting a method for calculating performance index of outage management systems, according to an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION
[18] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
[19] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of terms “first”, “second”, and “third”, and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
[20] According to a non-limiting exemplary embodiment of the present disclosure, a system and method for automating performance tests of outage management systems are disclosed.
[21] Referring to FIG. 1, a diagram 100 depicts a system for automating the performance test of an outage management system (OMS), according to an exemplary embodiment of the present disclosure. The system comprises a framework managing unit 102, load simulators 104a, 104b and 104c, Unified Functional Testing (UFT) clients 106a, 106b and 106c, and a server 108. The framework managing unit 102 is connected to the load simulators 104a, 104b and 104c, the UFT clients 106a, 106b and 106c, and the server 108. The framework managing unit 102 controls and monitors the outage management system (OMS). The framework managing unit 102 further includes tables, arranged in rows and columns, to store framework configuration and data from the load simulators 104a, 104b and 104c and the UFT clients 106a, 106b and 106c.
[22] By way of example, an implementation of the system and method for the GE PowerOn OMS system is described below. However, the present system is not limited to a particular system and method for automating the performance test of an OMS; it can be implemented for any OMS.
[23] The server 108 initiates the automation process by allowing the user to open the framework managing unit 102 graphical user interface (GUI) located on the server 108 and click on the start button for starting the automation process.
Executable files for the framework managing unit GUI
Consider the following exemplary batch files:
start_po_session.bat
stop_po_session.bat
[24] The above-mentioned exemplary batch files are edited by the user on the server 108 with the PowerOn server 110 IP address and the image name of the respective automation PowerOn image built with the testing scripts for performing the automation process. The testing scripts may include, but are not limited to, Magik scripts, UFT scripts, and PL/SQL scripts.
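The present disclosure does not set out the contents of these batch files. The following is a minimal sketch, in Python, of how the server address and image name could be edited into them programmatically; the placeholder tokens, the template contents, the example IP address and the example image name are hypothetical and not part of the disclosure.

from pathlib import Path

def configure_batch_file(path: str, server_ip: str, image_name: str) -> None:
    # Replace hypothetical placeholder tokens in a session batch file with
    # the PowerOn server address and the automation image name.
    text = Path(path).read_text()
    text = text.replace("{POWERON_SERVER_IP}", server_ip)
    text = text.replace("{POWERON_IMAGE}", image_name)
    Path(path).write_text(text)

if __name__ == "__main__":
    # Demonstration with a hypothetical template; in practice the real
    # start_po_session.bat / stop_po_session.bat on the server 108 are edited.
    Path("start_po_session.bat").write_text(
        "rem hypothetical template\n"
        "set SERVER={POWERON_SERVER_IP}\n"
        "set IMAGE={POWERON_IMAGE}\n")
    configure_batch_file("start_po_session.bat", "192.0.2.10", "poweron_automation_image")
    print(Path("start_po_session.bat").read_text())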
[25] The framework managing unit 102 is provided with a switch to manage the automation framework by allowing the user to start or stop the testing process. The framework managing unit 102 stores different configurations for the framework, such as the number of iterations for each action, the actions to be performed on each session, the sequence of the actions, the sequence of devices, PowerOn application image details, and the like.
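A minimal sketch, in Python, of the kind of framework configuration described above is given below; the field names, actions, devices and values are hypothetical and shown only to illustrate the listed categories.

# Hypothetical framework configuration held by the framework managing unit 102.
FRAMEWORK_CONFIG = {
    "iterations_per_action": 10,                    # number of iterations for each action
    "actions_per_session": [                        # actions to be performed on each session,
        "create_incident",                          # listed in the sequence they are executed
        "assign_crew",
        "close_incident",
    ],
    "device_sequence": ["switch_101", "fuse_202"],  # sequence of devices acted upon
    "poweron_image": "poweron_automation_image",    # PowerOn application image details
}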
[26] The framework managing unit 102 initiates PowerOn user sessions on the load simulators 104a, 104b and 104c and directly triggers the UFT clients 106a, 106b and 106c with testing scripts. The framework managing unit 102 assigns the next user action, and the device on which the action is to be performed, to each active PowerOn user session on the load simulators 104a, 104b and 104c and the UFT clients 106a, 106b and 106c. The framework managing unit 102 assigns test data to each active PowerOn user session, records the average time, the minimum time, the maximum time, and the number of actions completed within 5 seconds, between 5 and 10 seconds, and the like, taken by each of the user actions on each of the active user sessions, and calculates the performance index for testing.
[27] The performance index is calculated by simulating real-time requests from clients. The requests are triggered for the OMS, the OMS sessions are automatically triggered, and the user actions to tackle the requests are performed in the OMS. The performance indexes are calculated by recording the average time, the minimum time, the maximum time, and the number of actions completed within 5 seconds, between 5 and 10 seconds, and the like, taken for the actions performed by the user. The calculated time is stored in a database, and the time is the measure of the performance of the OMS.
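As a minimal sketch of this aggregation, the following Python function summarises the durations recorded for one user action into the average, minimum, maximum and the time buckets mentioned above; the bucket boundaries and the example durations are illustrative, and no exact index formula is prescribed by this disclosure.

from statistics import mean

def summarise_action(durations):
    # Aggregate the timings (in seconds) recorded for one user action
    # across the active user sessions.
    return {
        "average": mean(durations),
        "minimum": min(durations),
        "maximum": max(durations),
        "within_5s": sum(1 for d in durations if d <= 5),
        "5_to_10s": sum(1 for d in durations if 5 < d < 10),
        "over_10s": sum(1 for d in durations if d >= 10),
    }

# Example timings for one action across several sessions (illustrative only).
print(summarise_action([3.2, 4.8, 6.1, 9.7, 12.4]))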
[28] The automation process is terminated and the test reports are generated. The test reports are generated on the server 108 using the framework managing unit 102 GUI. The users can provide the list of parameters to be included in the automation process, can provide credentials for the PowerOn user session to be opened on the client’s device, and can update the configuration parameters in their respective tables on the server 108.
[29] Referring to FIG. 2, a diagram 200 depicts data transfer in a user session between the framework managing unit and a load simulator, according to an exemplary embodiment of the present disclosure. The framework managing unit 102 initiates the user sessions on client devices by means of a session monitor process (SMP) 202 and a user action simulator (UAS) 204. The session monitor process (SMP) 202 and the user action simulator (UAS) 204 together make up a user session. The SMP 202 periodically transmits a message to the framework managing unit 102. The message may be a keep-alive, a periodic message, a heartbeat message, and the like, without limiting the scope of the disclosure. The message is hereinafter referred to as a heartbeat message. The heartbeat message enables the framework managing unit 102 to detect the termination of a user session, and the framework managing unit 102 transmits a message to the SMP 202 to restart a new user session on the client.
[30] The SMP 202 periodically updates the time table with the message time. The SMP 202 accepts a message from the framework managing unit 102 to start the UAS 204 process on the client if the UAS 204 has terminated. The UAS 204 executes the user actions and updates the execution process data to the framework managing unit 102.
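A minimal sketch, in Python, of the heartbeat check on the framework managing unit side follows; it assumes the last heartbeat time of each session is kept in a table keyed by session identifier, and the timeout value and the restart_session callback are hypothetical.

import time

HEARTBEAT_TIMEOUT_S = 30  # assumed threshold; not specified in the disclosure

def check_sessions(last_heartbeat, restart_session):
    # Ask the SMP to restart any user session whose heartbeat has gone stale,
    # mirroring the restart message sent by the framework managing unit 102.
    now = time.time()
    for session_id, last_seen in last_heartbeat.items():
        if now - last_seen > HEARTBEAT_TIMEOUT_S:
            restart_session(session_id)
            last_heartbeat[session_id] = now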
[31] Referring to FIG. 3, a flow diagram 300 depicts a method for automating performance tests of outage management systems, according to an exemplary embodiment of the present disclosure. The method starts at step 302 by opening the performance automation tool. The method continues to the next step 304 by initiating the automation process, in which the start button is selected to trigger the start_po_session.bat file and start a UAS session. The method continues to the next step 306 by assigning the test data to the UAS by the framework managing unit. The method continues to the next step 308 by triggering the batch files with credentials. The method continues to the next step 310 by opening user session(s) with a window in each client. The method continues to the next step 312, in which the UAS reads the next-order user action from the table and updates the current order with the next order id. The method continues to the next step 314 by checking the order: if it is valid, the method continues to the next step; if it is not valid, the method returns to step 312. At step 316, the UAS executes the user actions and updates the timestamps in the table. The method further continues to the next step 318 by terminating the process, in which the stop button is selected to trigger the stop_po_session.bat file or to stop all UFT and Magik sessions automatically.
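The loop of steps 312 to 316 can be sketched in Python as follows; the helper callables next_order, is_valid, execute, record_times and stop_requested are hypothetical and stand in for the table reads, order validation, action execution and timestamp updates described above.

def run_uas_session(next_order, is_valid, execute, record_times, stop_requested):
    # Read action orders in sequence, execute the valid ones and record
    # their start/end timestamps, until a stop is requested (steps 312-318).
    while not stop_requested():
        order = next_order()                 # step 312: read next order, update current order id
        if not is_valid(order):              # step 314: invalid orders send the loop back to 312
            continue
        started, finished = execute(order)   # step 316: the UAS performs the user action
        record_times(order, started, finished)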
[32] Referring to FIG. 4, a flow diagram 400 depicts a method for calculating the performance index of outage management systems, according to an exemplary embodiment of the present disclosure. The method starts at step 402 by simulating requests for the outage management system (OMS) from a client. The method continues to the next step 404 by triggering the requests for the OMS. The method continues to the next step 406 by automatically triggering the OMS sessions and the user actions to tackle the requests performed in the OMS. The method continues to the next step 408 by calculating the performance indexes by recording the average time, the minimum time, the maximum time, and the number of actions completed within 5 seconds, between 5 and 10 seconds, and the like, taken for each of the actions performed by the user. The method continues to the next step 410 by storing the calculated time in a database, where the time is the measure of the performance of the OMS.
[33] The exemplary embodiments described herein in detail for illustrative purposes are subject to many variations in structure and design. It should be emphasized, however, that the present invention is not limited to a particular system and method of automating the performance test of an OMS, as shown and described.
[34] Rather, the principles of the present invention can be used with a variety of system and method configurations and structural arrangements. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but the present invention is intended to cover the application or implementation without departing from the spirit of the disclosure.
[35] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description.
CLAIMS
1. An automation method for testing a performance of an outage management system (OMS), the method comprising:
initiating an automation process comprising a step of editing one or more file paths for identifying a network address of the OMS under test and an image name of the OMS under test, whereby the OMS under test comprises a plurality of automation scripts for performing the automation; and
generating a plurality of test reports in response to testing the performance of the OMS under test, whereby the plurality of test reports comprises a plurality of configuration parameters to be updated in the OMS under test.
2. The method of claim 1 further comprising a step of initiating the automation process from a server.
3. The method of claim 1 further comprising calculating a performance index, comprising:
simulating a plurality of requests for the outage management system (OMS) from a client;
triggering the plurality of requests for OMS;
automatically triggering and stopping the OMS sessions and a plurality of user actions to tackle the plurality of requests performed in the OMS;
calculating the performance indexes by recording the average time taken for each of the actions performed by the user; and
storing the calculated time in a database, wherein the time is the measure of the performance of the OMS.
4. A system comprising:
a framework managing unit for controlling and monitoring the outage management system (OMS) under test, whereby the framework managing unit initiates a plurality of user sessions;
a plurality of load simulators comprising the plurality of user sessions running a testing script for performing a list of user actions according to a test data;
a Session Monitor Process (SMP) and a User Action Simulator (UAS) to initiate the plurality of user sessions on a client device, whereby a periodic message is sent from the SMP to the framework managing unit to indicate the termination of a user session, whereby the framework managing unit transmits a message in return to the SMP to restart a new user session;
a plurality of UFT clients initiating user sessions on a UFT client device running testing scripts for performing a list of user actions according to the test data;
a server for initiating the automation process comprising a step of editing one or more file paths for identifying a network address of the OMS under test and an image name of the OMS under test, whereby the OMS under test comprises a plurality of automation scripts for performing the automation; and
a plurality of test reports generated in response to testing the performance of the OMS under test, whereby the plurality of test reports comprises a plurality of configuration parameters to be updated in the server.
5. The system of claim 4, wherein the framework managing unit provides a switch to manage the automation framework, allowing the user to start and stop the testing process.
6. The system of claim 4, wherein the framework managing unit further comprises a plurality of tables to store framework configuration and data from the plurality of load simulators and the plurality of UFT clients.
7. The system of claim 4, wherein the framework managing unit assigns a user action and a client device, prior to the action to be performed, for each of the active user sessions.
8. The system of claim 4, wherein the framework managing unit assigns the test data to each of the active user sessions.
9. The system of claim 4, wherein the framework managing unit records the average time taken by each of the actions on each of the active user sessions and calculates the performance index for testing.