Abstract: A method and a system are described for determining an effectiveness index of a software test environment. The method includes receiving a plurality of factors associated with a plurality of software modules of a plurality of software under test from a plurality of external systems. The method includes identifying the software test environment based on one or more values associated with each of the plurality of factors. The method includes collecting failure logs of each of the plurality of factors for the identified software test environment. The method includes assigning a score to each of the plurality of factors based on collected failure logs. The method includes determining an effectiveness index of the identified software test environment for each of the plurality of software modules based on the score. Fig.3
Claims:
WE CLAIM
1. A method for determining an effectiveness index of a software test environment, the method comprising:
receiving, by a software testing system, a plurality of factors associated with a plurality of software modules of a plurality of software under test from a plurality of external systems;
identifying, by the software testing system, the software test environment based on one or more values associated with each of the plurality of factors;
collecting, by the software testing system, failure logs of each of the plurality of factors for the identified software test environment;
assigning, by the software testing system, a score to each of the plurality of factors based on collected failure logs; and
determining, by the software testing system, an effectiveness index of the identified software test environment for each of the plurality of software modules based on the score.
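The five claimed steps can be sketched in code. This is a minimal illustrative sketch only, not the claimed implementation: the `Factor` class, the inverse-failure-count scoring rule in `assign_score`, and the averaging in `effectiveness_index` are all assumptions introduced here.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the claimed method; every name and scoring
# rule below is an assumption, not part of the claims.

@dataclass
class Factor:
    name: str                 # e.g. "complexity", "external connectivity"
    value: float              # value used to identify the test environment
    failure_logs: List[str] = field(default_factory=list)

def assign_score(factor: Factor) -> float:
    # Assumed rule: fewer collected failure logs -> higher score.
    return 1.0 / (1.0 + len(factor.failure_logs))

def effectiveness_index(factors: List[Factor]) -> float:
    # Assumed rule: the effectiveness index of the identified test
    # environment for a software module is the mean of its per-factor scores.
    scores = [assign_score(f) for f in factors]
    return sum(scores) / len(scores)
```

Under these assumptions, a factor with no logged failures scores 1.0 and a factor with one logged failure scores 0.5, so a module with those two factors gets an effectiveness index of 0.75.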
2. The method of claim 1, wherein the plurality of factors associated with the software modules comprises previous software modules deployment results, flags, table structure, complexity, external connectivity, critical functionality, and base functionalities.
3. The method of claim 1, further comprising collating the plurality of factors into a data structure, wherein the plurality of factors corresponds to software modules relation information, and wherein the plurality of factors affects stability of the software test environment.
4. The method of claim 1, further comprising classifying the plurality of software modules into the plurality of software under test based on input received from a user, wherein each of the plurality of software modules is associated with a unique software under test from the plurality of software under test.
5. The method of claim 1, further comprising creating a repository of the plurality of software modules, wherein the repository comprises software module information of each of the plurality of software modules.
6. The method of claim 1, further comprising aggregating the one or more values associated with each of the plurality of factors to compute a software module stability score for each of the plurality of software modules, wherein the software module stability score is indicative of stability of the identified test environment for each of the plurality of software modules.
7. The method of claim 6, further comprising computing a software under test stability score based on the software module stability score of each of the plurality of software modules.
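The two-level aggregation of claims 6 and 7 can be sketched as follows. The claims do not specify the aggregation function; using the arithmetic mean at both levels is an assumption made here for illustration.

```python
# Sketch of claims 6 and 7; the arithmetic mean is an assumed
# aggregation, not something the claims specify.

def module_stability_score(factor_values):
    """Claim 6: aggregate a module's factor values into one stability score."""
    return sum(factor_values) / len(factor_values)

def sut_stability_score(module_scores):
    """Claim 7: stability of a software under test from its modules' scores."""
    return sum(module_scores) / len(module_scores)
```

With the mean as the aggregation, two factor values of 1.0 and 0.5 yield a module stability score of 0.75, and two module scores of 0.75 and 0.25 yield a software-under-test stability score of 0.5.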
8. The method of claim 1, wherein the plurality of external systems comprises a point of sale terminal device, credit card external systems, printers, scanners, and handheld devices.
9. The method of claim 1, further comprising updating the one or more values associated with each of the plurality of factors to ensure continuous availability of the identified test environment.
10. The method of claim 1, further comprising determining a degree of planning of each of the plurality of software under test based on input received from a user, and software module stability score for each of the plurality of software modules associated with the plurality of software under test.
11. The method of claim 10, further comprising providing one or more recommendations to optimize the degree of planning of each of the plurality of software under test based on the software module stability score for each of the plurality of software modules associated with the plurality of software under test, wherein the one or more recommendations comprises at least one of reduction in resources within the software under test, increase of resources within the software under test, or sharing of resources within the plurality of software under test.
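Claim 11's three recommendation outcomes can be sketched as a simple threshold rule over stability scores. The `low` and `high` thresholds and the mapping of score ranges to outcomes are illustrative assumptions; the claim only names the three possible recommendations.

```python
# Sketch of claim 11; the thresholds `low` and `high` and the mapping
# from score ranges to outcomes are illustrative assumptions.

def recommend(sut_scores, low=0.4, high=0.8):
    recommendations = {}
    for sut, score in sut_scores.items():
        if score > high:
            # Very stable: assume its environment is over-provisioned.
            recommendations[sut] = "reduce resources"
        elif score < low:
            # Unstable: assume the environment needs more resources.
            recommendations[sut] = "increase resources"
        else:
            recommendations[sut] = "share resources across software under test"
    return recommendations
```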
12. A software testing system to determine an effectiveness index of a software test environment, comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, cause the processor to:
receive a plurality of factors associated with a plurality of software modules of a plurality of software under test from a plurality of external systems;
identify the software test environment based on one or more values associated with each of the plurality of factors;
collect failure logs of each of the plurality of factors for the identified software test environment;
assign a score to each of the plurality of factors based on collected failure logs; and
determine an effectiveness index of the identified software test environment for each of the plurality of software modules based on the score.
13. The software testing system of claim 12, wherein the plurality of factors associated with the software modules comprises previous software modules deployment results, flags, table structure, complexity, external connectivity, critical functionality, and base functionalities.
14. The software testing system of claim 12, wherein the processor is further configured to collate the plurality of factors into a data structure, wherein the plurality of factors corresponds to software modules relation information, and wherein the plurality of factors affects stability of the software test environment.
15. The software testing system of claim 12, wherein the processor is further configured to classify the plurality of software modules into the plurality of software under test based on input received from a user, wherein each of the plurality of software modules is associated with a unique software under test from the plurality of software under test.
16. The software testing system of claim 12, wherein the processor is further configured to create a repository of the plurality of software modules, wherein the repository comprises software module information of each of the plurality of software modules.
17. The software testing system of claim 12, wherein the processor is further configured to aggregate the one or more values associated with each of the plurality of factors to compute a software module stability score for each of the plurality of software modules, wherein the software module stability score is indicative of stability of the identified test environment for each of the plurality of software modules.
18. The software testing system of claim 17, wherein the processor is further configured to compute a software under test stability score based on the software module stability score of each of the plurality of software modules.
19. The software testing system of claim 12, wherein the processor is further configured to determine a degree of planning of each of the plurality of software under test based on input received from a user, and software module stability score for each of the plurality of software modules associated with the plurality of software under test.
20. The software testing system of claim 19, wherein the processor is further configured to provide one or more recommendations to optimize the degree of planning of each of the plurality of software under test based on the software module stability score for each of the plurality of software modules associated with the plurality of software under test, wherein the one or more recommendations comprises at least one of reduction in resources within the software under test, increase of resources within the software under test, or sharing of resources within the plurality of software under test.
Dated this 30th day of March, 2017
R Ramya Rao
Of K&S Partners
Agent for the Applicant
Description:
TECHNICAL FIELD
The present subject matter is related, in general, to software testing and, more specifically but not exclusively, to a method and a system for determining an effectiveness index of a software test environment.
| # | Name | Date |
|---|---|---|
| 1 | 201741011326-IntimationOfGrant04-01-2024.pdf | 2024-01-04 |
| 2 | Power of Attorney [30-03-2017(online)].pdf | 2017-03-30 |
| 3 | 201741011326-PatentCertificate04-01-2024.pdf | 2024-01-04 |
| 4 | Form 5 [30-03-2017(online)].pdf | 2017-03-30 |
| 5 | Form 3 [30-03-2017(online)].pdf | 2017-03-30 |
| 6 | 201741011326-FORM 3 [25-04-2023(online)].pdf | 2023-04-25 |
| 7 | Form 18 [30-03-2017(online)].pdf_63.pdf | 2017-03-30 |
| 8 | 201741011326-Written submissions and relevant documents [25-04-2023(online)].pdf | 2023-04-25 |
| 9 | Form 18 [30-03-2017(online)].pdf | 2017-03-30 |
| 10 | 201741011326-AMENDED DOCUMENTS [24-02-2023(online)].pdf | 2023-02-24 |
| 11 | Form 1 [30-03-2017(online)].pdf | 2017-03-30 |
| 12 | 201741011326-Correspondence to notify the Controller [24-02-2023(online)].pdf | 2023-02-24 |
| 13 | Drawing [30-03-2017(online)].pdf | 2017-03-30 |
| 14 | 201741011326-FORM 13 [24-02-2023(online)].pdf | 2023-02-24 |
| 15 | Description(Complete) [30-03-2017(online)].pdf_62.pdf | 2017-03-30 |
| 16 | 201741011326-POA [24-02-2023(online)].pdf | 2023-02-24 |
| 17 | 201741011326-US(14)-HearingNotice-(HearingDate-10-04-2023).pdf | 2023-02-14 |
| 18 | Description(Complete) [30-03-2017(online)].pdf | 2017-03-30 |
| 19 | 201741011326-CLAIMS [04-01-2021(online)].pdf | 2021-01-04 |
| 20 | PROOF OF RIGHT [22-06-2017(online)].pdf | 2017-06-22 |
| 21 | 201741011326-COMPLETE SPECIFICATION [04-01-2021(online)].pdf | 2021-01-04 |
| 22 | Correspondence By Agent_Form1,Form30_23-06-2017.pdf | 2017-06-23 |
| 23 | 201741011326-CORRESPONDENCE [04-01-2021(online)].pdf | 2021-01-04 |
| 24 | 201741011326-REQUEST FOR CERTIFIED COPY [27-07-2017(online)].pdf | 2017-07-27 |
| 25 | 201741011326-DRAWING [04-01-2021(online)].pdf | 2021-01-04 |
| 26 | 201741011326-FER.pdf | 2020-07-15 |
| 27 | 201741011326-FER_SER_REPLY [04-01-2021(online)].pdf | 2021-01-04 |
| 28 | 201741011326-RELEVANT DOCUMENTS [04-01-2021(online)].pdf | 2021-01-04 |
| 29 | 201741011326-FORM 3 [04-01-2021(online)].pdf | 2021-01-04 |
| 30 | 201741011326-PETITION UNDER RULE 137 [04-01-2021(online)].pdf | 2021-01-04 |
| 31 | 201741011326-Information under section 8(2) [04-01-2021(online)].pdf | 2021-01-04 |
| 32 | 201741011326-OTHERS [04-01-2021(online)].pdf | 2021-01-04 |
| 33 | 2020-07-1412-07-04E_14-07-2020.pdf | |