Abstract: The method and system of the present disclosure relate to software testing. In an embodiment, the method includes receiving historical effort data and project complexity data associated with a plurality of projects. Further, normalization factors corresponding to the plurality of projects are computed based on sizes of the plurality of projects. Also, a set of user ratings corresponding to a set of predefined parameters is collected for computing a set of weightages for the plurality of projects. Finally, based on the weightages, one or more complexity scale-wise normalization factors for the plurality of projects are identified, thereby determining a level of quality assurance for performing the software testing. The method and system disclosed herein facilitate efficient handling of fluctuations and software issues occurring during the software testing of the plurality of projects and reduce various managerial and operational overheads during the software testing. FIG. 3
Claims:
WE CLAIM:
1. A method of determining effort for performing software testing, the method comprising:
receiving, by an effort determining system (105),
historical effort data (103), of one or more projects (102), indicating past effort taken during performing one or more phases of Software Development Life Cycle (SDLC) associated with the one or more projects (102), and
project complexity data (104) of the one or more projects (102) across a set of predefined parameters (211) associated with the one or more projects (102), wherein the project complexity data (104) comprises one or more sizes, associated with the one or more projects (102), indicating complexity of the one or more projects (102);
computing, by the effort determining system (105), one or more normalization factors corresponding to the one or more projects (102) based on the one or more sizes of the one or more projects (102);
receiving, by the effort determining system (105), from a user (107), a set of ratings (108) corresponding to the set of predefined parameters (211) for each of the one or more projects (102); and
computing, by the effort determining system (105),
a set of weightages corresponding to the set of predefined parameters (211), based on the set of ratings (108), for each of the one or more projects (102),
one or more complexity scale-wise normalization factors corresponding to the one or more projects (102) by correlating one or more complexity scales corresponding to the one or more projects (102) and the one or more normalization factors, wherein the one or more complexity scales are determined based on the set of ratings (108) and the set of weightages, and
one or more Test Unit Points (TUPs) corresponding to the one or more projects (102) based on the one or more sizes and the one or more complexity scale-wise normalization factors, wherein the one or more TUPs indicate a level of quality assurance for performing the software testing.
2. The method as claimed in claim 1, further comprising determining one or more current project efforts (109), using the one or more TUPs, corresponding to the one or more projects (102).
3. The method as claimed in claim 2, further comprising determining a deviation in effort by comparing the one or more current project efforts (109) with one or more actual project efforts, wherein the one or more actual project efforts are determined from the historical effort data (103).
4. The method as claimed in claim 1, wherein the one or more projects (102) are selected, from a plurality of projects, based on maturity of the one or more projects (102), wherein the maturity is determined based on receiving user (107) input in response to a set of predefined factors.
5. The method as claimed in claim 1, wherein the one or more sizes, of the one or more projects (102), are categorized into at least one of a small, medium, large, and extra-large category.
6. The method as claimed in claim 1, wherein the set of predefined parameters (211) comprises at least one of a number of third-party interfaces, one or more skills, one or more technologies, one or more computing platforms, a number of impacting modules associated with the one or more projects (102), a reusability percentage of test cases, an automation percentage of test cases, and a requirement volatility percentage.
7. The method as claimed in claim 1, further comprising enabling the user (107) to dynamically change the set of predefined parameters (211).
8. An effort determining system (105) for determining effort for performing software testing, the system comprising:
a processor (203); and
a memory communicatively coupled to the processor (203), wherein the memory stores processor-executable instructions, which, on execution, cause the processor (203) to:
receive historical effort data (103), of one or more projects (102), indicating past effort taken during performing one or more phases of Software Development Life Cycle (SDLC) associated with the one or more projects (102),
and project complexity data (104) of the one or more projects (102) across a set of predefined parameters (211) associated with the one or more projects (102), wherein the project complexity data (104) comprises one or more sizes, associated with the one or more projects (102), indicating complexity of the one or more projects (102);
compute one or more normalization factors corresponding to the one or more projects (102) based on the one or more sizes of the one or more projects (102);
receive, from a user (107), a set of ratings (108) corresponding to the set of predefined parameters (211) for each of the one or more projects (102); and
compute a set of weightages corresponding to the set of predefined parameters (211), based on the set of ratings (108), for each of the one or more projects (102),
one or more complexity scale-wise normalization factors corresponding to the one or more projects (102) by correlating one or more complexity scales corresponding to the one or more projects (102) and the one or more normalization factors, wherein the one or more complexity scales are determined based on the set of ratings (108) and the set of weightages, and
one or more Test Unit Points (TUPs) corresponding to the one or more projects (102) based on the one or more sizes and the one or more complexity scale-wise normalization factors, wherein the one or more TUPs indicate a level of quality assurance for performing the software testing.
9. The effort determining system (105) as claimed in claim 8, wherein the processor (203) is further configured to determine one or more current project efforts (109), using the one or more TUPs, corresponding to the one or more projects (102).
10. The effort determining system (105) as claimed in claim 9, wherein the processor (203) is further configured to determine a deviation in effort by comparing the one or more current project efforts (109) with one or more actual project efforts, wherein the one or more actual project efforts are determined from the historical effort data (103).
11. The effort determining system (105) as claimed in claim 8, wherein the one or more projects (102) are selected, from a plurality of projects, based on maturity of the one or more projects (102), wherein the maturity is determined based on receiving user (107) input in response to a set of predefined factors.
12. The effort determining system (105) as claimed in claim 8, wherein the one or more sizes, of the one or more projects (102), are categorized into at least one of a small, medium, large, and extra-large category.
13. The effort determining system (105) as claimed in claim 8, wherein the set of predefined parameters (211) comprises at least one of a number of third-party interfaces, one or more skills, one or more technologies, one or more computing platforms, a number of impacting modules associated with the one or more projects (102), a reusability percentage of test cases, an automation percentage of test cases, and a requirement volatility percentage.
14. The effort determining system (105) as claimed in claim 8, wherein the processor (203) is further configured to enable the user (107) to dynamically change the set of predefined parameters (211).
Dated this 7th day of March, 2017
SWETHA S. N
OF K&S PARTNERS
AGENT FOR THE APPLICANT
Description:
TECHNICAL FIELD
The present subject matter is related, in general, to software testing and, more particularly but not exclusively, to a method and system for determining effort for performing software testing.