Abstract: A system, medium and method for automatically generating test data to be applied to test a target software code are disclosed. Input parameter data is received from a user via a displayed user interface, wherein the input parameter data is directed to a user selected data type, the data type being a Boolean, string, or integer. One or more pre-established stored testing algorithms are automatically selected based on the user selected data type, and one or more values are applied to the selected one or more pre-established stored testing algorithms in accordance with the user selected data type. At least one set of test data is automatically generated from the one or more identified applicable testing algorithms, wherein the at least one set of test data generated from the identified testing algorithms can be used as inputs for testing the target software code. FIG. 1
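The abstract's flow, selecting pre-established testing algorithms solely from the user-selected data type and applying the parameter's values to generate test inputs, can be sketched as follows. This is a minimal illustrative sketch: the generator functions, the parameter-dictionary keys (`type`, `min`, `max`, `max_len`), and the particular boundary values are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical sketch: pre-established "testing algorithms" keyed by
# the user-selected data type (Boolean, string, or integer), applied
# to the received input parameter data to emit a set of test inputs.

def boolean_values(_param):
    # Boundary set for Booleans: both truth values.
    return [True, False]

def integer_values(param):
    # Boundary-value analysis around the declared range.
    lo, hi = param["min"], param["max"]
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1, 0]

def string_values(param):
    # Length-boundary and content-class strings.
    n = param["max_len"]
    return ["", "a", "a" * n, "a" * (n + 1), " spaced ", "1234", "!@#"]

# The pre-established, stored algorithms, selected solely by data type.
ALGORITHMS = {
    "boolean": [boolean_values],
    "integer": [integer_values],
    "string":  [string_values],
}

def generate_test_data(param):
    """Select algorithms by the parameter's data type and apply its values."""
    values = []
    for algo in ALGORITHMS[param["type"]]:
        values.extend(algo(param))
    return values

print(generate_test_data({"type": "integer", "min": 1, "max": 10}))
# -> [0, 1, 2, 9, 10, 11, 0]
```

The returned values would then serve as inputs for exercising the target software code; a real implementation could register several algorithms per data type and merge their outputs.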
CLAIMS
We claim:
1. A method for automatically generating test data to be applied to test a target software code, the method comprising:
receiving, with a testing apparatus, input parameter data from a user via a displayed user interface, wherein the input parameter data is directed to a user selected data type, the data type being a Boolean, string, or integer;
selecting, with the testing apparatus, one or more preestablished stored testing algorithms solely based on the user selected data type;
applying one or more values to the selected one or more preestablished stored testing algorithms in accordance with the user selected data type; and
automatically generating, with the testing apparatus, at least one set of test data from the one or more identified applicable testing algorithms, wherein the at least one set of test data generated from the identified testing algorithms can be used as inputs for testing the target software code.
2. The method of claim 1, wherein the one or more preestablished testing algorithms is associated with software engineering based algorithms.
3. The method of claim 1, wherein the one or more preestablished testing algorithms is associated with programming language and construct based algorithms.
4. The method of claim 1, wherein the one or more preestablished testing algorithms is associated with application domain and business case based algorithms.
5. The method of claim 1, wherein the at least one set of test data generated by the testing apparatus includes a plurality of sets of test data, the method further comprising:
performing a priority based parsing technique on the plurality of sets of test data generated, wherein the parsing technique is associated with the received input parameter data; and
generating a reduced set of test data based on the parsing technique, wherein the reduced set of test data can be used to test the target software code.
6. The method of claim 5, wherein the input parameter data utilized in the parsing technique identifies whether a particular input parameter is mandatory or optional.
7. The method of claim 5, wherein the input parameter data utilized in the parsing technique indicates that a particular input parameter is to have a required value to be included or removed from the generated sets of testing data.
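The priority based parsing of claims 5 through 7, pruning the generated sets of test data using per-parameter metadata such as mandatory/optional status and values required to be included or removed, can be sketched as below. The metadata keys (`values`, `mandatory`, `remove`) and the sample parameters are illustrative assumptions, not claim language.

```python
from itertools import product

# Hypothetical sketch of the priority based parsing step: prune the
# cross-product of generated values using per-parameter input data --
# whether a parameter is mandatory, and which values must be removed.

params = {
    "flag":  {"values": [True, False, None], "mandatory": True},
    "count": {"values": [0, 1, -1],          "mandatory": False,
              "remove": [-1]},  # -1 must never appear in a test case
}

def reduced_test_sets(params):
    """Yield the reduced set of test data after priority based pruning."""
    names = list(params)
    for combo in product(*(params[n]["values"] for n in names)):
        case = dict(zip(names, combo))
        # Mandatory parameters must carry an actual value.
        if any(case[n] is None and params[n]["mandatory"] for n in names):
            continue
        # Drop cases containing values flagged for removal.
        if any(case[n] in params[n].get("remove", ()) for n in names):
            continue
        yield case

cases = list(reduced_test_sets(params))
# The full cross-product has 9 combinations; pruning leaves 4.
```

The reduction matters because the cross-product of generated values grows multiplicatively with the number of parameters; the parsing step keeps only cases consistent with the received input parameter data.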
8. A non-transitory machine readable medium having stored thereon instructions for automatically generating test data to be applied to test a target software code, the medium comprising machine executable code which when executed by a processor of a testing apparatus, causes the processor to perform steps comprising:
receiving input parameter data from a user via a displayed user interface, wherein the input parameter data is directed to a user selected data type, the data type being a Boolean, string, or integer;
selecting one or more preestablished stored testing algorithms solely based on the user selected data type;
applying one or more values to the selected one or more preestablished stored testing algorithms in accordance with the user selected data type; and
automatically generating at least one set of test data from the one or more identified applicable testing algorithms, wherein the at least one set of test data generated from the identified testing algorithms can be used as inputs for testing the target software code.
9. The medium of claim 8, wherein the one or more preestablished testing algorithms is associated with software engineering based algorithms.
10. The medium of claim 8, wherein the one or more preestablished testing algorithms is associated with programming language and construct based algorithms.
11. The medium of claim 8, wherein the one or more preestablished testing algorithms is associated with application domain and business case based algorithms.
12. The medium of claim 8, wherein the at least one set of test data generated by the processor of the testing apparatus includes a plurality of sets of test data, the method performed by the processor further comprising:
performing a priority based parsing technique on the plurality of sets of test data generated, wherein the parsing technique is associated with the received input parameter data; and
generating a reduced set of test data based on the parsing technique, wherein the reduced set of test data can be used to test the target software code.
13. The medium of claim 12, wherein the input parameter data utilized in the parsing technique identifies whether a particular input parameter is mandatory or optional.
14. The medium of claim 12, wherein the input parameter data utilized in the parsing technique indicates that a particular input parameter is to have a required value to be included or removed from the generated sets of testing data.
15. A computer based testing apparatus configured to automatically generate test data to be applied to test a target software code, the apparatus comprising:
a memory having stored thereon executable programmed instructions for automatically generating test data to be applied to test a target software code; and
a processor coupled to the memory and configured to execute the programmed instructions which, when executed by the processor, causes the processor to perform steps comprising:
receiving input parameter data from a user via a displayed user interface, wherein the input parameter data is directed to a user selected data type, the data type being a Boolean, string, or integer;
selecting one or more preestablished stored testing algorithms solely based on the user selected data type;
applying one or more values to the selected one or more preestablished stored testing algorithms in accordance with the user selected data type; and
automatically generating at least one set of test data from the one or more identified applicable testing algorithms, wherein the at least one set of test data generated from the identified testing algorithms can be used as inputs for testing the target software code.
16. The apparatus of claim 15, wherein the one or more preestablished testing algorithms is associated with software engineering based algorithms.
17. The apparatus of claim 15, wherein the one or more preestablished testing algorithms is associated with programming language and construct based algorithms.
18. The apparatus of claim 15, wherein the one or more preestablished testing algorithms is associated with application domain and business case based algorithms.
19. The apparatus of claim 15, wherein the at least one set of test data generated by the processor of the testing apparatus includes a plurality of sets of test data, the method performed by the processor further comprising:
performing a priority based parsing technique on the plurality of sets of test data generated, wherein the parsing technique is associated with the received input parameter data; and
generating a reduced set of test data based on the parsing technique, wherein the reduced set of test data can be used to test the target software code.
20. The apparatus of claim 19, wherein the input parameter data utilized in the parsing technique identifies whether a particular input parameter is mandatory or optional.
21. The apparatus of claim 19, wherein the input parameter data utilized in the parsing technique indicates that a particular input parameter is to have a required value to be included or removed from the generated sets of testing data.
Dated this 11th day of June, 2013
MADHUSUDAN S.T.
OF K & S PARTNERS
ATTORNEY FOR THE APPLICANTS
This technology generally relates to a system, non-transitory computer medium and method for test data generation and optimization for data driven testing.
| # | Name | Date |
|---|---|---|
| 1 | 2540-CHE-2013 FORM-9 11-06-2013.pdf | 2013-06-11 |
| 2 | IP23992-Spec.pdf | 2013-06-15 |
| 3 | IP23992-Drawings.pdf | 2013-06-15 |
| 4 | FORM 5.pdf | 2013-06-15 |
| 5 | FORM 3.pdf | 2013-06-15 |
| 6 | abstract2540-CHE-2013.jpg | 2013-06-20 |
| 7 | 2540-CHE-2013 FORM-18 19-07-2013.pdf | 2013-07-19 |
| 8 | 2540-CHE-2013 CORRESPONDENCE OTHERS 19-07-2013.pdf | 2013-07-19 |
| 9 | 2540-CHE-2013 FORM-3 29-08-2013.pdf | 2013-08-29 |
| 10 | 2540-CHE-2013-FER.pdf | 2019-03-29 |
| 11 | search_29-03-2019.pdf | |
| 12 | 2540-CHE-2013-FER_SER_REPLY [30-09-2019(online)].pdf | 2019-09-30 |
| 13 | 2540-CHE-2013-FORM 3 [30-09-2019(online)].pdf | 2019-09-30 |
| 14 | 2540-CHE-2013-US(14)-HearingNotice-(HearingDate-11-01-2023).pdf | 2022-11-30 |
| 15 | 2540-CHE-2013-AMENDED DOCUMENTS [24-12-2022(online)].pdf | 2022-12-24 |
| 16 | 2540-CHE-2013-Correspondence to notify the Controller [24-12-2022(online)].pdf | 2022-12-24 |
| 17 | 2540-CHE-2013-FORM 13 [24-12-2022(online)].pdf | 2022-12-24 |
| 18 | 2540-CHE-2013-POA [24-12-2022(online)].pdf | 2022-12-24 |
| 19 | 2540-CHE-2013-AMMENDED DOCUMENTS [25-01-2023(online)].pdf | 2023-01-25 |
| 20 | 2540-CHE-2013-FORM 13 [25-01-2023(online)].pdf | 2023-01-25 |
| 21 | 2540-CHE-2013-Written submissions and relevant documents [25-01-2023(online)].pdf | 2023-01-25 |
| 22 | 2540-CHE-2013-IntimationOfGrant28-02-2023.pdf | 2023-02-28 |
| 23 | 2540-CHE-2013-PatentCertificate28-02-2023.pdf | 2023-02-28 |