Abstract: The present subject matter discloses a method for performing software testing based on a Multiple Perspective Feature Model (MPFM). The method includes generating a Feature Model (FM) associated with a Software Product Line (SPL), where the FM includes a plurality of features. Further, separation of concerns (SoC) is achieved in the FM based on identifying at least one source of variation with a common cause of variation. Furthermore, the common cause of variation is determined as a perspective of the FM to generate the MPFM, where each perspective from amongst a plurality of perspectives of the MPFM includes at least one feature from amongst the plurality of features. Further, the method includes identification of test cases based on the plurality of perspectives and the plurality of features of the MPFM, where the plurality of perspectives are utilized as parameters and the plurality of features are utilized as values.
FORM 2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENTS RULES, 2003
COMPLETE SPECIFICATION (See section 10, rule 13) 1. Title of the invention: FEATURE MODEL BASED TESTING
2. Applicant(s)
NAME NATIONALITY ADDRESS
TATA CONSULTANCY Indian Nirmal Building, 9th Floor, Nariman
SERVICES LIMITED Point, Mumbai, Maharashtra 400021,
India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.
TECHNICAL FIELD
[0001] The present subject matter relates, in general, to testing of software product(s) and
particularly to testing of software product(s) based on a Feature Model (FM).
BACKGROUND
[0002] Software systems are continuously modified during the software development process for reasons such as correction of errors, addition of new features, porting to new environments, and improvement of performance. The development of Software Product Lines (SPL) is emerging as a viable and important development paradigm, allowing companies to realize order-of-magnitude improvements in time to market, cost, productivity, quality, reliability and other business drivers. Further, an SPL allows software customization based on a user's or an organization's preferences.
[0003] Within SPL, features of the SPL play an important role in specifying the fixed and
variable parts of the architectures of product families and configurable systems. Typically, a Feature Model (FM), which is a representation of the features of the products of the SPL, is generated to determine the valid feature combinations for testing.
[0004] Changes made to software systems are tested to ensure that the software systems behave as defined and that the modifications have not had an adverse impact on the performance of the software. Software testing is an important part of the life cycle of any software system. During software testing, the developed software is tested to determine if the developed software meets the technical requirements which define its development and design.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The detailed description is described with reference to the accompanying figures.
In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
[0006] Fig. 1 illustrates components of a software testing system, in accordance with an
embodiment of the present subject matter.
[0007] Fig. 2 illustrates a method to perform software testing based on a Multiple
Perspective Feature Model (MPFM), in accordance with an embodiment of the present subject matter.
[0008] It should be appreciated by those skilled in the art that any block diagrams herein
represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0009] Method(s) and system(s) to perform software testing based on a Multiple Perspective Feature Model (MPFM) are described. The methods can be implemented in systems that include, but are not limited to, computing devices, such as desktop computers, hand-held devices, laptops or other portable computers, advanced cellular phones, tablets, notebooks, and the like, capable of testing software applications. Although the description herein is with reference to computers, the methods and systems may be implemented in other computing devices and systems as well, albeit with a few variations, as will be understood by a person skilled in the art.
[0010] The increasingly competitive business environment of today has made it imperative to run business in a cost-effective and customer-centric manner. The need to cut costs and to adapt to customer requirements has led to customization of software. The aim of reducing costs and resource dependency and of increasing reliability has led to the development of the Software Product Line (SPL), the ultimate goal being to develop software that is in line with the requisites of a user or an organization. For example, Enterprise Resource Planning (ERP) packages that integrate information across an entire organization are customized for every organization. Further, internationalized software is customized for every country to suit the requisites of the organization in the respective countries. Furthermore, consumer software products are personalized as per user preferences. Although the basic functionality of the customized packages is similar to that of a core package, the number of variations and dimensions of variation in customized packages tends to be high.
[0011] Since the software product is customizable, multiple variations in the working of the SPL may occur due to associations among features of the software. Managing variability in the development of the SPL is key to the success of the developed software. Feature Modeling is one popular technique to model variability in the SPL. Feature Modeling is a tool to develop Feature Models (FMs), where the developed FMs represent a product in an SPL by means of features. Generally, inclusion of feature(s) in the SPL results in interactions among the features, thus varying the behavior of other features. Also, in a real world system, developing a FM for complex systems, such as the entire SPL, is a tedious and cumbersome task.
[0012] More often than not, SPLs, once developed, are tested to validate their correctness and to ensure that a desired output is generated. This implies that when a new feature is included or changed, it has to be ensured that the new feature works in all configurations, in conjunction with all the other variations that each configuration might have. In an illustrative example, inclusion of a feature like currency in an internationalized website can affect several other features, like Stock Quote, Gold Rate Chart, etc., which are affected by the change in currency. Also, the new feature has to be tested in combination with other features in different browsers, like Internet Explorer, Google Chrome and Firefox, on different Operating Systems (OS), such as Windows, Linux and Android, and on different devices, such as Desktop, Tablet and Smartphone. The feature interactions in a FM are highly convoluted in a real world SPL, and testing them can prove to be a formidable task where the number of variations and configurations is large.
[0013] Also, since the number of feature interactions is large, it may not be possible to test all the feature interactions in a limited time period. The manual process of identifying feature interactions and testing the identified feature interactions poses multiple issues, such as inability to achieve coverage, difficulty in prioritizing, dependency on experienced testers, and difficulty in measuring test coverage. Further, the distributed nature of teams in the software service industry poses additional challenges, such as lack of access to software artifacts, difficulty in analyzing change impact, and insufficient domain understanding. As variability is geared more towards software, and as more features are being included within a single product line, testing based on current traditional feature modeling is difficult to implement.
[0014] According to an implementation of the present subject matter, methods and systems to perform software testing based on a Multiple Perspective Feature Model (MPFM) are described. The present subject matter relates to modularization of the FM by performing Separation of Concerns (SoC) in the FM to obtain the MPFM. In operation, according to an implementation of the present subject matter, the Feature Model (FM) associated with a SPL is generated, wherein the FM is representative of perspectives, features and sub-features of the SPL. In said implementation, the generation of the FM may be achieved by means of tools existing in the state of the art, for example, a feature modeler.
[0015] The generated FM associated with the SPL may have variability information
scattered all over the FM. Therefore, Separation of Concerns (SoC) in the FM associated with
the SPL is performed, in order to modularize the FM. In one implementation, the SoC in the FM
is based on identifying a source of variation with a common cause of variation. In said
implementation, the features that are sources of variations may have a common cause of
variation. For example, a FM for an internationalized website may include Stock Quote and Gold Rate Chart as features under a domain perspective. The FM may also include Indian Rupee, Euro and Pound as sub-features of the features Stock Quote and Gold Rate Chart. In such a scenario, the features Stock Quote and Gold Rate Chart vary due to the sub-features Indian Rupee, Euro and Pound in the website. Therefore, in order to modularize the FM, SoC is performed, where 'currency' may be identified as a common cause of variation for the features Stock Quote and Gold Rate Chart, and Indian Rupee, Euro and Pound may be identified as sources of variation.
[0016] In one implementation of the present subject matter, the determined common cause of variation in the FM may be identified as a perspective. For instance, in the above described example, the common cause of variation 'currency' may be identified as a new perspective of the FM. Since the SPL may implement a plurality of such perspectives, where multiple new perspectives may be identified based on SoC, the generated FM is referred to as a MPFM. In one implementation, each identified perspective is further modeled to one or more features.
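The separation of concerns step described above can be sketched in code. The following is a minimal illustration only, assuming the FM is held as a Python dictionary mapping each feature to its variant sub-features; the function name `extract_common_perspective` and the output structure are hypothetical, not part of the described system.

```python
# Sketch of SoC: features that share an identical set of variant
# sub-features are treated as having a common cause of variation,
# which is factored out as a new perspective of the MPFM.

def extract_common_perspective(fm, perspective_name):
    """Group features by their shared sub-feature sets and separate
    the shared set out as a perspective (assumes a single common
    cause, named by perspective_name, for illustration)."""
    shared = {}
    for feature, subs in fm.items():
        shared.setdefault(frozenset(subs), []).append(feature)

    mpfm = {"perspectives": {}, "features": {}}
    for subs, features in shared.items():
        if len(features) > 1 and subs:
            # Several source features vary for the same cause:
            # the concern is separated out as a perspective.
            mpfm["perspectives"][perspective_name] = sorted(subs)
            for f in features:
                mpfm["features"][f] = perspective_name
        else:
            for f in features:
                mpfm["features"][f] = sorted(subs)
    return mpfm

# The internationalized-website example from the description:
fm = {
    "Stock Quote": ["Indian Rupee", "Euro", "Pound"],
    "Gold Rate Chart": ["Indian Rupee", "Euro", "Pound"],
}
mpfm = extract_common_perspective(fm, "currency")
```

Here both features vary for the same cause, so 'currency' becomes a perspective linked to the shared sub-features, mirroring the example in the description.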
[0017] Furthermore, test cases may also be identified based on the perspectives and the features of the MPFM. In one implementation, the identification of test cases is based on perspective selection parameters. For example, consider a MPFM that includes perspectives, such as domain, internationalization, architecture and operating environment, and features, such as Weather Update, Celsius, Blogs, Articles, 3-column page layout, 2-column page layout, Internet Explorer, Windows, Linux and Firefox. If a test case has to be generated for a user planning on testing Weather Update in Celsius, in a 3-column page layout, on an Internet Explorer browser, on a device running a Windows Operating System, the test case can be identified as [Weather Update, Celsius, 3-column, Internet Explorer, Windows]. Such a selection of test cases based on perspectives and features may allow effective testing of the SPL.
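The parameter/value view of the test case above can be illustrated as follows. This is a sketch only; the treatment of the operating environment perspective as carrying two values (a browser and an OS) is an assumption made so that the flattened list matches the example.

```python
# Perspectives as parameters, features as values, per the example
# above. Field names are illustrative, not prescribed by the system.
test_case = {
    "domain": "Weather Update",
    "internationalization": "Celsius",
    "architecture": "3-column page layout",
    "operating environment": ("Internet Explorer", "Windows"),
}

# Flattening the parameter/value mapping yields the list form of the
# test case used in the description.
flat = []
for value in test_case.values():
    flat.extend(value if isinstance(value, tuple) else (value,))
```

The flattened result corresponds to [Weather Update, Celsius, 3-column page layout, Internet Explorer, Windows].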
[0018] The implementation of the described systems and methods of the present subject matter may provide an alternative way to model variations and measure test coverage. The present subject matter leverages the basic technique of modularization of redundant features by creating a MPFM, thus creating a model that is modular and easy to maintain. Further, since the described method relies on the use of the MPFM in testing by generating pair-wise combinations, an exhaustive coverage of possible contexts and a notion of complete testing are available. Furthermore, prioritization of the test cases helps the software testers to sort the features based on their importance and to arrive at a minimum set of tests that provides maximum coverage of variability, thereby reducing the effort.
[0019] It should be noted that the description merely illustrates the implementation of
principles of the present subject matter. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present subject matter and are included within its spirit and scope. Furthermore, all examples recited herein are principally intended expressly to be only for pedagogical purposes to aid the reader in understanding the principles of the present subject matter and the concepts contributed by the inventor(s) to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0020] The manner in which the systems and methods shall be implemented has been explained in detail with respect to Figs. 1 and 2. While aspects of the described systems and methods can be implemented in any number of different computing systems, transmission environments, and/or configurations, the embodiments are described in the context of the following system(s).
[0021] Fig. 1 illustrates components of a software testing system 102, in accordance with an embodiment of the present subject matter. In one implementation, the software testing system 102 includes processor(s) 104. The processor 104 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 104 is to fetch and execute computer-readable instructions stored in the memory.
[0022] In another embodiment of the present subject matter, the software testing system
102 may also include a memory 108. The memory 108 may be coupled to the processor 104. The memory 108 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0023] Further, the software testing system 102 may include module(s) 110 and data 112. The modules 110 and the data 112 may be coupled to the processor(s) 104. The modules 110, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The modules 110 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
[0024] In an implementation, the module(s) 110 include a modeling module 114, a modularizing module 116, a validating module 118, a testing module 120, a ranking module 122 and other module(s) 124. The other module(s) 124 may include programs or coded instructions that supplement applications or functions performed by the software testing system 102. In said implementation, the data 112 includes modular data 126, test data 128, and other data 130. The other data 130, amongst other things, may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s) 110. Although the data is shown internal to the software testing system 102, it may be understood that the data 112 can reside in an external repository (not shown in the figure), which may be coupled
to the software testing system 102. The software testing system 102 may communicate with the external repository through the interface(s) 106 to obtain information from the data 112.
[0025] The software testing system 102 facilitates software application testing based on the Multiple Perspective Feature Model (MPFM). In an implementation, the modeling module 114 of the software testing system 102 is to generate a Feature Model (FM) associated with a Software Product Line (SPL), where the generated FM is representative of a plurality of perspectives, features and sub-features. In said implementation, the generation of the FM may be performed by existing tools available in the state of the art, such as a feature modeler. In an illustrative example, the FM for an internationalized website (Site X) may contain a domain perspective. The domain can further include features like Stock Quote and Gold Rate Chart, and each of these features can include further sub-features like Indian Rupee, Euro, Dollar and Pound. In another example, the FM for a website (Site Z) may contain Operating Environment as a perspective. The Operating Environment may further include features like browser, Operating System (OS) and device. Each of these features may include further sub-features; for example, the browser may include sub-features like Internet Explorer (IE), Opera and Firefox, and different configurations of each of the listed sub-features, like IE 7, IE 8 and so on. The OS may further include sub-features like Windows, Linux and Android.
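The hierarchical structure of the example FMs above can be represented in memory as a nested mapping of perspective to feature to sub-features. The representation below is an assumption made for illustration; a feature-modeling tool would normally produce such a structure.

```python
# Illustrative in-memory FMs for Site X and Site Z, as nested
# dictionaries: perspective -> feature -> list of sub-features.
site_x_fm = {
    "domain": {
        "Stock Quote": ["Indian Rupee", "Euro", "Dollar", "Pound"],
        "Gold Rate Chart": ["Indian Rupee", "Euro", "Dollar", "Pound"],
    },
}

site_z_fm = {
    "Operating Environment": {
        "browser": ["Internet Explorer", "Opera", "Firefox"],
        "OS": ["Windows", "Linux", "Android"],
        "device": ["Desktop", "Tablet", "Smartphone"],
    },
}
```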
[0026] According to an implementation of the present subject matter, the modularizing module 116 of the software testing system 102 may modularize the generated FM by performing Separation of Concerns (SoC) in the FM. In said implementation, the SoC may be performed based on identifying a common cause of variation for a source of variation. In said implementation, the source of variation is considered as a relationship between a source feature and a target feature, such that the source feature is the feature of the perspective that is implemented with many variations and the target features are the variations of the source feature.
[0027] In continuation with the previous example on the FM for the internationalized website (Site X), where Indian Rupee, Euro, Dollar and Pound are representative sub-features of the features Stock Quote and Gold Rate Chart, the features may be referred to as the source features, while the sub-features may be referred to as the target features. One source of variation here is the relationship between the source feature, i.e., Stock Quote, and the target features, i.e., Indian Rupee, Euro, Dollar and Pound. Similarly, another source of variation is the relationship between the source feature, i.e., Gold Rate Chart, and the target features, i.e., Indian Rupee, Euro, Dollar and Pound. In the described scenario, the software testing system 102 may further modularize the generated FM by identifying 'currency' to be the common cause of variation.
[0028] In another implementation, the modularizing module 116 of the software testing system 102 may determine the common cause of variation as a perspective to generate a Multiple Perspective Feature Model (MPFM). In said implementation, each perspective of the MPFM includes one or more features. In one implementation, the MPFM thus generated is stored in the modular data 126. In continuation with the previous example, 'currency', which is identified as the common cause of variation, is determined as a perspective of the FM for Site X by the modularizing module 116. The perspective currency may further be linked to the sub-features Indian Rupee, Euro, Dollar and Pound.
[0029] Further, test cases are identified based on utilizing perspectives as parameters and one or more features from the perspectives as values. In one implementation, the validating module 118 may identify test cases, validate the identified test cases, and store the validated test cases in the test data 128. In said implementation, the identification of the test cases is based on perspective selection parameters. Consider an illustrative example, where a MPFM of a SPL consists of perspectives like Operating System, Browser, Protocol, Central Processing Unit (CPU) and Data Base Management Systems (DBMS), and each of these perspectives includes one or more features as listed below. The validating module 118 may identify test cases based on utilizing perspectives as parameters and features as values. The various perspectives and the features under each of the perspectives may include an operating system (Windows, iOS, Android), a browser (Internet Explorer, Firefox), a protocol stack (IPv4, IPv6), a processor (Intel, AMD), and a database (MySQL, Sybase, Oracle). 10 test cases that cover all pair-wise combinations of features may be identified, and the cases are listed below in Table 1.
Test case Operating System Browser Protocol CPU DBMS
1 Windows Internet Explorer IPv4 Intel MySQL
2 Windows Firefox IPv6 AMD Sybase
3 Windows Internet Explorer IPv6 Intel Oracle
4 iOS Firefox IPv4 AMD MySQL
5 iOS Internet Explorer IPv4 Intel Sybase
6 iOS Firefox IPv4 Intel Oracle
7 Android Internet Explorer IPv6 AMD MySQL
8 Android Firefox IPv4 Intel Sybase
9 Android Firefox IPv4 AMD Oracle
10 iOS Firefox IPv6 AMD Oracle
Table 1: Test cases
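The pair-wise coverage claim for Table 1 can be checked mechanically: every pair of feature values drawn from two different perspectives should occur together in at least one of the 10 test cases. The sketch below, with the data transcribed from Table 1, is illustrative only; the helper name `uncovered_pairs` is not part of the described system.

```python
# Verify that the 10 test cases of Table 1 cover all pair-wise
# combinations of features from distinct perspectives.
from itertools import combinations, product

perspectives = {
    "Operating System": ["Windows", "iOS", "Android"],
    "Browser": ["Internet Explorer", "Firefox"],
    "Protocol": ["IPv4", "IPv6"],
    "CPU": ["Intel", "AMD"],
    "DBMS": ["MySQL", "Sybase", "Oracle"],
}

test_cases = [
    ("Windows", "Internet Explorer", "IPv4", "Intel", "MySQL"),
    ("Windows", "Firefox", "IPv6", "AMD", "Sybase"),
    ("Windows", "Internet Explorer", "IPv6", "Intel", "Oracle"),
    ("iOS", "Firefox", "IPv4", "AMD", "MySQL"),
    ("iOS", "Internet Explorer", "IPv4", "Intel", "Sybase"),
    ("iOS", "Firefox", "IPv4", "Intel", "Oracle"),
    ("Android", "Internet Explorer", "IPv6", "AMD", "MySQL"),
    ("Android", "Firefox", "IPv4", "Intel", "Sybase"),
    ("Android", "Firefox", "IPv4", "AMD", "Oracle"),
    ("iOS", "Firefox", "IPv6", "AMD", "Oracle"),
]

def uncovered_pairs(perspectives, cases):
    """Return feature pairs (from distinct perspectives) not covered
    by any test case; an empty result means full pair-wise coverage."""
    names = list(perspectives)
    covered = {(i, a, j, b)
               for case in cases
               for (i, a), (j, b) in combinations(enumerate(case), 2)}
    missing = []
    for i, j in combinations(range(len(names)), 2):
        for a, b in product(perspectives[names[i]], perspectives[names[j]]):
            if (i, a, j, b) not in covered:
                missing.append((names[i], a, names[j], b))
    return missing
```

Running `uncovered_pairs(perspectives, test_cases)` on the Table 1 data returns an empty list, confirming that the 10 cases achieve full pair-wise coverage.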
[0030] Further, the test cases, once identified, may be validated by the validating module 118 of the software testing system 102, where every feature of the identified test case may be compared to features of a product configuration to generate constraints such that only valid test cases are identified. An exact matching of the features of the test cases to features of the product configuration is indicative of a valid test case. In an example, let us consider test case 1 as listed in Table 1. The test case is identified to be [Windows, Internet Explorer, IPv4, Intel, MySQL]. Now, each feature of the test case, namely, Windows, Internet Explorer, IPv4, Intel and MySQL, is mapped to the features of the product configuration. The existence of the features in the product configuration is indicative of a valid test case, and further steps can be taken to ensure that all possible pair-wise combinations of features in the validated test case are tested. However, if the features of the identified test case do not exist in the product configuration, the test case is determined to be invalid and further steps to perform testing of such an invalid test case are aborted. For example, in a situation where the Android Operating System and iOS in the product configuration do not support the Internet Explorer browser, the test cases 5 and 7 may be discarded as these test cases are deemed to be invalid.
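The validity check described above can be sketched as a simple constraint filter. This is a minimal illustration assuming the product configuration imposes the single constraint from the example (Internet Explorer is unsupported on Android and iOS); the function name `is_valid` is hypothetical.

```python
# Discard test cases whose feature combination violates a constraint
# of the product configuration, per the Table 1 example.

def is_valid(test_case):
    """A test case is invalid when it pairs Internet Explorer with an
    OS that does not support it in the product configuration."""
    os_, browser = test_case[0], test_case[1]
    if browser == "Internet Explorer" and os_ in ("Android", "iOS"):
        return False
    return True

cases = {
    1: ("Windows", "Internet Explorer", "IPv4", "Intel", "MySQL"),
    5: ("iOS", "Internet Explorer", "IPv4", "Intel", "Sybase"),
    7: ("Android", "Internet Explorer", "IPv6", "AMD", "MySQL"),
}
invalid = sorted(n for n, c in cases.items() if not is_valid(c))
```

Applying the filter discards test cases 5 and 7, matching the example in the description, while test case 1 remains valid.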
[0031] Further, a combinatorial test of the validated test cases may be performed, wherein the combinatorial test includes generation of pair-wise combinations of features of the validated test cases to ensure that all possible combinations in the validated test cases are tested. In one implementation, the testing module 120 of the software testing system 102 may test feature interactions by generating combinations of the features of the MPFM. In said implementation, the generated combinations are pair-wise combinations, and the generated combinations are tested by passing each perspective of the MPFM as a parameter and each feature of the perspective of the MPFM as a value. In continuation with the previous example, if all the 10 test cases listed in Table 1 are tested, it can be ensured that every pair-wise combination of features occurring amongst the 3 x 2 x 2 x 2 x 3 = 72 possible full configurations is tested.
[0032] Furthermore, the validated test cases may be prioritized based on statistical parameters. In one implementation, the ranking module 122 of the software testing system 102 may prioritize test cases based on statistical parameters. In said implementation, the statistical parameters include one of probability of error and probability of usage. In an illustrative example, a social networking application might have to be tested in different browsers, like Internet Explorer, Firefox and Google Chrome, and different configurations of each of the browsers, devices like tablets, desktops, and the like, different screens with different resolutions, different page layouts, different portals, languages, time zones, and so on and so forth. It may not be practically possible to perform testing under all conditions. Under such scenarios, it may be helpful for a software tester to prioritize the test cases.
[0033] In said implementation, testing based on the probability of usage suggests prioritization of testing of those features that are widely used by people of a particular geographic location prior to testing of the less used features. For example, consider a scenario where a particular social networking application has to be tested on different Operating Systems, like the Windows Operating System, Blackberry Operating System, Android Operating System and iOS. If this particular application is to be launched in a country where the usage of the Windows Operating System and the Android Operating System is wide, the features of the application are first tested on the Windows Operating System and the Android Operating System prior to testing the features of the application on the Blackberry Operating System and iOS.
[0034] In another implementation, prioritizing based on the probability of error suggests prioritization of testing of those features that have failed to generate the desired output prior to testing of those features that have generated an expected output, based on the performance history of the features in past test cases. For example, consider a feature like the ability to open multiple tabs in a browser. If this particular feature has to be tested in different browsers, like Internet Explorer, Firefox, Chrome, etc., it is first tested on those browsers where the ability to open multiple tabs has failed repeatedly in past test cases.
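The two prioritization schemes above reduce to ordering test cases by a statistical parameter. The sketch below is illustrative only; the field names `p_error` and `p_usage`, the numeric values, and the function `prioritize` are assumptions, not part of the described system.

```python
# Prioritize validated test cases by a statistical parameter:
# probability of error (past failures) or probability of usage.

def prioritize(test_cases, key="p_error"):
    """Order test cases so those with the highest value of the chosen
    statistical parameter are tested first."""
    return sorted(test_cases, key=lambda tc: tc[key], reverse=True)

browser_cases = [
    {"browser": "Internet Explorer", "p_error": 0.40, "p_usage": 0.30},
    {"browser": "Firefox",           "p_error": 0.10, "p_usage": 0.45},
    {"browser": "Chrome",            "p_error": 0.05, "p_usage": 0.25},
]

by_error = prioritize(browser_cases)             # most failure-prone first
by_usage = prioritize(browser_cases, "p_usage")  # most widely used first
```

With these illustrative numbers, prioritizing by probability of error tests Internet Explorer first, while prioritizing by probability of usage tests Firefox first.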
[0035] Further, the prioritized test cases are tested to determine if the developed SPL
generates the desired output. In one implementation, the testing module 120 may test the prioritized test cases based on combinatorial testing method to determine the success of the developed SPL.
[0036] Fig. 2 illustrates a method 200 to generate a Multiple Perspective Feature Model
(MPFM) and utilize the MPFM in software application testing, according to an embodiment of the present subject matter. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or any alternative methods. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware.
[0037] The method may be described in the general context of computer executable
instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0038] A person skilled in the art will readily recognize that steps of the method can be performed by programmed computers. Herein, some embodiments are also intended to cover program storage devices, for example, digital data storage media, which are machine or computer readable and encode machine-executable or computer-executable programs of instructions, where said instructions perform some or all of the steps of the described method. The program storage devices may be, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media. The embodiments are also intended to cover both communication networks and communication devices to perform said steps of the described method.
[0039] Referring to Fig. 2, at block 202, a Feature Model (FM) associated with a Software Product Line (SPL) is generated, wherein the FM is representative of the features of the SPL. In one implementation of the present subject matter, the FM is generated by means of tools existing in the state of the art. In another implementation, the modeling module 114 is to generate a FM based on the features of a SPL.
[0040] At block 204, Separation of Concerns (SoC) is performed in the FM, wherein the SoC is performed by identifying at least one source of variation with a common cause of variation. In one implementation, the sources of variation are identified as those features that may be implemented with several variations. In said implementation, the sources of variation that have a common cause of variation are separated out as concerns. In another implementation, the modularizing module 116 may perform SoC in the FM to generate a model that is modular and easy to maintain.
[0041] At block 206, the common cause of variation that was identified as a concern is determined to be a perspective. Since several such concerns are identified in a real world system for a SPL, several perspectives are determined, and the model thus generated is a Multiple Perspective Feature Model (MPFM). In one implementation, each perspective of the MPFM further includes at least one feature. In said implementation, the modularizing module 116 may generate a MPFM from a FM associated with a SPL.
[0042] At block 208, test cases are identified based on the perspectives and one or more features of the MPFM, where the identification of the test cases is based on utilizing perspectives as parameters and features from the perspectives as values. In said implementation, the test cases may be identified based on perspective selection parameters, and the identified test cases are further validated based on comparing the features of the identified test cases to features of the product configuration. Furthermore, the existence of the features of the identified test cases may be verified based on a comparison to the features of the product configuration to determine the validity of the identified test cases.
[0043] Although embodiments for methods and systems to perform software testing
based on the MPFM are described, it is to be understood that the present subject matter is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as embodiments to perform software testing based on the MPFM.
I/We claim:
1. A method for testing a Software Product Line (SPL), the method comprising:
generating a Feature Model (FM) associated with the SPL, wherein the FM includes a plurality of features;
achieving a separation of concerns (SoC) in the FM based on identifying at least one source of variation with a common cause of variation, wherein the achieving is performed by a processor (104);
determining the common cause of variation as a perspective of the FM to generate a Multiple Perspective Feature Model (MPFM), wherein each perspective from amongst a plurality of perspectives of the MPFM includes at least one feature from amongst the plurality of features, and wherein the MPFM is stored in modular data (126); and
identifying test cases based on the plurality of perspectives and the plurality of features of the MPFM, wherein the plurality of perspectives are utilized as parameters and the plurality of features are utilized as values, and wherein the identified test cases are stored in test data (128).
2. The method as claimed in claim 1, further comprising validating the identified test cases to generate validated test cases, wherein the validated test cases represent a valid set of features.
3. The method as claimed in claim 2, wherein the validating comprises:
comparing features of the identified test cases to features of a product configuration; and
verifying existence of the features of the identified test cases in the features of the product configuration to validate the identified test cases, wherein an existence of the features is indicative of a valid test case.
4. The method as claimed in claim 2, wherein the method further comprises prioritizing the
validated test cases based on statistical parameters.
5. The method as claimed in claim 4, wherein the statistical parameters include one of probability of error and probability of usage.
6. The method as claimed in claim 4, further comprising testing the prioritized test cases based on a combinatorial testing method.
7. A Software Testing System (102) for testing a SPL, comprising:
a processor (104);
a modeling module (114) coupled to the processor (104), to generate a FM associated with the SPL, wherein the FM includes a plurality of features;
a modularizing module (116) coupled to the processor (104) to:
achieve separation of concerns (SoC) in the FM based on identifying at least one source of variation with a common cause of variation; and
determine the common cause of variation as a perspective of the FM to generate a Multiple Perspective Feature Model (MPFM), wherein each perspective from amongst a plurality of perspectives of the MPFM includes at least one feature from amongst the plurality of features; and
a validating module (118) coupled to the processor (104), to identify test cases based on the plurality of perspectives and the plurality of features of the MPFM, wherein the plurality of perspectives are utilized as parameters and the plurality of features are utilized as values.
8. The Software Testing System (102) as claimed in claim 7, wherein the validating module
(118) further validates the identified test cases to generate validated test cases, and wherein the validated test cases represent a valid set of features.
9. The Software Testing System (102) as claimed in claim 8, wherein the validating module
(118) validates the identified test cases based on:
comparing features of the identified test cases to features of a product configuration; and
verifying existence of the features of the identified test cases in the features of the product configuration to validate the test cases, wherein an existence of the features is indicative of a valid test case.
10. The Software Testing System (102) as claimed in claim 7, further comprising a ranking module (122) coupled to the processor (104), to prioritize the validated test cases based on statistical parameters.
11. The Software Testing System (102) as claimed in claim 10, wherein the statistical parameters include one of probability of error and probability of usage.
12. The Software Testing System (102) as claimed in claim 10, further comprising a testing module (120) coupled to the processor (104) to test the prioritized test cases based on a combinatorial testing method.
13. A non-transitory computer readable medium having a set of computer readable instructions that, when executed, cause a computing system to:
generate a Feature Model (FM) associated with a Software Product Line, wherein the FM includes a plurality of features;
achieve separation of concerns (SoC) in the FM based on identifying at least one source of variation with a common cause of variation;
determine the common cause of variation as a perspective of the FM to generate a
Multiple Perspective Feature Model (MPFM), wherein each perspective from amongst a plurality of perspectives of the MPFM includes at least one feature from amongst the plurality of features; and
identify test cases based on the plurality of perspectives and the plurality of features of the MPFM, wherein the plurality of perspectives are parameters and the plurality of features are values.
| # | Name | Date |
|---|---|---|
| 1 | 1526-MUM-2013-RELEVANT DOCUMENTS [17-08-2023(online)].pdf | 2023-08-17 |
| 2 | 1526-MUM-2013-Request For Certified Copy-Online(23-04-2014).pdf | 2014-04-23 |
| 3 | 1526-MUM-2013-US(14)-HearingNotice-(HearingDate-01-09-2023).pdf | 2023-07-28 |
| 4 | SPEC IN.pdf | 2018-08-11 |
| 5 | PD008820IN-SC_Request for Priority Documents.pdf | 2018-08-11 |
| 6 | 1526-MUM-2013-CLAIMS [10-10-2019(online)].pdf | 2019-10-10 |
| 7 | FORM 5.pdf | 2018-08-11 |
| 8 | 1526-MUM-2013-COMPLETE SPECIFICATION [10-10-2019(online)].pdf | 2019-10-10 |
| 9 | FORM 3.pdf | 2018-08-11 |
| 10 | 1526-MUM-2013-FER_SER_REPLY [10-10-2019(online)].pdf | 2019-10-10 |
| 11 | FIGURES IN.pdf | 2018-08-11 |
| 12 | 1526-MUM-2013-FORM 3 [10-10-2019(online)].pdf | 2019-10-10 |
| 13 | ABSTRACT1.jpg | 2018-08-11 |
| 14 | 1526-MUM-2013-OTHERS [10-10-2019(online)].pdf | 2019-10-10 |
| 15 | 1526-MUM-2013-FORM 26(26-6-2013).pdf | 2018-08-11 |
| 16 | 1526-MUM-2013-FER.pdf | 2019-04-12 |
| 17 | 1526-MUM-2013-CORRESPONDENCE(26-6-2013).pdf | 2018-08-11 |
| 18 | 1526-MUM-2013-FORM 18.pdf | 2018-08-11 |
| 19 | 1526-MUM-2013-CORRESPONDENCE(3-5-2013).pdf | 2018-08-11 |
| 20 | 1526-MUM-2013-FORM 1(3-5-2013).pdf | 2018-08-11 |
| 21 | 1526_10-04-2019.pdf | |