
Sensor Modeling Techniques Evaluation Method And System

Abstract: Disclosed is a method and system for evaluating a sensor modeling software capable of modeling one or more sensors employed in a cyber-physical environment. An empirical expression module is enabled to configure a first empirical expression, a second empirical expression, and a third empirical expression. An assigning module is configured to assign weights for the significance of the features present in the first empirical expression, the second empirical expression, and the third empirical expression. The weights may be assigned based upon weighing parameters predefined by the user or system and the weighting parameters may vary based on the domain-specific implementation of the sensor modeling software. A computing module is configured to compute a coverage score, an availability score, and a programming design score. An evaluation module is configured to evaluate the sensor modeling software based upon the coverage score, the availability score, and the programming design score.


Patent Information

Application #
Filing Date
25 January 2013
Publication Number
01/2015
Publication Type
INA
Invention Field
COMMUNICATION
Status
Email
ip@legasis.in
Parent Application

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. DASGUPTA, Ranjan
Tata Consultancy Services Ltd, Bengal Intelligent Park, Building - D Plot No. - A2 M2 & N2 Block -EP, Salt Lake Electronics Complex, Sector -V , Kolkata - 700091, West Bengal, India
2. CHATTOPADHYAY, Dhiman
Tata Consultancy Services Ltd, Bengal Intelligent Park, Building - D Plot No. - A2 M2 & N2 Block -EP, Salt Lake Electronics Complex, Sector -V , Kolkata - 700091, West Bengal, India
3. PAL, Arpan
Tata Consultancy Services Ltd, Bengal Intelligent Park, Building - D Plot No. - A2 M2 & N2 Block -EP, Salt Lake Electronics Complex, Sector -V , Kolkata - 700091, West Bengal, India

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention: SENSOR MODELING SOFTWARE EVALUATION METHOD AND SYSTEM
APPLICANT:
Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point,
Mumbai 400021, Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.

CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application claims priority to Indian Provisional Patent
Application No. 230/MUM/2013, filed on January 25th, 2013, the entirety of which is hereby incorporated by reference.
TECHNICAL FIELD
[002] The present subject matter described herein, in general, relates to system and
method for evaluating a sensor modeling software.
BACKGROUND
[003] Amongst the known Cyber Physical Systems (CPS) that are integrations of
computation and physical processes, the sensors and sensor devices with diverse capabilities and complexities form the major physical constituting elements. Modeling of the Cyber Physical Systems (CPS) requires ready sensor models that are derived from available sensor modeling techniques/sensor modeling software. However, each of these sensor modeling techniques is known to be accompanied by its own set of befitting and limiting features.
[004] While such multiple sensor modeling techniques are known in the art to obtain
requisite sensor models, there remains a conspicuous absence of an evaluation methodology and a system to select the modeling technique that best suits a particular domain specific application in terms of its applicability and usability. Neither is there any solution to quantitatively measure them before selecting any particular sensor modeling software or sensor modeling technique to model sensors and the CPS in its entirety. Thus, it may be a major challenge for anyone to rightly use any existing sensor modeling techniques or sensor model description methods for the right application.
SUMMARY
[005] This summary is provided to introduce aspects related to systems and methods
for evaluating a sensor modeling software capable of modeling one or more sensors employed in a cyber-physical environment and the aspects are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject

matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a system for evaluating a sensor modeling software
capable of modeling one or more sensors employed in a cyber-physical environment is disclosed. The system comprises a processor and a memory coupled to the processor for executing a plurality of modules present in the memory. The plurality of modules comprises an empirical expression module, an assigning module, a computing module, an evaluation module, and a normalization module. The empirical expression module may be enabled to configure, for the sensor modeling software, a first empirical expression representing a coverage metric, a second empirical expression representing an availability metric, and a third empirical expression representing a programming design metric. Further, each of the first empirical expression, the second empirical expression, and the third empirical expression may comprise a set of terms indicating features corresponding to the sensor modeling software and significance of the features. Further, the assigning module may be configured to assign weights for the significance of the features present in the first empirical expression, the second empirical expression, and the third empirical expression. The weights may be assigned based on a domain-specific implementation of the sensor modeling software. Further, the computing module may be configured to compute a coverage score, an availability score, and a programming design score corresponding to the coverage metric, the availability metric, and the programming design metric respectively, based on the weights assigned. Further, the evaluation module may be configured to evaluate the sensor modeling software based upon the coverage score, the availability score, and the programming design score. Further, the normalization module may be configured to normalize the coverage score, the availability score, and the programming design score by performing decimal scaling,
[007] In another implementation, a method for evaluating a sensor modeling software
capable of modeling one or more sensors employed in a cyber-physical environment is disclosed. The method comprises configuring, for the sensor modeling software, a first empirical expression representing a coverage metric, a second empirical expression representing an availability metric, and a third empirical expression representing a programming design metric. Further, each of the first empirical expression, the second

empirical expression, and the third empirical expression may comprise a set of terms indicating features corresponding to the sensor modeling software and significance of the features. Further, the method may comprise assigning weights for the significance of the features present in the first empirical expression, the second empirical expression, and the third empirical expression. The weights assigned may be based on a domain-specific implementation of the sensor modeling software. The method may further comprise computing a coverage score, an availability score, and a programming design score corresponding to the coverage metric, the availability metric, and the programming design metric respectively, based on the weights assigned. Upon computing, the method may further comprise evaluating the sensor modeling software based upon the coverage score, the availability score, and the programming design score.
[008] Yet in another implementation, computer program product having embodied
thereon a computer program for evaluating a sensor modeling software capable of modeling one or more sensors employed in a cyber-physical environment is disclosed. The computer program product comprises a set of instructions, the instructions comprising instructions for configuring, for the sensor modeling software, a first empirical expression representing a coverage metric, a second empirical expression representing an availability metric, and a third empirical expression representing a programming design metric. Further, each of the first empirical expression, the second empirical expression, and the third empirical expression comprise a set of terms indicating features corresponding to the sensor modeling software and significance of the features. Further, a set of instructions may be provided for assigning weights for the significance of the features present in the first empirical expression, the second empirical expression, and the third empirical expression. The weights may be assigned based on a domain-specific implementation of the sensor modeling software. Further, a set of instructions may be provided for computing a coverage score, an availability score, and a programming design score corresponding to the coverage metric, the availability metric, and the programming design metric respectively, based on the weights assigned. Further, a set of instructions may be provided for evaluating the sensor modeling software based upon the coverage score, the availability score, and the programming design score.

BRIEF DESCRIPTION OF THE DRAWINGS
[009] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0010] Figure 1 illustrates a network implementation of a system for evaluating a
sensor modeling software, in accordance with an embodiment of the present subject matter.
[0011] Figure 2 illustrates the system, in accordance with an embodiment of the
present subject matter.
[0012] Figure 3 illustrates a detailed working of the system, in accordance with an
embodiment of the present subject matter.
[0013] Figure 4 illustrates a method for evaluating a sensor modeling software, in
accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0014] Systems and methods for evaluating a sensor modeling software are described.
The sensor modeling software may be capable of modeling sensors employed in a cyber-physical environment. In such an environment, there may be 'n' number of sensor modeling software or modeling techniques available which may be required to be evaluated from the purview of applicability and usefulness of the modeling techniques in a particular domain specific application. To evaluate the sensor modeling software, the present subject matter discloses an effective and efficient mechanism for defining a plurality of evaluation metrics in such a manner that with each evaluation metric defined, an empirical expression may be associated which may result in a numerical value, thereby providing a quantitative measurement of the sensor modeling software.
[0015] At each stage of the configuration of the empirical expressions, all possible
features of interest corresponding to the sensor modeling software may be considered. The features of interest and their significance/importance are represented by a set of terms of the

empirical expressions. Further, the set of terms may either have a binary attribute or a numeric attribute. The binary attribute signifies "Yes/No" consideration of a feature of interest and the numeric attribute may signify the importance or significance of the feature of interest in the sensor modeling software. Since the features of interest and the significance of the features of interest are represented by the set of terms in each of the empirical expressions, weights may be assigned to the significance of the features of interest. The weights may be assigned based on a domain-specific implementation of the sensor modeling software. According to some embodiments, users may assign the weights according to their choice, which gives the users flexibility to prioritize their focus on faster implementation, better design or broader coverage for the selected domain-specific implementation.
[0016] Upon assigning the weights, each of the empirical expressions results in
the generation of scores based on the weights assigned. Thus, on the basis of the scores generated, the sensor modeling software may be evaluated and the user may be recommended the most suitable sensor modeling software from the purview of applicability and usefulness of the said sensor modeling software. Further, the scores generated may be normalized by performing decimal scaling of each of the scores generated, in such a manner that the normalized scores are less than or equal to unity.
[0017] While aspects of described system and method for evaluating a sensor
modeling software capable of modeling one or more sensors employed in a cyber-physical environment may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0018] Referring now to Figure 1, a network implementation 100 of a system 102 for
evaluating a sensor modeling software is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 provides an evaluation of sensor modeling software capable of modeling one or more sensors employed in a cyber-physical environment. For evaluating the sensor modeling software, plurality of evaluation metrics such as coverage metric, an availability metric, and a programming design metric may be defined. It is to be understood that the calculation of the plurality of metrics comprising the

coverage metric, the availability metric, and the programming design metric may be implemented in any sequence, and is not limited to any specific sequence. For example, the present disclosure may calculate the coverage metric, followed by the availability metric, and then the programming design metric. Similarly, in another example, the present disclosure may first calculate the programming design metric, followed by the availability metric, and then the coverage metric. In one embodiment, the system 102 configures a first empirical expression representing the coverage metric, a second empirical expression representing the availability metric, and a third empirical expression representing the programming design metric. Further, each of the empirical expressions configured may comprise a set of terms indicating features corresponding to the sensor modeling software and significance of the features. After configuring the empirical expressions, the system 102 may assign weights for the significance of the features present in the first empirical expression, the second empirical expression, and the third empirical expression. The weights may be assigned based on a domain-specific implementation of the sensor modeling software. In one embodiment, the system 102 may assign the weights based upon predefined weighing parameters comprising significance of different layers in the sensor modeling, licensing terms, features provided by the programming languages, and the like. The weighting parameters may vary based on the domain-specific implementation of the sensor modeling software. The weighing parameters may be predefined by the system 102 or by the user. The user is provided the flexibility to predefine the weighing parameters based upon requirements of the domain-specific implementation of the sensor modeling software. Further, the system 102 may be configured to compute a coverage score, an availability score, and a programming design score corresponding to the coverage metric, the availability metric, and the programming design metric respectively. Further, the system 102 may evaluate the sensor model software on basis of the scores computed i.e., the coverage score, the availability score, and the programming design score.
[0019] Although the present subject matter is explained considering that the system
102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a mathematical tool, a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a

network server, and the like. In one implementation, the system 102 may be implemented in a cloud-based environment. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2...104-N, collectively referred to as user devices 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106.
[0020] In one implementation, the network 106 may be a wireless network, a wired
network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), Constrained Application Protocol (CoAP), Real-time Transport Protocol (RTP), Real Time Streaming Protocol (RTSP), Hypertext Transfer Protocol Secure (HTTPS), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0021] Referring now to Figure 2, the system 102 is illustrated in accordance with an
embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[0022] The I/O interface 204 may include a variety of software and hardware
interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with a user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with

other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0023] The memory 206 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 222.
[0024] The modules 208 include routines, programs, objects, components, data
structures, etc., which perform particular tasks, functions or implement particular abstract data types. In one implementation, the modules 208 may include an empirical expression module 210, an assigning module 212, a computing module 214, an evaluation module 216, normalization module 218, and other modules 220. The other modules 220 may include programs or coded instructions that supplement applications and functions of the system 102.
[0025] The data 222, amongst other things, serves as a repository for storing data
processed, received, and generated by one or more of the modules 208. The data 222 may also include a score database 224, a weight database 226, and other data 228. The other data 228 may include data generated as a result of the execution of one or more modules in the other modules 220.
[0026] In one implementation, at first, a user may use the client device 104 to access
the system 102 via the I/O interface 204. The user may register themselves using the I/O interface 204 in order to use the system 102. The working of the system 102 is explained in detail with reference to Figure 3 below. The system 102 may be used for evaluating a sensor modeling software capable of modeling one or more sensors employed in a cyber-physical environment. In order to evaluate the sensor modeling software, the system, at first, configures evaluation metrics in a form of empirical expressions. Specifically, in the present

implementation, the empirical expressions are configured by the empirical expression module 210.
[0027] Referring to Figure 3, a detailed working of the system 102 is illustrated, in
accordance with an embodiment of the present subject matter. There may be 'n' number of sensor modeling software 302 also known as "software modeling technique" available for modeling 'n' number of sensors in a cyber-physical environment. Among the 'n' number of sensor modeling software, the system 102 may be provided for evaluating most suitable sensor modeling software based on its applicability and usefulness for a particular domain-specific implementation. For evaluating the sensor modeling software, the present subject matter proposes three evaluation metrics i.e., coverage metric, an availability metric, and a programming design metric. According to some embodiments, empirical expressions may be configured for each of the evaluation metric proposed in the present subject matter. In one implementation, an empirical expression module 210 may be enabled to configure a first empirical expression representing the coverage metric, a second empirical expression representing the availability metric, and a third empirical expression representing the programming design metric. Each of the empirical expressions configured may comprise a set of terms indicating features (feature of interest) corresponding to the sensor modeling software and significance of the features.
[0028] Further, the set of terms may either have a binary attribute or a numeric
attribute. While evaluating the sensor modeling software, the binary attribute signifies "Yes/No" consideration of the features and the numeric attribute indicates the significance (importance) of the features by assigning weights.
[0029] According to embodiments of present subject matter, an assigning module 212
may be configured to assign weights corresponding to the significance of features present in each of the empirical expressions i.e., the first empirical expression, the second empirical expression, and the third empirical expression. The weights may be assigned based on a domain-specific implementation of the sensor modeling software, and thus, the weights may vary from one domain-specific implementation to another. Further, the weights assigned may be predefined and may be pre-stored in a weight database 226. In one implementation, the weights assigned may be based upon predefined weighing parameters comprising significance

of different layers in the sensor modeling, licensing terms, features provided by the programming languages, and the like. The weighting parameters may vary based on the domain-specific implementation of the sensor modeling software. In one embodiment, the weighing parameters may be predefined by the system 102 or by the user. The user is provided the flexibility to predefine the weighing parameters based upon requirements of the domain-specific implementation of the sensor modeling software. Accordingly, the weights assigned and stored in the weight database 226 may be configured and/or predefined by the user. As shown in figure 3, the user, via the Interface 204 may be enabled to predefine the weights stored in the weight database 226. The weights are predefined by the user based upon the said weighing parameters and the requirements of the domain-specific implementation of the sensor modeling software.
[0030] After assigning the weights, each of the empirical expressions may result in a numerical value which may be considered as a score corresponding to each of the evaluation metrics. Thus, a computing module 214 may be configured to compute the scores, such as a coverage score, an availability score, and a programming design score corresponding to the coverage metric, the availability metric, and the programming design metric respectively. Further, the scores computed may be stored in a score database 224 of the system 102. In the following paragraphs, the computation of each of the scores is explained in detail.
[0031] In one implementation, the computing module 214 may be configured to
compute the coverage score by using the following first empirical expression, as shown in equation 1:

[0032] From the equation 1, it may be seen that Bji and Wji are the set of terms associated with the first empirical expression. The CSji indicates the coverage score of the ith feature assigned for a use case requirement and the jth sensor modeling software. Further, the Bji (a term of the first empirical expression) indicates a quantitative value corresponding to the ith feature supported by the jth sensor modeling software based on binary attributes associated with the sensor modeling software, and the binary attributes may be "Yes" or "No". Thus, the quantitative value for the term Bji is 1 or 0 when the feature is either supported (binary attribute is "Yes") or not supported (binary attribute is "No"), respectively, by the jth sensor modeling software. Further, the Wji (a term of the first empirical expression) indicates a weight of the ith feature assigned under a context of domain-specific implementation of the jth sensor modeling software. Thus, the weight assigned further indicates a significance of the ith feature in the jth sensor modeling software. Based on the weights assigned, the coverage score is computed, which may further provide a complete scope of sensor description by considering a seven-layer sensor modeling approach. According to embodiments of the present subject matter, the seven-layer sensor modeling approach comprises the following seven layers:
Layer 1: Operating environment description,
Layer 2: Physical description,
Layer 3: Units and blocks description,
Layer 4: Signals and protocol description,
Layer 5: Measurements and processing description,
Layer 6: Networking protocol and configuration description, and
Layer 7: Interoperable data services description.
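Equation 1 itself is not reproduced in this text, but the definitions of Bji and Wji above suggest a weighted-sum structure over the seven layers. The following Python sketch is a minimal, hypothetical illustration of such a computation; the function name, the normalization by the total of the assigned layer weights, and the example inputs are assumptions rather than part of the specification.

```python
# Hypothetical sketch of a coverage-score computation. Equation 1 is not
# reproduced in the text; a weighted sum over the seven layers, normalized by
# the total of the assigned layer weights, is assumed here.

SEVEN_LAYERS = [
    "operating_environment",    # Layer 1
    "physical",                 # Layer 2
    "units_and_blocks",         # Layer 3
    "signals_and_protocol",     # Layer 4
    "measurements_processing",  # Layer 5
    "networking_protocol",      # Layer 6
    "interoperable_services",   # Layer 7
]

def coverage_score(supported, weights):
    """supported[layer] is the binary attribute Bji (True/False) of a feature;
    weights[layer] is the domain-specific weight Wji assigned to that feature."""
    numerator = sum(weights[layer] for layer in SEVEN_LAYERS if supported.get(layer, False))
    denominator = sum(weights[layer] for layer in SEVEN_LAYERS)  # assumed normalization
    return numerator / denominator if denominator else 0.0

# Example with assumed weights and support flags (not the values used later in
# the smart-metering use case):
example_weights = {layer: 1 for layer in SEVEN_LAYERS}
example_support = {"signals_and_protocol": True, "measurements_processing": True}
print(round(coverage_score(example_support, example_weights), 2))
```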
[0033] After computing the coverage score, the computing module 214 may be
configured to compute the availability score by using the following second empirical expression, as shown in equation 2:

[0034] From the equation 2, it may be seen that Oji and Wji are the set of terms associated with the second empirical expression. The ASji indicates the average score of the ith feature of the jth sensor modeling software based on a first set of parameters. Here, the N1 indicates the total number of the ith feature according to the second empirical expression. In one implementation, the ith feature may comprise a sensor model description format, software support, software tools, and usage interface. Further, the Oji (a term of the second empirical expression) indicates a quantitative value corresponding to software license availability for the first set of parameters for the jth sensor modeling software based on binary attributes associated with the sensor modeling software, and the binary attributes may be "Yes" or "No". Thus, the quantitative value for the term Oji is 1 or 0 when the software license of the first set of parameters is either available (binary attribute is "Yes") or not available (binary attribute is "No"), respectively, for the jth sensor modeling software. According to some embodiments, the first set of parameters may comprise a programming language, software tools and technical support corresponding to the jth sensor modeling software. Further, the Wji indicates a weight of an ith type of the software license available based on weights assigned for a predefined range of software licenses. Further, the predefined range of software licenses comprises a non-free software license, free closed-source license, paid closed-source license, paid viewable-source license, Free Software Foundation approved license, Open Source Initiative approved license, General Public License (GPL 2)-compatible strong copyleft license, GPL 2-compatible weak copyleft license, and GPL 2-incompatible and liberal copyleft license.
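Equation 2 is likewise not reproduced in this text. Assuming the availability score aggregates the binary license-availability terms Oji weighted by the license-type weights Wji over the N1 features (description format, software support, tools, usage interface), a sketch could look as follows; the partial license-weight mapping, the normalization constant, and the example inputs are assumptions.

```python
# Hypothetical availability-score sketch. Equation 2 is not reproduced in the
# text; the aggregation and the normalization constant below are assumptions.

LICENSE_WEIGHTS = {  # assumed partial mapping onto the predefined license range
    "non_free": 1,
    "free_closed_source": 2,
    "osi_approved": 4,
    "gpl2_compatible_copyleft": 5,
}

def availability_score(features):
    """features: list of (available, license_type) pairs, one per feature,
    e.g. description format, software support, software tools, usage interface."""
    total = sum(LICENSE_WEIGHTS[lic] for available, lic in features if available)
    # Assumed scaling: the best case is every feature available under the
    # highest-weighted license type.
    max_total = len(features) * max(LICENSE_WEIGHTS.values())
    return total / max_total if max_total else 0.0

# Example with assumed inputs for the N1 = 4 features:
print(round(availability_score([
    (True, "osi_approved"),              # sensor model description format
    (True, "gpl2_compatible_copyleft"),  # software support
    (False, "non_free"),                 # software tools
    (True, "free_closed_source"),        # usage interface
]), 2))
```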
[0035] After computing the availability score, the computing module 214 may be
further configured to compute the programming design score by using the following third empirical expression, as shown in equation 3:

[0036] From the equation 3, it may be seen that Oi, Woi, Mj, Wmj, Rj, Wrj, Wjs, G, Wg, FP, Wfp, IP and Wip are the set of terms associated with the third empirical expression. The PDSj indicates the programming design score of the jth sensor modeling software. Here, the N1 indicates a total number of object oriented features supported by the sensor modeling software. Further, the Oi (a term of the third empirical expression) indicates a quantitative value corresponding to evaluation of the ith feature against the object oriented features based on binary attributes associated with the sensor modeling software, and the binary attributes may be "Yes" or "No". Thus, the quantitative value for the term Oi is 1 or 0 when the feature is successfully (binary attribute is "Yes") or unsuccessfully (binary attribute is "No"), respectively, evaluated against the object oriented features. Further, the Woi indicates a weight of the ith type of object oriented feature based on weights assigned for predefined object oriented features. According to embodiments of the present subject matter, the weights (Woi) assigned for the predefined object oriented features may be as follows:
Wo1 = 1: encapsulation and access control,
Wo2 = 2: inheritance,
Wo3 = 3: polymorphism, function and operator overriding,
Wo4 = 4: generic class and method, and
Wo5 = 5: aspect oriented and model driven architecture.
[0037] Further, the Mj (a term of the third empirical expression) indicates a quantitative value corresponding to evaluation of the modeling standard of the jth sensor modeling software against a modular software stack design based on binary attributes associated with the sensor modeling software, and the binary attributes may be "Yes" or "No". Thus, the quantitative value for the term Mj is 1 or 0 when the modeling standard of the jth sensor modeling software is successfully (binary attribute is "Yes") or unsuccessfully (binary attribute is "No"), respectively, evaluated against the modular software stack design. Further, the Wmj (a term of the third empirical expression) indicates a weight of the jth type of modeling standard based on weights assigned for predefined modeling standards of the modular software stack design. According to embodiments of the present subject matter, the Wmj assigned for the predefined modeling standards may be as follows:
Wmj = 1: monolithic stack,
Wmj = 2: modular and multiple layer design with tight coupling, and
Wmj = 3: modular and multiple layer design with loose coupling.
[0038] Further, the Rj (a term of the third empirical expression) indicates a quantitative value corresponding to evaluation of the modeling standard format of the jth sensor modeling software against a predefined format based on binary attributes associated with the sensor modeling software, and the binary attribute may be "Yes" or "No". Thus, the quantitative value for the term Rj is considered as 1 or 0 when the modeling standard format of the jth sensor modeling software is successfully (binary attribute is "Yes") or unsuccessfully (binary attribute is "No"), respectively, evaluated against the predefined format. Further, the Wrj (a term of the third empirical expression) indicates a weight of the jth type of modeling standard format assigned based on the predefined format. According to embodiments of the present subject matter, the weights Wrj assigned for the predefined formats are as follows:
Wrj = 1: Proprietary format,
Wrj = 2: Extensible Markup Language (XML),
Wrj = 3: JavaScript Object Notation (JSON), and
Wrj = 4: Web Ontology Language (OWL).
[0039] Further, the next term in the third empirical expression is Wjs, indicating a weighted score of the jth sensor modeling software implementation based on a plurality of software metrics. The Wjs may be computed to determine the maintainability, eloquence and optimization level of an actual Cyber-Physical System (CPS) sensor data acquisition implementation using a selected sensor modeling software. Further, the plurality of software metrics comprises a programming language metric, an average function point per Kilo Line of Code (KLOC) metric, and an independent control flow path metric. Further, to compute the weighted score Wjs, the following equation may be used:

Wjs = (Wg * G + Wfp * FP + Wip * IP) / (G + FP + IP) ... (Equation 4)

[0040] From the equation 4, it may be seen that G and Wg indicate the programming language metric and the weighted score of the generation of the programming language metric, respectively. According to some embodiments of the present subject matter, if there are N generations of the programming language metric, then the weighted score for the ith generation of the programming language may be defined as a function of i and N as Wg = f(i/N).
[0041] Further, FP and Wfp indicate the average function point per Kilo Line of Code (KLOC) metric and the weighted score of the average function point per KLOC metric, respectively. The function point per Kilo Line of Code metric determines the degree of optimized implementation. According to one embodiment, if x = FP/KLOC, then efficiency increases when x > 6 and saturates when x >= 20. Further, the Wfp may be defined as a third order polynomial of x, as shown in one embodiment of implementation as Wfp = -0.002x³ + 0.0054x² + 0.0009x + 0.2219.
[0042] Further, IP and Wip indicate the number of independent control flow paths metric and the weighted score of the number of independent control flow paths metric, respectively. According to some embodiments, the complexity may increase when the number of independent control flow paths increases. Further, according to one embodiment, if x denotes the number of independent control flow paths, then x is a measure of complexity and may be a parameter to define the weighted score for complexity. If x <= 5, the program code is easily maintainable and the complexity is very low. On the other hand, if x > 20, the complexity is high and the program code is assumed to be hard to maintain. Now, the weighted score Wip for the independent control flow path may be defined as a step function of the form

Wip = a1·χA1(x) + a2·χA2(x) + … + an·χAn(x), n > 0,

where ai is a real number, Ai is a range (interval), and χAi is the indicator function of Ai. An indicator function denotes membership of an element in a subset, having the value 1 for all elements of Ai and the value 0 for all elements not in Ai. According to one implementation, if three ranges Ai are defined as 0 to 5, 6 to 20, and 21 to ∞, then Wip may be defined as:

Wip = 0.33·χ[21, ∞) + 0.66·χ[6, 20] + 1·χ[0, 5]
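Equation 4, together with the Wfp polynomial and the Wip step function given above, can be sketched fairly directly. The cubic coefficients and the interval boundaries are taken from the text; the concrete form Wg = i/N for f(i/N) and the example metric values are assumptions.

```python
# Sketch of the weighted software-metric score Wjs from equation 4, using the
# Wfp polynomial and the Wip step function stated above. The form Wg = i/N for
# f(i/N) and the example metric values are assumptions.

def w_g(generation, n_generations):
    return generation / n_generations  # assumed concrete form of f(i/N)

def w_fp(x):
    # Third-order polynomial of x = FP/KLOC, coefficients as given in the text.
    return -0.002 * x**3 + 0.0054 * x**2 + 0.0009 * x + 0.2219

def w_ip(x):
    # Step function over the independent-control-flow-path count x.
    if x <= 5:
        return 1.0
    if x <= 20:
        return 0.66
    return 0.33

def weighted_software_score(g, fp, ip, generation, n_generations):
    # Wjs = (Wg*G + Wfp*FP + Wip*IP) / (G + FP + IP)
    numerator = w_g(generation, n_generations) * g + w_fp(fp) * fp + w_ip(ip) * ip
    return numerator / (g + fp + ip)

# Example with assumed metric values:
print(round(weighted_software_score(g=4, fp=5, ip=12, generation=3, n_generations=4), 3))
```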
[0043] Here, the denominator (of the equation 3) contains summations of all possible
values of Wok, Wmk, Wrk, Wfk, where:
P1: range of object oriented weightages [1,2,3,4,5]
P2: range of modular architecture weightages [1, 2, 3]
P3: range of representation weightages [1, 2, 3, 4]
P4: range of language feature weightages [1, 2, 3, 4, 5].
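Equation 3 is not reproduced in this text, but paragraphs [0036] to [0043] identify its terms: the object oriented products Oi·Woi, the modular-design product Mj·Wmj, the format product Rj·Wrj, the weighted software-metric score Wjs, and a denominator summing all possible values of the weight ranges P1 to P4. The sketch below assembles those pieces as a simple ratio; the exact way Wjs enters the numerator, and the example inputs, are assumptions.

```python
# Hypothetical assembly of the programming design score PDSj. Equation 3 is not
# reproduced in the text; the numerator below follows paragraphs [0036]-[0042]
# and the denominator sums the weight ranges P1-P4 per paragraph [0043]. The
# exact way Wjs enters the numerator is an assumption.

P1 = [1, 2, 3, 4, 5]  # object oriented weightages (Wo)
P2 = [1, 2, 3]        # modular architecture weightages (Wm)
P3 = [1, 2, 3, 4]     # representation weightages (Wr)
P4 = [1, 2, 3, 4, 5]  # language feature weightages

def programming_design_score(oo_terms, m, w_m, r, w_r, w_js):
    """oo_terms: list of (Oi, Woi) pairs for the object oriented features;
    m, r: binary attributes Mj, Rj; w_m, w_r: their weights Wmj, Wrj;
    w_js: the weighted software-metric score from equation 4."""
    numerator = sum(o * w for o, w in oo_terms) + m * w_m + r * w_r + w_js
    denominator = sum(P1) + sum(P2) + sum(P3) + sum(P4)  # assumed form
    return numerator / denominator

# Example with assumed inputs (encapsulation ... model driven architecture):
oo_terms = [(1, 1), (1, 2), (0, 3), (1, 4), (0, 5)]
print(round(programming_design_score(oo_terms, m=1, w_m=3, r=1, w_r=2, w_js=0.5), 3))
```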
[0044] Thus, on the basis of the weights assigned for each of the terms in the third empirical expression, the programming design score is computed. Upon computing the coverage score (CSji) from the equation 1, the availability score (ASji) from the equation 2, and the programming design score (PDSj) from the equation 3, a normalization module 218 may be configured to normalize the coverage score, the availability score, and the programming design score prior to evaluating the sensor modeling software, for generating a normalized coverage score, a normalized availability score, and a normalized programming design score. Further, the normalization of the scores may be performed by decimal scaling the coverage score, the availability score, and the programming design score in such a manner that each of the normalized coverage score, the normalized availability score, and the normalized programming design score is less than or equal to unity.
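Decimal scaling is commonly performed by dividing a value by a power of ten chosen so that the result does not exceed one. The sketch below follows that common reading; the specification only states that the normalized scores are less than or equal to unity, so the exponent choice here is an assumption.

```python
# Decimal-scaling normalization: divide a score by a power of ten so that the
# result is at most 1. The exponent choice is the usual textbook reading and is
# an assumption; the text only requires normalized scores <= unity.
import math

def decimal_scale(score):
    if score == 0:
        return 0.0
    k = math.ceil(math.log10(abs(score)))  # smallest k with |score| / 10**k <= 1
    return score / (10 ** max(k, 0))

print(decimal_scale(7.3), decimal_scale(0.91), decimal_scale(52.0))
```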
[0045] Thus, after computing and normalizing the coverage score, the availability
score, and the programming design score, an evaluation module 216 may be configured to evaluate the sensor modeling software. The user may then rank and choose the most suitable sensor modeling software for modeling the sensors for a particular domain-specific application.
[0046] In one embodiment, consider a use case of smart metering, a green technological initiative that is highly dependent on energy data received by means of short haul communication between a meter node and a data collector. A plethora of protocols and their proprietary ways of data representation can be considered for their aptness in short haul communication in smart metering. According to embodiments, an energy meter may be defined as a sensor using the seven layer stack, primarily to separate the use case dependent data model from the technology dependent meter communication protocol. While the lower layers of the stack provide maximum flexibility to handle various energy data representative of electricity, water, gas and heat, the middle layers support a range of existing and upcoming communication media to create a communication independent interface. Finally, the top layer describes a data model that specifies how various use cases, e.g. demand response, multiple tariff, programmable tariff, historic storage, and billing schedules, can be supported and how the model can be further extended to support new use cases.
[0047] Further, the seven layer stack structure also supports interoperability among all
these energy metering services (sensor device services) and other enterprise services inside a standard Service Oriented Architectural (SOA) framework. Some existing technologies have

attempted specifying only the communication media dependent lower layers whereas some are self-contained specifying both the lower layers and data models. However, these are usually conflicting with each other and none of them exactly addresses the issue of inconsistency on the device level. Therefore it may be an utmost requirement to prepare a comprehensive data semantic independent energy meter data model that can be used over a wide range of communication media and allows interoperability among sensor data semantics, sensor data representation, sensor data collection, sensor data exchange, on the end device level. Accordingly in order to achieve the above requirement of creating a comprehensive smart energy meter data model, the most suitable sensor modeling technique is selected based on its applicability and usefulness by utilizing the present system and the method.
[0048] In one exemplary embodiment, the weightages are first assigned, ranging from 1 to 5, across all seven layers. Considering a primary objective of separating out the meter data model from its communication protocol as the main rationale behind the choice of weights Wi for each layer, W7 is assigned a weight of 5, the maximum weightage, to the interoperable data services that manage diverse meter data semantics and create the use case dependent meter data model; W6 = W4 is assigned a weight of 4, the second highest weightage, to the event and network configuration that provides a meter communication independent interface; W3 is assigned a weightage of 3, being a functional layer, while all other layers are assigned unity weightage. Obviously, such weightage assignment will differ from one use case to another.
[0049] Further, the first empirical expression (i.e., equation 1) may be used to
theoretically calculate the normalized coverage score (CS) of all available sensor modeling software or software modeling techniques, as presented in Table 1 below. It is observed that both IEEE 1451 and Device Kit are equally useful for the present use case.

Table 1: Normalized coverage score (CS) of the available sensor modeling techniques

Seven Layer Stack Feature Description | IEEE 1451 | DDL  | Device Kit | Sensor ML | Echonet
Operating Environment                 | N         | N    | N          | N         | N
Physical                              | N         | Y    | N          | N         | N
Pins & Ports                          | N         | Y    | N          | N         | N
Signals & Protocols                   | Y         | Y    | Y          | N         | N
Measurement & Process                 | Y         | Y    | Y          | Y         | Y
Networking Protocol & Configuration   | Y         | N    | Y          | N         | N
Interoperable data services           | N         | N    | N          | N         | N
Normalized CS value                   | 0.52      | 0.43 | 0.52       | 0.14      | 0.14
[0050] Any sensor modeling software uses a diverse set of language constructs and software tools to create a software specification of a physical sensor. One technique represents sensor properties and interfaces in plain English, whereas another uses an Interface Definition Language (IDL) and XML schemas together with verbal descriptions explaining the semantics. In addition, some of the modeling techniques also provide an open source language processor and a datasheet compiler as supporting tools to create sensor services.
[0051] In the current context of smart energy meter data modeling, Table 2 below presents the different sensor modeling languages, sensor modeling tools and various software supported by the different sensor modeling techniques. As per the definition of the "sensor modeling language, tools availability and support metric", each cell item in Table 2 is assigned a weightage, given in parentheses, according to the license type it has acquired, while the use case invariant normalized availability score (AS) has been calculated using the second empirical expression, i.e., equation 2 (for computing AS). It is observed that Sensor ML is the most useful for the present use case.

Table 2: Sensor modeling language, tools availability and support across the standards, with normalized availability score (AS)

Features            | IEEE 1451              | DDL                        | Device Kit                                               | Sensor ML             | Echonet
Format              | IDL XML (4)            | XML (5)                    | XML (5)                                                  | XML (5)               | Class in plain text (1)
Software support    | Open 1451 (4)          | ATLAS bundle creator (1)   | DKML parser & Eclipse plug-in (4)                        | 52North SWE stack (5) | NA
Tools               | TEDS reader/writer (4) | DDL Language Processor (5) | Device Kit wizard / Service Activator Toolkit (SAT) (1)  | SensorML webtool (5)  | NA
Usage interface     | Webservice (STWS) (4)  | OSGi bundle webservice (1) | OSGi bundle, pub sub service (1)                         | SWE webservices (5)   | Java object (1)
Normalized AS value | 0.73                   | 0.55                       | 0.5                                                      | 0.91                  | 0.1
[0052] Further, it may be evident from Table 1 that IEEE 1451 and Device Kit have emerged as the two top contenders for the discussed use case, with the coverage score as a principal metric of comparison. However, empirical analysis of the other metric appearing in Table 2 signifies that Sensor ML is ahead of the IEEE 1451 method. A final score has been calculated by adding the normalized scores, and a score comparison is made for all available sensor modeling techniques based on the above metrics, which evaluates that IEEE 1451 is the right choice regarding its applicability, suitability and usefulness for the particular domain specific application.
[0053] Exemplary embodiments discussed above may provide certain advantages.
Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[0054] Some embodiments of present subject matter enable the system 102 and the
method for evaluating the sensor modeling software from the purview of its applicability and usefulness in particular domain specific applications.
[0055] Some embodiments of present subject matter enable the system 102 and the
method for quantitatively measuring the sensor modeling software in order to facilitate a user to select most suitable sensor modeling software based on use case requirement.

[0056] Referring now to Figure 4, a method 400 for evaluating a sensor modeling software capable of modeling one or more sensors is shown, in accordance with an embodiment of the present subject matter. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 400 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0057] The order in which the method 400 is described is not intended to be construed
as a limitation, and any number of the described method blocks can be combined in any order to implement the method 400 or alternate methods. Additionally, individual blocks may be deleted from the method 400 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 400 may be considered to be implemented in the above described system 102.
[0058] At block 402, a first empirical expression, a second empirical expression, and a
third empirical expression may be configured for the sensor modeling software. Further, the first empirical expression may represent coverage metric, a second empirical expression may represent an availability metric, and a third empirical expression may represent a programming design metric. Further, each of the first empirical expression, the second empirical expression, and the third empirical expression comprise a set of terms indicating features corresponding to the sensor modeling software and significance of the features.
[0059] At block 404, weights may be assigned for the significance of the features
present in the first empirical expression, the second empirical expression, and the third empirical expression. Further, the weights may be assigned based on a domain-specific implementation of the sensor modeling software. In one embodiment, the system 102 may

assign the weights based upon predefined weighing parameters comprising significance of different layers in the sensor modeling, licensing terms, features provided by the programming languages, and the like. The weighting parameters may vary based on the domain-specific implementation of the sensor modeling software. The weighing parameters may be predefined by the system 102 or by the user. The user is provided the flexibility to predefine the weighing parameters based upon requirements of the domain-specific implementation of the sensor modeling software.
[0060] At block 406, a coverage score, an availability score, and a programming
design score may be computed corresponding to the coverage metric, the availability metric, and the programming design metric, respectively.
[0061] At block 408, the sensor modeling software may be evaluated based upon the
coverage score, the availability score, and the programming design score.
[0062] At block 410, according to an embodiment of present subject matter, a
normalization of each of the coverage score, the availability score, and the programming design score may be performed by decimal scaling the coverage score, the availability score, and the programming design score. Further, the normalization may be performed in such a manner that each of the normalized coverage score, the normalized availability score, and the normalized programming design score is less than or equal to unity.
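Pulling blocks 402 to 410 together, an end-to-end evaluation might be orchestrated as in the sketch below. The ranking by summing the normalized scores mirrors the final-score calculation described in paragraph [0052]; the function names and the example raw scores are illustrative assumptions.

```python
# Illustrative end-to-end flow of method 400 (blocks 402-410). Ranking by the
# sum of normalized scores mirrors the final-score calculation described in
# paragraph [0052]; function names and example raw scores are assumptions.
import math

def decimal_scale(score):
    # Repeated from the earlier normalization sketch (assumed exponent choice).
    if score == 0:
        return 0.0
    return score / (10 ** max(math.ceil(math.log10(abs(score))), 0))

def evaluate_candidates(candidates):
    """candidates maps a technique name to its raw (coverage, availability,
    programming design) scores computed per blocks 402-406."""
    ranked = []
    for name, raw_scores in candidates.items():
        normalized = [decimal_scale(s) for s in raw_scores]  # block 410
        ranked.append((name, sum(normalized)))               # block 408
    return sorted(ranked, key=lambda item: item[1], reverse=True)

# Example with assumed raw scores for two hypothetical techniques:
print(evaluate_candidates({"technique_A": (9.0, 16.0, 12.5),
                           "technique_B": (7.0, 20.0, 10.0)}))
```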
[0063] Although implementations for methods and systems for evaluating a sensor
modeling software capable of modeling one or more sensors employed in a cyber-physical environment have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for evaluating a sensor modeling software.

I/WE Claim:
1. A method for evaluating a sensor modeling software capable of modeling one or more sensors
employed in a cyber-physical environment, the method comprising:
configuring, for the sensor modeling software, a first empirical expression representing a coverage metric, a second empirical expression representing an availability metric, and a third empirical expression representing a programming design metric, wherein each of the first empirical expression, the second empirical expression, and the third empirical expression comprise a set of terms indicating features corresponding to the sensor modeling software and significance of the features;
assigning weights for the significance of the features present in the first empirical expression, the second empirical expression, and the third empirical expression, wherein the weights are assigned based on a domain-specific implementation of the sensor modeling software;
computing a coverage score, an availability score, and a programming design score corresponding to the coverage metric, the availability metric, and the programming design metric, respectively, based on the weights assigned; and
evaluating the sensor modeling software based upon the coverage score, the availability score, and the programming design score.
2. The method of claim 1 further comprising normalizing the coverage score, the availability score, and the programming design score prior to evaluating the sensor modeling software for generating a normalized coverage score, a normalized availability score, and a normalized programming design score, wherein the normalization is performed by decimal scaling the coverage score, the availability score, and the programming design score, and wherein each of the normalized coverage score, the normalized availability score, and the normalized programming design score is less than or equal to unity.
3. The method of claim 1, wherein the coverage score is computed by using the first empirical expression, that is,


CSji indicates the coverage score of ith feature assigned for use case requirement and jth sensor modeling software,
Bji indicates a quantitative value corresponding to the ith feature supported by the jth sensor modeling software, wherein the quantitative value is 1 or 0 when the feature is either supported or not supported, respectively, by the jth sensor modeling software,
Wji indicates a weight of the ith feature assigned under a context of domain-specific implementation of the jth sensor modeling software, and wherein
Bji and Wji are the set of terms associated with the first empirical expression.
4. The method of claim 1, wherein the availability score is computed by using the second


empirical expression, that is,

ASji indicates the average score of ith feature of jth sensor modeling software based on first set of parameters, wherein the first set of parameters comprises programming language, programming tools availability and programming support,
N1 indicates total number of the ith feature, wherein the ith feature comprises a sensor model description format, software support, software tools, and usage interface,
Oji indicates a quantitative value corresponding to software license availability for the first set of parameters for the jth sensor modeling software, wherein the quantitative value is 1 or 0 when the software license of the first set of parameters are either available or not available, respectively, by the jth sensor modeling software.
Wji indicates a weight of an ith type of the software license based on weights assigned for predefined range of software licenses, and wherein
Oji and Wji are the set of terms associated with the second empirical expression.
5. The method of claim 4, wherein the predefined range of software licenses comprises a non-free software license, free closed-source license, paid closed-source license, paid viewable source license, free software foundation approved license, Open Source Initiative approved

license, General Public License (GPL 2)-compatible strong copyleft license, GPL 2-compatible weak copyleft license, and GPL 2-incompatible and liberal copyleft license.
6. The method of claim 1, wherein the programming design score is computed by using the third empirical expressions, that is,

wherein
PDSj indicates the programming design score of the jth sensor modeling software,
N1 indicates a total number of object oriented features supported by the sensor modeling software,
Oi indicates a quantitative value corresponding to evaluation of ith feature against the object oriented features, wherein the quantitative value is 1 or 0 when the feature is successfully and non-successfully, respectively, evaluated against the object oriented features.
Woi indicates a weight of ith type of the oriented feature based on weights assigned for predefined object oriented features,
Mj indicates a quantitative value corresponding to evaluation of modeling standard of jth sensor modeling software against a modular software stack design, wherein the quantitative value is 1 or 0 when the modeling standard of jth sensor modeling software is successfully and non-successfully, respectively, evaluated against the modular software stack design,
Wmj indicates weight of jth type of modeling standard based on weights assigned for predefined modeling standards of the modular software stack design,
Rj indicates a quantitative value corresponding to evaluation of modeling standard format of the jth sensor modeling software against a predefined format, wherein the quantitative value is 1 or 0 when the modeling standard format of jth sensor modeling software is successfully and non-successfully, respectively, evaluated against the predefined format,

Wrj indicates weight of jth type of modeling standard format assigned based on the predefined format,
Wjs indicates a weighted score of the jth sensor modeling software based on plurality of software metrics, wherein the plurality of software metrics comprises programming language metric, average function point per Kilo Line of Code (KLOC) metric, and independent control flow path metric, and wherein the Wjs is determined by using the following equation:
Wjs = (Wg * G + Wfp * FP + Wip * IP) / (G + FP + IP), wherein
G and Wg indicate the programming language metric and weighted score of the generation of the programming language metric, respectively,
FP and Wfp indicates an average function point per Kilo Line of Code (KLOC) metric and weighted score of the average function point per KLOC metric, respectively,
IP and Wip indicates a number of the independent control flow path metric and weighted score of the number of independent control flow path metric, respectively, and wherein
Oi, Woi, Mj, Wmj, Rj, Wrj, Wjs, G, Wg, FP, Wfp, IP and Wip are the set of terms associated with the third empirical expression.
7. The method of claim 1, wherein the weights may be assigned based upon predefined weighing parameters comprising significance of different layers in the sensor modeling, licensing terms, and features provided by the programming languages, and wherein the weighing parameters are predefined by the user or the system.
8. A system 102 for evaluating a sensor modeling software capable of modeling one or more sensors employed in a cyber-physical environment, the system 102 comprising:
a processor 202;
a memory 206 coupled with the processor 202, wherein the processor 202 is capable for executing a plurality of modules 208 stored in the memory 206, and wherein the plurality of modules 208 comprising:

an empirical expression module 210 enabled to configure, for the sensor modeling software, a first empirical expression representing a coverage metric, a second empirical expression representing an availability metric, and a third empirical expression representing a programming design metric, wherein each of the first empirical expression, the second empirical expression, and the third empirical expression comprises a set of terms indicating features corresponding to the sensor modeling software and significance of the features;
an assigning module 212 configured to assign weights for the significance of the features present in the first empirical expression, the second empirical expression, and the third empirical expression, wherein the weights are assigned based on a domain-specific implementation of the sensor modeling software;
a computing module 214 configured to compute a coverage score, an availability score, and a programming design score corresponding to the coverage metric, the availability metric, and the programming design metric, respectively, based on the weights assigned; and
an evaluation module 216 configured to evaluate the sensor modeling software based upon the coverage score, the availability score, and the programming design score.
9. The system 102 of claim 8 further comprising a normalization module 218 configured to normalize the coverage score, the availability score, and the programming design score prior to evaluating the sensor modeling software for generating a normalized coverage score, a normalized availability score, and a normalized programming design score, wherein the normalization is performed by decimal scaling the coverage score, the availability score, and the programming design score, and wherein each of the normalized coverage score, the normalized availability score, and the normalized programming design score is less than or equal to unity.
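As a rough illustration of the decimal-scaling normalization recited in claim 9, the Python sketch below divides each score by the smallest power of ten that brings its magnitude to at most unity; the helper name and the exact scaling rule are assumptions for illustration only.

import math

def decimal_scale(score):
    # Divide by 10**k, where k is the smallest non-negative integer such
    # that the scaled value is less than or equal to unity (claim 9).
    if score == 0:
        return 0.0
    k = max(0, math.ceil(math.log10(abs(score))))
    return score / (10 ** k)

# e.g. decimal_scale(7.4) -> 0.74, decimal_scale(0.6) -> 0.6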

10. The system 102 of claim 8, wherein the computing module 214 is configured to compute the coverage score by using the first empirical expression, that is,

wherein
CSji indicates the coverage score of ith feature assigned for use case requirement and jth sensor modeling software,
Bji indicates a quantitative value corresponding to the ith feature supported by the jth sensor modeling software, wherein the quantitative value is 1 or 0 when the feature is either supported or not supported, respectively, by the jth sensor modeling software,
Wji indicates a weight of the ith feature assigned under a context of domain-specific implementation of the jth sensor modeling software, and wherein Bji and Wji are the set of terms associated with the first empirical expression.
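A minimal Python sketch of the coverage computation defined by these terms follows; since the rendered first empirical expression is not reproduced in this record, the weighted-sum form is an assumption and all names are illustrative.

def coverage_score(supported_flags, feature_weights):
    # supported_flags : Bji values (1 or 0) -- whether the ith use-case
    #                   feature is supported by the jth software
    # feature_weights : Wji weights assigned under the domain-specific
    #                   implementation context
    # Assumed aggregation: weighted sum of the per-feature terms.
    return sum(b * w for b, w in zip(supported_flags, feature_weights))

# e.g. coverage_score([1, 0, 1], [0.5, 0.3, 0.2]) -> 0.7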
11. The system 102 of claim 8, wherein the computing module 214 is configured to compute the availability score by using the second empirical expression, that is,

ASji indicates the average score of ith feature of jth sensor modeling software based on a first set of parameters, wherein the first set of parameters comprises programming language, programming tools availability and programming support,
Nl indicates a total number of the ith feature, wherein the ith feature comprises a sensor model description format, software support, software tools, and usage interface,
Oji indicates a quantitative value corresponding to software license availability for the first set of parameters for the jth sensor modeling software, wherein the quantitative value is 1 or 0 when the software license of the first set of parameters are either available or not available, respectively, by the jth sensor modeling software,
Wji indicates a weight of an ith type of the software license based on weights assigned for predefined range of software licenses, and wherein Oji and Wji are the set of terms associated with the second empirical expression.
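Likewise, a minimal Python sketch of the availability computation, assuming the second empirical expression aggregates the license-availability terms as a weighted sum over the features; the weighted-sum form and all names are assumptions for illustration only.

def availability_score(license_flags, license_weights):
    # license_flags   : Oji values (1 or 0) -- software license availability
    #                   for the first set of parameters (language, tools, support)
    # license_weights : Wji weights for the predefined range of software licenses
    # Assumed aggregation over the availability features (model description
    # format, software support, software tools, usage interface).
    return sum(o * w for o, w in zip(license_flags, license_weights))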

12. The system 102 of claim 11, wherein the predefined range of software licenses comprises a non-free software license, free closed-source license, paid closed-source license, paid viewable source license, Free Software Foundation approved license, Open Source Initiative approved license, General Public License (GPL 2)-compatible strong copyleft license, GPL 2-compatible weak copyleft license, and GPL 2-incompatible and liberal copyleft license.
13. The system 102 of claim 8, wherein the computing module 214 is configured to compute the programming design score by using the third empirical expression, that is,

wherein
PDSj indicates the programming design score of the jth sensor modeling software,
Nl indicates a total number of object oriented features supported by the sensor modeling software,
Oi indicates a quantitative value corresponding to evaluation of ith feature against the object oriented features, wherein the quantitative value is 1 or 0 when the feature is successfully and non-successfully, respectively, evaluated against the object oriented features,
Woi indicates a weight of ith type of the oriented feature based on weights assigned for predefined object oriented features,
Mj indicates a quantitative value corresponding to evaluation of modeling standard of jth sensor modeling software against a modular software stack design, wherein the quantitative value is 1 or 0 when the modeling standard of jth sensor modeling software is successfully and non-successfully, respectively, evaluated against the modular software stack design,
Wmj indicates weight of jth type of modeling standard based on weights assigned for predefined modeling standards of the modular software stack design,
Rj indicates a quantitative value corresponding to evaluation of modeling standard format of the jth sensor modeling software against a predefined format, wherein the quantitative value is 1 or 0 when the modeling standard format of jth sensor modeling software is successfully and non-successfully, respectively, evaluated against the predefined format,
Wrj indicates weight of jth type of modeling standard format assigned based on the predefined format,
Wjs indicates a weighted score of the jth sensor modeling software based on a plurality of software metrics, wherein the plurality of software metrics comprises a programming language metric, an average function point per Kilo Line of Code (KLOC) metric, and an independent control flow path metric, and wherein Wjs is determined by using the following equation:
Wjs = (Wg*G + Wfp*FP + Wip*IP) / (G + FP + IP), wherein
G and Wg indicate the programming language metric and the weighted score of the generation of the programming language metric, respectively,
FP and Wfp indicate an average function point per Kilo Line of Code (KLOC) metric and the weighted score of the average function point per KLOC metric, respectively,
IP and Wip indicate a number of the independent control flow path metric and the weighted score of the number of independent control flow path metric, respectively, and wherein
Oi, Woi, Mj, Wmj, Rj, Wrj, Wjs, G, Wg, FP, Wfp, IP, and Wip are the set of terms associated with the third empirical expression.
14. A computer program product having embodied thereon a computer program for evaluating a sensor modeling software capable of modeling one or more sensors employed in a cyber-physical environment, the computer program product comprising a set of instructions, the instructions comprising instructions for:
configuring, for the sensor modeling software, a first empirical expression representing a coverage metric, a second empirical expression representing an availability metric, and a third empirical expression representing a programming design metric, wherein each of the first empirical expression, the second empirical expression, and the third empirical expression comprises a set of terms indicating features corresponding to the sensor modeling software and significance of the features;
assigning weights for the significance of the features present in the first empirical expression, the second empirical expression, and the third empirical expression, wherein the weights are assigned based on a domain-specific implementation of the sensor modeling software;
computing a coverage score, an availability score, and a programming design score corresponding to the coverage metric, the availability metric, and the programming design metric, respectively, based on the weights assigned; and
evaluating the sensor modeling software based upon the coverage score, the availability score, and the programming design score.
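Finally, a sketch of the overall evaluation flow of claims 8, 9 and 14: the normalized scores for each candidate software are combined and the candidates ranked. The claims do not state how the three scores are combined during evaluation, so the simple sum used here, and every name below, is an assumption for illustration.

def evaluate(candidates):
    # candidates : dict mapping software name -> (normalized coverage score,
    #              normalized availability score, normalized programming
    #              design score), each already scaled to at most unity
    #              (see the normalization sketch after claim 9).
    totals = {name: sum(scores) for name, scores in candidates.items()}
    # Rank candidates from highest to lowest combined score.
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

# Illustrative usage with made-up normalized scores:
# evaluate({"tool_a": (0.72, 0.35, 0.90), "tool_b": (0.61, 0.40, 0.84)})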

Documents

Application Documents

# Name Date
1 Provisional Spec.pdf 2018-08-11
2 ABSTRACT1.jpg 2018-08-11
3 230-MUM-2013-FORM 5(23-1-2014).pdf 2018-08-11
4 230-MUM-2013-FORM 3(23-1-2014).pdf 2018-08-11
5 230-MUM-2013-FORM 26(4-4-2013).pdf 2018-08-11
6 230-MUM-2013-FORM 2(TITLE PAGE)-(23-1-2014).pdf 2018-08-11
7 230-MUM-2013-FORM 2(23-1-2014).pdf 2018-08-11
8 230-MUM-2013-FORM 18(23-1-2014).pdf 2018-08-11
9 230-MUM-2013-FORM 1(18-2-2013).pdf 2018-08-11
10 230-MUM-2013-DRAWING(23-1-2014).pdf 2018-08-11
11 230-MUM-2013-DESCRIPTION(COMPLETE)-(23-1-2014).pdf 2018-08-11
12 230-MUM-2013-CORRESPONDENCE(4-4-2013).pdf 2018-08-11
13 230-MUM-2013-CORRESPONDENCE(23-1-2014).pdf 2018-08-11
14 230-MUM-2013-CORRESPONDENCE(18-2-2013).pdf 2018-08-11
15 230-MUM-2013-CLAIMS(23-1-2014).pdf 2018-08-11
16 230-MUM-2013-ABSTRACT(23-1-2014).pdf 2018-08-11
17 230-MUM-2013-FER.pdf 2019-01-15
18 230-MUM-2013-OTHERS [15-07-2019(online)].pdf 2019-07-15
19 230-MUM-2013-FER_SER_REPLY [15-07-2019(online)].pdf 2019-07-15
20 230-MUM-2013-COMPLETE SPECIFICATION [15-07-2019(online)].pdf 2019-07-15
21 230-MUM-2013-CLAIMS [15-07-2019(online)].pdf 2019-07-15
22 230-MUM-2013-FORM-26 [25-12-2020(online)].pdf 2020-12-25
23 230-MUM-2013-Correspondence to notify the Controller [25-12-2020(online)].pdf 2020-12-25
24 230-MUM-2013-Written submissions and relevant documents [16-01-2021(online)].pdf 2021-01-16
25 230-MUM-2013-US(14)-HearingNotice-(HearingDate-01-01-2021).pdf 2021-10-03

Search Strategy

1 search_09-01-2019.pdf