Abstract: Disclosed is a system and method for controlling exposure of sensitive data. Receiving initial privacy score and operational privacy score for a user. Initial privacy score indicates access initially available to user associated with sensitive data. Operational privacy score indicates access dynamically available to user associated with sensitive data. Operational privacy score is revised based on a number of privacy units accessed while executing a task, to generate revised operational privacy score. Final privacy score for user is determined based on initial privacy score and revised operational privacy score. Final privacy score defines amount of restriction to be imposed on actions of user while accessing sensitive data. Final privacy score is enforced for the user while accessing the sensitive data to control exposure of sensitive data to the user.
DESC:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
SYSTEM FOR CONTROLLING EXPOSURE OF SENSITIVE DATA
APPLICANT:
Tata Consultancy Services Limited
A Company Incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application claims priority to Indian Provisional Application No. 3723/MUM/2013, filed on November 27, 2013, the entirety of which is hereby incorporated by reference.
TECHNICAL FIELD
[002] The present subject matter described herein, in general, relates to database privacy, and more particularly to controlling exposure of sensitive data from a database.
BACKGROUND
[003] Organizations having automated business processes need maintenance and handling of a production database. One of the major problems faced by these organizations is a breach of sensitive data from the production database. A breach of sensitive data is nothing but an exposure of the sensitive data to an unreliable environment. Often, the maintenance of the production database is handed over to service providers. Database support teams need to access the sensitive data in order to carry out their job, and often face the challenge of protecting customers' sensitive data due to laws and regulations introduced by the clients' countries.
[004] Although a variety of data privacy technologies are being used to protect the sensitive data residing in the production database, there still exists a challenge in handling the sensitive data. The reason for insufficient data protection is that most of the data privacy technologies focus on desensitizing the sensitive information by applying rules or policies according to privacy laws. However, the data privacy technologies do not address a main cause of data breach, that is, the exposure of the sensitive data to an unreliable environment or to an unreliable individual. Further, the current approach of the data privacy technologies is binary in nature, that is, specifying whether a user has access or not. This binary approach does not solve the problem of malicious use of the sensitive data from the production database.
SUMMARY
[005] This summary is provided to introduce aspects related to systems and methods for controlling exposure of sensitive data to a user and the aspects are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a system for controlling exposure of sensitive data to a user is described. The system comprises a processor and a memory coupled to the processor. The processor is capable of executing a plurality of modules stored in the memory. The plurality of modules comprises a privacy score calculator, a privacy score enforcing module, a learning module, a restriction specifying module, a privacy protection module, a feedback module, and a task allocation module. The privacy score calculator receives an initial privacy score and an operational privacy score for the user. The initial privacy score indicates an amount of access initially available to the user associated with sensitive data, and the operational privacy score indicates an amount of access dynamically available to the user associated with the sensitive data. The sensitive data comprises a plurality of privacy units. The privacy score calculator further dynamically revises the operational privacy score for the user based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score. The privacy score enforcing module dynamically determines a final privacy score for the user based on the initial privacy score and the revised operational privacy score. The final privacy score defines an amount of restriction to be imposed on actions of the user while accessing the sensitive data. The privacy score enforcing module further enforces the final privacy score for the user while accessing the sensitive data to control exposure of sensitive data to the user. The learning module determines the initial privacy score for the user based on a plurality of parameters associated with the actions of the user related to the sensitive data. The restriction specifying module defines an access specification for the user associated with the sensitive data based on the final privacy score. The privacy protection module masks the sensitive data based on the final privacy score and the access specification associated with the sensitive data for the user, to control the exposure of the sensitive data to the user. The feedback module records information of the actions of the user related to the sensitive data, wherein the information comprises a database query, a type of attributes accessed, a size of a result set, a status of an operation performed by the user, restrictions of an operating table, restrictions of an operating column, restrictions of an operating schema, restrictions of an operating catalog, a number of columns accessed, a number of records accessed, a start time of the task, an end time of the task. The task allocation module allocates a task to the user based on at least one of a trust score, a work experience score, one or more revisions in the trust score, one or more revisions in the operational privacy score, and the final privacy score.
[007] In one implementation, a method for controlling exposure of sensitive data to a user is described. The method comprises receiving, by a processor, an initial privacy score and an operational privacy score for the user. The initial privacy score indicates an amount of access initially available to the user associated with sensitive data, and the operational privacy score indicates an amount of access dynamically available to the user associated with the sensitive data. The sensitive data comprises a plurality of privacy units. The method further comprises dynamically revising, by the processor, the operational privacy score for the user based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score. The method further comprises dynamically determining, by the processor, a final privacy score for the user, based on the initial privacy score and the revised operational privacy score. The final privacy score defines an amount of restriction to be imposed on actions of the user while accessing the sensitive data. The method further comprises enforcing, by the processor, the final privacy score for the user while accessing the sensitive data to control exposure of sensitive data to the user.
[008] In one implementation, a computer program product having embodied thereon a computer program for controlling exposure of sensitive data to a user is described. The computer program product comprises a program code for receiving an initial privacy score and an operational privacy score for the user, wherein the initial privacy score indicates an amount of access initially available to the user associated with sensitive data, and the operational privacy score indicates an amount of access dynamically available to the user associated with the sensitive data, and wherein the sensitive data comprises a plurality of privacy units. The computer program product further comprises a program code for dynamically revising the operational privacy score for the user based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score. The computer program product further comprises a program code for dynamically determining a final privacy score for the user based on the initial privacy score and the revised operational privacy score. The final privacy score defines an amount of restriction to be imposed on actions of the user while accessing the sensitive data. The computer program product further comprises a program code for enforcing the final privacy score for the user while accessing the sensitive data to control exposure of sensitive data to the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0010] Figure 1 illustrates a network implementation of a system for controlling exposure of sensitive data to a user, in accordance with an embodiment of the present subject matter.
[0011] Figure 2 illustrates the system of Figure 1 for controlling exposure of sensitive data to the user, in accordance with an embodiment of the present subject matter.
[0012] Figure 3 illustrates a representation of the working of the system for controlling exposure of the sensitive data to the user, in accordance with an embodiment of the present subject matter.
[0013] Figure 4 illustrates a representative data model for a bank production database, in accordance with an embodiment of the present subject matter.
[0014] Figure 5 illustrates a method for controlling exposure of sensitive data to a user, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0015] Systems and methods for controlling exposure of sensitive data to a user are described. The present subject matter discloses an effective and efficient mechanism for controlling exposure of the sensitive data to the user. The sensitive data may comprise a plurality of privacy units. For controlling exposure of the sensitive data, initially an initial privacy score for the user may be determined. The initial privacy score may be determined based on a plurality of parameters associated with actions performed by the user related to the sensitive data. Subsequent to determination of the initial privacy score for the user, an operational privacy score for the user may be determined. In another embodiment, an initial privacy score and an operational privacy score for the user may be received initially. The initial privacy score indicates an amount of access initially available to the user associated with the sensitive data. The operational privacy score indicates an amount of access dynamically available to the user associated with the sensitive data. The operational privacy score may be revised based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score. Further, a final privacy score may be determined based on the initial privacy score and the revised operational privacy score. The final privacy score may be enforced for the user while accessing the sensitive data to control exposure of the sensitive data to the user.
[0016] Based on the final privacy score, an access specification for the user associated with the sensitive data may be defined. Further the sensitive data may be masked based on the final privacy score and the access specifications for the user to control exposure of the sensitive data to the user.
[0017] According to one embodiment of the present disclosure, the system and method may be used to monitor and control exposure of the sensitive data to production support teams in a production database support environment, who often need to access the sensitive data residing in the database to carry out their job. The system and method may provide a solution to the problem of overexposure of data by bringing attention to the privacy leak and data exposure problem, and by defining a limit on the amount of sensitive data accessible to the user.
[0018] According to another embodiment of the present disclosure, a more granular method to monitor and control exposure of the sensitive data by defining a limited number of records accessible to the user is disclosed. Further, the limit on the number of records accessible to the user is based on the user's prior activities and present activities.
[0019] While aspects of described system and method for controlling exposure of sensitive data to a user may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0020] Referring now to Figure 1, a network implementation 100 of a system 102 for controlling exposure of sensitive data to a user is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 determines an initial privacy score for the user. After determining the initial privacy score for the user, the system 102 at first receives an operational privacy score. The system 102 further dynamically revises the operational privacy score for the user to generate a revised operational privacy score. Based upon the initial privacy score and the revised operational privacy score, the system 102 further determines a final privacy score for the user. Further, based on the final privacy score, the system 102 defines an access specification for the user associated with the sensitive data. Based on the final privacy score and the access specifications associated with the sensitive data for the user, the system 102 masks the sensitive data to control exposure of the sensitive data to the user.
[0021] Although the present subject matter is explained considering that the system 102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. In one implementation, the system 102 may be implemented in a cloud-based environment. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2…104-N, collectively referred to as user devices 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106.
[0022] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0023] Figure 3 illustrates representation of working of the system 102 for controlling exposure of the sensitive data to the user. Referring now to Figure 2 and Figure 3, the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[0024] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with a user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0025] The memory 206 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0026] The modules 208 include routines, programs, objects, components, data structures, and programmed instructions etc., which perform particular tasks, functions or implement particular abstract data types. In one implementation, the modules 208 may include a learning module 212, a privacy score calculator 214, a privacy score enforcing module 216, a restriction specifying module 218, a privacy protection module 220, a feedback module 222, a task allocation module 224, a privacy analytics module 226 and other modules 228. The other modules 228 may include programs or coded instructions that supplement applications and functions of the system 102.
[0027] The data 210, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The data 210 may also include a system database 230, and other data 232. The other data 232 may include data generated as a result of the execution of one or more modules in the other modules 228.
[0028] In one implementation, at first, a user may use the client device 104 to access the system 102 via the I/O interface 204. The user may register using the I/O interface 204 in order to use the system 102. The working of the system 102 is explained in detail with reference to Figures 2 and 3 below. The system 102 may be used for controlling exposure of sensitive data to a user.
[0029] In one exemplary implementation, the system 102 may be deployed in a production database environment. In the production database environment, production team members (users) may be assigned tasks which may need to be completed based on urgency. In order to complete the task, the production team member (user) may access or may update data from the production database through a set of database queries. If a production team member is given unlimited access to the production database, he or she (the user) may end up learning too much information related to sensitive data and may be able to extract some sensitive data. The system 102 may address two major concerns: that the sensitive data should not be inferable, and that the user should be able to carry out a job smoothly. Hence, in order to control exposure of the sensitive data to the user, the system 102, at first, determines an initial privacy score for the user. Specifically, in the present implementation, the initial privacy score for the user is determined by the learning module 212.
LEARNING MODULE
[0030] Referring to Figures 2 and 3, a detailed working of the learning module 212, along with the working of other components of the system 102, is illustrated, in accordance with an embodiment of the present subject matter. In one implementation, in order to determine the initial privacy score for the user, the learning module 212 may, at first, mine records associated with the user from one or more sources such as the system database 230.
[0031] A privacy score of a user may be defined as an amount of access allocated to the user based on the trustworthiness of the user. The privacy score may be allocated based on a number of factors and is subjected to continuous revisions based on inputs from various modules of the system 102. The privacy score is defined to add another layer of protection to the sensitive data being accessed outside the production environment, through a mechanism of controlled viewing of the sensitive data. Further, the privacy score may decrease continuously based on the access of the sensitive data, and once the privacy score expires, the access of the user to the sensitive data may be prohibited. For example, the privacy score can be visualized as a 'privacy wallet' with currency units given to a user. For example, if a user's privacy score is 100, that implies the user has been given a wallet with 100 currency units (rupees) which the user can use while accessing the sensitive data. The privacy score may be transferable and borrowable; units of the privacy score may be transferred to, or borrowed from, another user.
[0032] In one implementation, the learning module may determine the initial privacy score for the user based on a plurality of parameters. The plurality of parameters may be associated with actions performed by the user related to the sensitive data. The actions related to the sensitive data may be performed by the user while executing a task. The initial privacy score may indicate an amount of access initially available to the user related to the sensitive data. The learning module may monitor the actions performed by the user. The monitoring of the actions may be used to further determine the initial privacy score for the user. The learning module may be a continuous service that may collect data associated with the actions performed by the user related to the sensitive data. The learning module may apply one or more machine learning techniques on the data associated with the actions performed by the user to generate the initial privacy score. The learning module may give the system 102 a starting point by providing the initial privacy score for each user.
[0033] The initial privacy score may be allocated to each user to define a starting point for actions of the user. The initial privacy score may be influenced by the actions of the user in a training period and when a testing period is initiated, the initial privacy score may be fixed and further the initial privacy score may be non-updatable. In one embodiment, an administrator of the system 102 may have a privilege to overwrite the initial privacy score for the user and define an appropriate privacy score for the user.
[0034] The plurality of parameters associated with the actions performed by the user may comprise a number of records accessed, an unauthorized access to a database, one or more successful executions of one or more database queries, one or more failed executions of one or more database queries, a number of attempts made for executing a database query, a syntax of a database query in each attempt, a time taken to execute a query, and a type of query executed. In another embodiment, the initial privacy score may be allocated based on an intelligent guess by an administrator of the system 102.
[0035] A formula to calculate the initial privacy score may be given by Equation (1) as shown below:
Initial Privacy Score = Total number of records accessed − (records accessed in failed operations) …… Equation (1)
[0036] If there is a way to find out redundant successful operations from the logs of the actions performed by the user, the formula to calculate the initial privacy score may be given by Equation (2):
Initial Privacy Score = Total number of records accessed − (records accessed in failed operations) − (records accessed in redundant successful operations) …… Equation (2)
[0037] A third alternative to calculate the initial privacy score may be given by Equation (3):
Initial Privacy Score = Total number of records accessed − (records accessed in failed operations) − (records accessed in redundant successful operations) − (number of unauthorized accesses × penalty) …… Equation (3)
[0038] The redundant successful operations may be defined in two ways: 1) a redundant successful operation may be an operation executed by the user but not required for completion of a task; or 2) an operation performed by the user might be required to complete the task, but is not the correct one. Further, a penalty may be any fixed/variable cost decided by the administrator for each unauthorized access/attempt to the database performed by the user.
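By way of a non-limiting illustration, the calculation of Equations (1) to (3) may be sketched in Python as shown below. The function name, argument names, and the example numbers are hypothetical and are not prescribed by the present subject matter; Equations (1) and (2) correspond to leaving the optional arguments at zero.

```python
# Illustrative sketch of Equations (1)-(3); all names and the example
# penalty value are hypothetical, not part of the specification.
def initial_privacy_score(total_records_accessed,
                          records_in_failed_operations,
                          records_in_redundant_operations=0,
                          unauthorized_accesses=0,
                          penalty=0):
    """Equation (3); Equations (1) and (2) are the special cases
    obtained by leaving the optional arguments at zero."""
    return (total_records_accessed
            - records_in_failed_operations
            - records_in_redundant_operations
            - unauthorized_accesses * penalty)

# Example: 500 records accessed, 40 in failed operations, 25 in
# redundant successful operations, and 2 unauthorized attempts at a
# fixed penalty of 50 per attempt.
print(initial_privacy_score(500, 40, 25, 2, 50))  # 335
```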
PRIVACY SCORE CALCULATOR
[0039] After determining the initial privacy score, an operational privacy score for the user may be dynamically determined by the privacy score calculator 214. In one embodiment, the privacy score calculator 214, at first, may receive an initial privacy score and an (initial) operational privacy score for the user. The initial privacy score may indicate an amount of access initially available to the user associated with the sensitive data. The operational privacy score may indicate an amount of access dynamically available to the user associated with the sensitive data. The sensitive data comprises a plurality of privacy units. Initially, or at a start of the system 102, the operational privacy score may be equal to the initial privacy score. Referring to Figures 2 and 3, the working of the privacy score calculator 214 is described below. The privacy score calculator 214 may be an engine that determines the operational privacy score for the user based on actions of the user. The operational privacy score of the user may be dynamically revised by the privacy score calculator 214 based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score. Further, it must be understood that an access to a privacy unit, as well as an action associated with the privacy unit, may be linked with a number of points (weight). Hence, the operational privacy score of the user may be reduced based on the number of privacy units accessed by the user while accessing the sensitive data to complete the task. Further, the operational privacy score of the user may be revised based on a database query executed by the user while accessing the sensitive data, in comparison with a benchmark query.
[0040] A trust score of the user may be defined as a combined measure of the user’s profile or experience level and a trust achieved by the user over a period of time. The trust score of the user may have a direct influence on the operational privacy score of the user and the operational privacy score of the user may be a variable measure which may fluctuate based on actions of the user.
[0041] According to another embodiment of the present disclosure, the trust score of the user may be revised based on the information associated with actions of the user. The operational privacy score of the user may therefore be revised based on the information associated with actions of the user. Every action of the user may be reviewed by another user of a higher trust score or a higher role. Based on a decision or recommendation of the user of the higher trust score or the higher role, the trust score of the user can be incremented or decremented. The privacy score calculator further may monitor changes in the operational privacy score of the user and may ultimately generate a revised operational privacy score that may be used by other modules to generate a modified final privacy score. The calculation of the operational privacy score may comprise considering a parameter such as a sensitivity of the data accessed by the user while executing the task. By way of an example, if an SSN (Social Security Number) is given more weight than a DOB (Date of Birth) during data access, then the reduction in the operational privacy score for viewing an SSN value may be more than that for viewing a DOB value.
PRIVACY SCORE ENFORCING MODULE
[0042] After revising the operational privacy score for the user, a final privacy score for the user may be determined by the privacy score enforcing module 216. The privacy score enforcing module 216 may determine the final privacy score for the user based on the initial privacy score and the revised operational privacy score. The final privacy score may define an amount of restriction to be imposed on the actions of the user while accessing the sensitive data. The privacy score enforcing module may ensure that the user accessing a database adheres to the final privacy score assigned to him/her. In case of an initial setup of the system 102, the initial privacy score may be enforced by the privacy score enforcing module 216. As the system 102 evolves over a period of time, the operational privacy score may get revised, and the revised operational privacy score may be provided to the privacy score enforcing module 216. Based on the revised operational privacy score and the initial privacy score, the privacy score enforcing module 216 may determine how much access to the sensitive data is to be allowed or prohibited for the user. The privacy score enforcing module 216 may generate the final privacy score for the user, may enforce the final privacy score for the user, and may monitor reductions in the operational privacy score of the user based on the number of privacy units accessed by the user while executing the task.
[0043] The privacy score enforcing module may enforce the final privacy score for the user while accessing the sensitive data. The privacy score enforcing module 216, in order to enforce the final privacy score for the user, may prohibit access for the user to the sensitive data on exhaustion of the final privacy score. The privacy score enforcing module may accept two input variants of the privacy score, namely the initial privacy score and the revised operational privacy score, and may determine the final privacy score that needs to be enforced. The formula for calculation of the final privacy score may be defined by Equation (4):
Final privacy score = min (initial privacy score, revised operational privacy score) …… Equation (4)
In another embodiment, the final privacy score may be revised by an administrator. The administrator may also keep the revised operational privacy score as the final privacy score.
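A minimal sketch of this enforcement logic, assuming the scores are plain integers, is given below; the function names and example values are illustrative only.

```python
# Illustrative enforcement of Equation (4); names are hypothetical.
def final_privacy_score(initial_score, revised_operational_score):
    # Equation (4): the more restrictive of the two scores is enforced.
    return min(initial_score, revised_operational_score)

def may_access(final_score, units_requested):
    # Access is prohibited once the final privacy score is exhausted
    # or the request exceeds the remaining score.
    return final_score > 0 and units_requested <= final_score

print(final_privacy_score(100, 60))  # 60
print(may_access(60, 80))            # False: the request is blocked
```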
RESTRICTION SPECIFYING MODULE
[0044] Subsequent to determination of the final privacy score, an access specification for the user associated with the sensitive data may be defined by the restriction specifying module 218. The restriction specifying module 218 may define the access specification for the user associated with the sensitive data based on the final privacy score. The restriction specifying module 218 may also define one or more operations available to the user to perform on the sensitive data. The restriction specifying module 218 may specify the restrictions/access specification that are to be imposed on actions of the user by the privacy score enforcing module 216 while accessing the sensitive data within the production environment. The access specification may be defined based on various parameters, such as the final privacy score that the user may have remaining and/or universal restrictions which the system administrator may impose on the sensitive data or on database elements such as tables, columns, schemas, or catalogs. The restriction specifying module 218 may determine the access specifications on the sensitive data or sensitive database elements and may provide instructions to the user about the set of operations that are not allowed to be performed by the user related to the sensitive data.
[0045] In another embodiment, the restriction specifying module 218 may specify one or more masking techniques to be applied on the sensitive data. Further, the restriction specifying module 218 may specify a restriction in terms of privacy units. In one embodiment, an administrator of the system may decide which one or more database elements constitute a privacy unit in the database. In one example, the database may be a production database.
[0046] According to an exemplary embodiment, a concept of privacy units is described. Figure 4 depicts a representative data model for a bank production database. The representative data model has entities such as BRANCH, ACCOUNT, CUSTOMER, EMPLOYEE and TRANSACTION_DETAILS. In the ACCOUNT entity, {Acc_No} may be defined as one privacy unit. {Acc_no, Acc_balance} may be defined as another privacy unit. Individually defining {Acc_balance} as a privacy unit may not be significant, since doing so may not prohibit a user with malicious intent from deriving the sensitive data. In another embodiment, every attribute of an entity may be defined as a privacy unit. The advantages of defining every attribute of the entity as a privacy unit may be threefold. A first advantage may be that the user may not access information from the database if the information is not required; otherwise, the user's privacy score may get reduced based on usage of the sensitive data. Second, continuous reduction of the privacy score may automatically put a restriction on the number of queries used to extract the sensitive data. Third, since the user may fire fewer queries and retrieve only necessary data, the load on the database may be reduced.
[0047] Further, privacy units may carry different weights. For example, individually, the {Expiry_Date}, {CVV} and {Credit Card Number} attributes may be charged as 3, 5 and 6 times weight per unit accessed. But {CVV, Credit_Card_Number} might be charged as 100 times weight per unit accessed, and not as (11 = 5 + 6) times weight. The privacy unit consisting of {CVV, Credit_Card_Number} is quite an important combination of the sensitive data. This combination should be used in the rarest of rare cases, so the penalty for this combination should be high.
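One possible representation of such per-attribute and combination weights is sketched below; the use of frozenset keys and a per-row charge is an assumption made for illustration and is not a prescribed implementation.

```python
# Hypothetical weight table for privacy units; the combination
# {CVV, Credit_Card_Number} carries its own much higher weight
# rather than the sum of the individual weights (11 = 5 + 6).
UNIT_WEIGHTS = {
    frozenset({"Expiry_Date"}): 3,
    frozenset({"CVV"}): 5,
    frozenset({"Credit_Card_Number"}): 6,
    frozenset({"CVV", "Credit_Card_Number"}): 100,
}

def units_charged(attributes_accessed, rows):
    # Use the combination weight when one is defined; otherwise fall
    # back to the sum of the individual attribute weights.
    key = frozenset(attributes_accessed)
    weight = UNIT_WEIGHTS.get(
        key,
        sum(UNIT_WEIGHTS[frozenset({a})] for a in attributes_accessed))
    return weight * rows

print(units_charged({"CVV", "Credit_Card_Number"}, 2))  # 200, not 22
```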
PRIVACY PROTECTION MODULE
[0048] After defining the access specification for the user associated with the sensitive data, the sensitive data may be masked by the privacy protection module 220. The privacy protection module 220 may mask the sensitive data based on the final privacy score and the access specifications associated with the sensitive data for the user, to control exposure of the sensitive data to the user. The privacy protection module 220 may be a real time masking engine. In one embodiment, the privacy protection module 220 may mask the sensitive data from a production environment when the sensitive data is accessed from an unsecured location. In one embodiment, while the privacy protection module 220 masks the sensitive data, it may not inhibit the user from having access to the sensitive data. The privacy protection module 220 may block one or more database queries from the user if the user does not have a sufficient/required operational/final privacy score to execute the one or more database queries. In other words, the privacy protection module 220 may block one or more queries requested by the user if the privacy score required to execute the one or more queries or a task exceeds the operational privacy score of the user.
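A simplified sketch of this blocking and masking behavior is given below, assuming the privacy-unit cost of a query can be estimated before execution; the cost check, the masking rule, and the function names are illustrative assumptions.

```python
# Hypothetical pre-execution check: block the query when its
# estimated privacy-unit cost exceeds the operational privacy score;
# otherwise execute it with real-time masking of sensitive values.
def execute_or_block(query_cost_units, operational_privacy_score):
    if query_cost_units > operational_privacy_score:
        return "BLOCKED"            # insufficient privacy score
    return "EXECUTE_WITH_MASKING"   # run, masking sensitive columns

def mask(value, visible=4):
    # Simple illustrative masking: expose only trailing characters.
    return "*" * max(len(value) - visible, 0) + value[-visible:]

print(execute_or_block(120, 60))   # BLOCKED
print(mask("4111111111111111"))    # ************1111
```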
[0049] In one embodiment of the present disclosure, the privacy protection module 220 may mask the sensitive data that is returned from the production environment, and together with the restriction specifying module 218 may define a secure transaction via a query execution console.
[0050] In another embodiment, if the one or more database queries of the user are blocked and there is urgency for the user to complete a task, then the privacy score for the user may be borrowed from another user, or may be updated by a higher role user or an administrator of the system 102. The user may borrow the privacy score in multiple ways. One way to borrow the privacy score is to draw on the next day's privacy score or the next month's privacy score. Another way may be to borrow from another user who has more privacy score and is willing to transfer units of his privacy score to the borrowing user's wallet. By way of borrowing or updating the final privacy score, the task may be completed without being blocked indefinitely.
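The wallet-style borrowing described above might be sketched as follows; the wallet structure and the transfer rule are assumptions for illustration only.

```python
# Hypothetical privacy wallets; units may be transferred from a
# willing lender to a borrower so an urgent task is not blocked.
wallets = {"alice": 20, "bob": 150}

def borrow(borrower, lender, units):
    if wallets[lender] < units:
        raise ValueError("lender has insufficient privacy score")
    wallets[lender] -= units
    wallets[borrower] += units

borrow("alice", "bob", 50)
print(wallets)  # {'alice': 70, 'bob': 100}
```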
FEEDBACK MODULE
[0051] Subsequent to masking the sensitive data, information associated with actions of the user related to the sensitive data may be recorded by the feedback module 222. The information may comprise a database query, a type of attributes accessed, a size of a result set, a status (success/failure) of an operation performed by the user, restrictions of an operating table, restrictions of an operating column, restrictions of an operating schema, restrictions of an operating catalog, a number of columns accessed, a number of records accessed, a start time of the task, and an end time of the task.
[0052] In one embodiment of the present disclosure, the feedback module 222 may be coupled with every action of the user while the user interacts with the sensitive data. In one example, the sensitive data may be residing in the production environment. The feedback module 222 may record the information and may report the information to a higher role user or to a user having a higher trust score. The feedback module may help the higher role user in determining the validity of a query used by the user. The higher role user, by using the information provided by the feedback module 222, can determine whether the query is executed in an optimal way or not. Further, the validity of a result set retrieved by the user may be analyzed by the higher role user. The validity of the result set may be analyzed by checking whether the size of the result set is optimal for fixing a problem or for executing the task. Based on the information provided by the feedback module 222, the operational privacy score of the user may be revised. The analysis of utilization of the final privacy score may solve a privacy problem related to differential privacy, since by means of the final privacy score a control on access of the sensitive data can be obtained.
[0053] The feedback module 222 may record the actions performed by the user during execution of one or more tasks. The feedback module 222 may be accessed by a higher role user having a higher trust score than the user assigned with the one or more tasks. The higher role user may analyze the privacy score utilization of the user, one or more database queries executed by the user, and/or skills of the user. Analysis of the privacy score utilization of the user while accessing particular sensitive data may include analyzing whether the user has consumed more or fewer privacy score/privacy units than the benchmark privacy score/privacy units utilization specified for accessing the particular sensitive data. In one example, the privacy score/privacy units consumed by the user may be similar to the privacy units required to complete the task. The higher role user may analyze a difference in utilization of the privacy score/privacy units and may provide an appropriate feedback to the user.
[0054] According to an exemplary embodiment, calculation of a work experience score is described. The work experience score may be a query skills score. In one embodiment, calculation of the query skills score by analyzing the query skills is described below. The analysis of the query skills of the user may include parameters comprising a number of attempts made for a query, a syntax of a query in each attempt, one or more syntax errors, a time taken to execute a query, and a type of query executed, such as a Delete, an Update, or a Select. The analysis of the database query skills of the user may include any other parameters known to a person skilled in the art. The query may be a database query. The parameters may be generic to any type of query and may not be limited to SQL queries only. A weight may be assigned to each of the parameters. The weight of a parameter may be configurable. For example, an administrator may assign more weight to the time taken to execute the query than to the number of attempts made for the query. For example, the syntax of the query may have a gradual decline in weight if the users become skillful enough to write the query and do not make syntactical errors while completing the task. Types of queries and corresponding weights may be specified by the administrator. For example, a delete query may be given more weight than an update query, as a deleting action is more unsafe or sensitive than an updating action. Similarly, an update query may receive more weight than a select query, as an updating action is more unsafe or sensitive than a selecting action. Within select queries, some malicious select queries may be assigned more weight than a normal select query. The malicious select queries may comprise a query including one or more sensitive fields, or a query using one or more arithmetic operators or functions to break a masking technique.
[0055] An example of the weights given to the parameters is shown in Table 1 below. The weights may be given in terms of points, as shown in Table 1.
Parameter | Contribute (points) | Deduct (points)
Number of attempts made for each query (applicable from the second attempt) | — | DELETE: 1; UPDATE: 2; SELECT: 3
Syntax of query (for a single attempt) | — | 5 (a fixed number; can be reviewed by a higher trustworthy associate and assigned manually)
Time taken to execute query (thinking time, not execution time) | — | 5 for each unit of delay (a unit might be 15 seconds, 30 seconds, or 1 minute)
Query execution (success) | DELETE: 50; UPDATE: 30; SELECT: 10 | —
Query execution (failure) | — | DELETE: 50; UPDATE: 30; SELECT: 10
Malicious query (SELECT query) | — | 50
The parameters above may be used to calculate the work experience score (in the current context, the query skills score). The weights given to the parameters are not fixed and may be decided and changed by the administrator of the system 102. By way of an example, a formula for calculation of the work_experience_score (query_skills_score) is given by Equation (5):
work_experience_score (query_skills_score) = w1 * num_of_attempts + w2 * query_syntax + w3 * time_execute + w4 * success_query + w5 * failed_query …… Equation (5)
wherein w1, w2, w3, w4, and w5 are weight values decided by the administrator.
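A sketch of Equation (5) that plugs in the illustrative point values of Table 1 is given below; since both the points and the weights w1 to w5 are administrator-configurable, the concrete numbers used here are assumptions.

```python
# Sketch of Equation (5); point values follow Table 1 and the weights
# default to 1.0, since both are administrator-configurable.
DEDUCT_PER_EXTRA_ATTEMPT = {"DELETE": 1, "UPDATE": 2, "SELECT": 3}
CONTRIBUTE_ON_SUCCESS = {"DELETE": 50, "UPDATE": 30, "SELECT": 10}
DEDUCT_ON_FAILURE = {"DELETE": 50, "UPDATE": 30, "SELECT": 10}

def query_skills_score(query_type, attempts, syntax_errors,
                       delay_units, succeeded,
                       w=(1.0, 1.0, 1.0, 1.0, 1.0)):
    w1, w2, w3, w4, w5 = w
    score = 0.0
    # Deductions: extra attempts, syntax errors, thinking-time delay.
    score -= w1 * DEDUCT_PER_EXTRA_ATTEMPT[query_type] * max(attempts - 1, 0)
    score -= w2 * 5 * syntax_errors
    score -= w3 * 5 * delay_units
    # Contribution or deduction depending on the execution outcome.
    if succeeded:
        score += w4 * CONTRIBUTE_ON_SUCCESS[query_type]
    else:
        score -= w5 * DEDUCT_ON_FAILURE[query_type]
    return score

print(query_skills_score("UPDATE", attempts=2, syntax_errors=1,
                         delay_units=3, succeeded=True))
# -2 - 5 - 15 + 30 = 8.0
```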
[0056] According to an exemplary embodiment, a revision of the operational privacy score of the user is illustrated. The operational privacy score may be revised in two ways. The operational privacy score may be revised based on a total number of privacy units retrieved in a result set to complete a task. Alternatively, the operational privacy score may be revised by taking the number of privacy units of a benchmark query as a base and deducting only the difference between the total number of privacy units retrieved by the user's query and the number of privacy units of the benchmark query. In another scenario, there may be an increment in the operational privacy score of the user if the user completes the task in a smaller number of privacy units than the number of privacy units suggested by the benchmark query. Further, the query used by the user to complete the task in a smaller number of privacy units than the number suggested by the benchmark query may be selected as a new benchmark for similar types of tasks.
[0057] By way of an example, formulas for revising the operational privacy score are given by Equation (6) and Equation (7) below. The final privacy score may be further calculated based on the revised operational privacy score, as given in Equation (8).
Revised operational privacy score = Current operational privacy score – Total number of privacy units accessed to complete the task ……. Equation (6)
Revised operational privacy score = Current operational privacy score – (Total number of privacy units accessed to complete the task – Benchmark Privacy units)…… Equation (7)
Final privacy score = min (initial privacy score, Revised operational privacy score)……. Equation (8).
In another embodiment, the formula for the final privacy score may be overridden or revised by the administrator. The administrator may decide to keep the revised operational privacy score as the final privacy score.
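The two revision variants and the final score of Equations (6) to (8) may be sketched as follows; the function names and example values are illustrative.

```python
# Sketch of Equations (6)-(8); names are illustrative.
def revise_simple(current_ops, units_accessed):
    # Equation (6): deduct every privacy unit accessed for the task.
    return current_ops - units_accessed

def revise_against_benchmark(current_ops, units_accessed, benchmark_units):
    # Equation (7): deduct only the difference against the benchmark
    # query; a negative difference increments the operational score.
    return current_ops - (units_accessed - benchmark_units)

def final_score(initial, revised_ops):
    # Equation (8): the more restrictive score is enforced.
    return min(initial, revised_ops)

ops = revise_against_benchmark(100, units_accessed=30, benchmark_units=40)
print(ops)                    # 110: beating the benchmark raises the score
print(final_score(100, ops))  # 100
```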
TASK ALLOCATION MODULE
[0058] According to another embodiment, a task may be allocated to the user by the task allocation module 224. The task allocation module 224 may allocate the task to the user based on at least one of a trust score, a work experience score, one or more revisions in the trust score, one or more revisions in the operational privacy score, and the final privacy score. The trust score of the user is calculated based on two parameters: the work experience score (in the current context, the query skills score) and the revised operational privacy score. The trust score may be calculated based on various formulas. The formulas and the weight of each parameter may be decided by the administrator of the system 102. An exemplary formula for trust score calculation is given by Equation (9):
Trust_score = w1 * query_skills_score + w2 * operational_privacy_score …… Equation (9)
wherein w1 + w2 = 1.
[0059] The task allocation module 224 may allocate a task to the user based on at least one of the trust score, the work experience score, the one or more revisions in the trust score, the one or more revisions in the operational privacy score, the final privacy score, and a decision of a higher role user who determines which user may receive which task. The allocation of the task is a dynamic process comprising selecting a task from a pool of tasks, selecting a user from a pool of users, and allocating the task to the user, wherein the user is suitable to complete the task. The task allocation module may achieve a uniform distribution of responsibilities and work load across various levels of trustworthy users. The user may be an associate. In a real case scenario, a user having a higher trust score may receive a higher work load executable on the sensitive data and may also receive numerous review tasks. In order to avoid this scenario and make the system 102 more intuitive, the task allocation module may vary the workload of the users related to the sensitive data based on the review tasks assigned to the user, thereby making the user perform more optimally.
[0060] The task allocation module may allocate the task from the pool of tasks to the user based on the following strategy, as illustrated in the sketch below. The strategy may be designed to distribute the tasks equally among the users and also to improve the skill sets of the users. The strategy may meet the following goals: 1) the tasks should be distributed reasonably and uniformly; 2) the user's trust score should be utilized fairly; and 3) the user's trust score should increase with time. Hence, as per the strategy, the task allocation module 224 may allocate the tasks which require more skills to a user with a higher trust score. Further, tasks which require more skills but do not have a stringent service level agreement (SLA) may be allocated to a user with a lower trust score. This strategy might consider only the users who have a lower trust score due to a lower skill score, but not due to a lower privacy score.
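Combining Equation (9) with this allocation strategy, a simplified sketch may look as follows; the user records, the SLA flag, and the choice of weights are assumptions made for illustration.

```python
# Hypothetical allocation: Equation (9) gives the trust score, then
# skill-demanding tasks go to the most trusted user, while demanding
# tasks without a stringent SLA may be given to a lower-trust user
# as a growth opportunity.
def trust_score(query_skills, operational_privacy, w1=0.6, w2=0.4):
    return w1 * query_skills + w2 * operational_privacy  # w1 + w2 = 1

users = [
    {"name": "u1", "skills": 80, "privacy": 90},
    {"name": "u2", "skills": 40, "privacy": 95},
]

def allocate(task, users):
    ranked = sorted(users,
                    key=lambda u: trust_score(u["skills"], u["privacy"]),
                    reverse=True)
    if task["needs_skills"] and not task["stringent_sla"]:
        # Lower-trust user whose privacy score is healthy but whose
        # skill score is lower gets the non-urgent, demanding task.
        return ranked[-1]["name"]
    return ranked[0]["name"]

print(allocate({"needs_skills": True, "stringent_sla": True}, users))   # u1
print(allocate({"needs_skills": True, "stringent_sla": False}, users))  # u2
```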
PRIVACY ANALYTICS MODULE
[0061] The privacy analytics module 226 may carry out analytics associated with actions logged within the system 102. The actions logged within the system 102 may be performed by the users as well as by the system 102. The analytics may unlock various trends which may be present, and currently undiscovered, within the database and the usage of the database. The various trends which may be studied may include the user's follow-up with respect to privacy of the sensitive data, the user's progress on the skill front, meeting of SLAs of tasks, reasons for not meeting the SLAs of the tasks, assessment of the trustworthiness of users, and increase or decrease in users' trustworthiness.
[0062] The analytics carried out by the privacy analytics module 226 may potentially improve an efficiency of the production environments possessing sensitive data. The privacy analytics module may improve the efficiency of the production environments by exposing gaping lapses in maintaining privacy of the sensitive data or by suggesting better and more effective ways of maintaining the privacy of the sensitive data. In one embodiment, the privacy analytics module 226 may work as a plug-in to improve efficiency of the database system or the production environment.
[0063] The exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[0064] Some embodiments enable a system and a method for controlling exposure of sensitive data by dynamically determining privacy score for the user, updating the privacy score based on actions of the user and enforcing the privacy score for the user.
[0065] Some embodiments enable a system and a method to assess trustworthiness of the user by analyzing actions performed by the user and utilization of the privacy score by the user.
[0066] Some embodiments enable a system and a method to dynamically allocate a task to the user based on one or more revisions in the trust score, one or more revisions in operational privacy score and final privacy score.
[0067] Some embodiments enable a system and a method to achieve a uniform distribution of responsibilities and work load across various levels of trustworthy users.
[0068] Some embodiments enable a system and a method to perform analytics to potentially improve an efficiency of the production environments possessing sensitive data.
[0069] Some embodiments enable a system and a method to improve skills of production support team/users of the system.
[0070] Some embodiments enable a system and a method to prevent an unintentional damage to data through increase in the reliability of production support team/ users of the system.
[0071] Referring now to Figure 5, a method 500 for controlling exposure of sensitive data to a user is shown, in accordance with an embodiment of the present subject matter. The method 500 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 500 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0072] The order in which the method 500 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 500 or alternate methods. Additionally, individual blocks may be deleted from the method 500 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 500 may be considered to be implemented in the above described system 102.
[0073] At block 502, an initial privacy score and an operational privacy score for the user may be received. The initial privacy score may indicate an amount of access initially available to the user associated with the sensitive data. The operational privacy score may indicate an amount of access dynamically available to the user associated with the sensitive data.
[0074] In another embodiment, at block 502, an initial privacy score for a user may be determined. The initial privacy score may be determined based on a plurality of parameters associated with actions performed by the user related to the sensitive data while executing a task. The initial privacy score may indicate an amount of access initially available to the user associated with the sensitive data. In one implementation, the initial privacy score for the user may be determined by the learning module 212. The plurality of parameters associated with the actions performed by the user may comprise a number of records accessed, an unauthorized access to a database, one or more successful executions of one or more database queries, one or more failed executions of one or more database queries, a number of attempts made for executing a database query, a syntax of a database query in each attempt, a time taken to execute a query, and a type of query executed.
[0075] In another embodiment, at block 504, an operational privacy score for the user may be dynamically determined. Initially, at a start of the system, the operational privacy score for the user may be received. In one embodiment, initially at the start of the system, the operational privacy score for the user may be the same as the initial privacy score. The operational privacy score may indicate an amount of access dynamically available to the user associated with the sensitive data. At block 504, the operational privacy score for the user may be dynamically revised based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score. In one implementation, the operational privacy score for the user may be dynamically revised by the privacy score calculator 214.
[0076] At block 506, a final privacy score for the user may be dynamically determined. The final privacy score for the user may be determined based on the initial privacy score and the revised operational privacy score. The final privacy score may define an amount of restriction to be imposed on actions of the user while accessing the sensitive data. In one implementation, the final privacy score for the user may be determined by the privacy score enforcing module 216.
[0077] At block 508, the final privacy score for the user may be enforced while accessing the sensitive data to control exposure of the sensitive data to the user.
[0078] At block 510, an access specification for the user may be defined. The access specification associated with the sensitive data may be defined for the user based on the final privacy score. In one implementation, the access specification for the user may be defined by the restriction specifying module 218.
[0079] At block 512, the sensitive data may be masked to control exposure of the sensitive data to the user. The sensitive data may be masked based on the final privacy score and the access specification associated with the sensitive data for the user. In one implementation, the sensitive data may be masked by the privacy protection module 220.
[0080] The method 500 further may comprise recording information of actions performed by the user related to the sensitive data, wherein the information comprises a database query, a type of attributes accessed, a size of a result set, a status of an operation performed by the user, restrictions of an operating table, restrictions of an operating column, restrictions of an operating schema, restrictions of an operating catalog, a number of columns accessed, a number of records accessed, a start time of the task, an end time of the task. In one implementation, information of actions performed by the user related to the sensitive data may be recorded by the feedback module 222.
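A sketch of the record that a component such as the feedback module 222 might persist for each action is shown below; the field names mirror the information listed above, while the types and the ISO-timestamp convention are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ActionRecord:
    """Illustrative audit record for one user action on sensitive data."""
    database_query: str
    attributes_accessed: list
    result_set_size: int
    operation_status: str
    columns_accessed: int
    records_accessed: int
    task_start: str  # ISO 8601 timestamp assumed
    task_end: str    # ISO 8601 timestamp assumed
```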
[0081] The method 500 further may comprise reducing the operational privacy score of the user based on a number of privacy units consumed by the user while accessing the sensitive data to complete the task, wherein an activity associated with a unit of sensitive data is linked with a number of privacy units.
[0082] The method 500 further may comprise revising the trust score and the operational privacy score based on the information of the actions of the user. The method 500 further may comprise revising the operational privacy score of the user based on a database query executed by the user while accessing the sensitive data, in comparison with a benchmark query.
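A minimal sketch of one such comparison is given below, penalizing a query that fetches more data than the benchmark query would have. The row-count criterion and the penalty constant are assumptions; the specification states only that the executed query is compared with a benchmark query.

```python
def revise_for_query(operational_score: float,
                     rows_fetched: int,
                     benchmark_rows: int,
                     penalty_per_extra_row: float = 0.1) -> float:
    """Penalize fetching more rows than the benchmark query required.

    The penalty rule is illustrative only.
    """
    excess = max(0, rows_fetched - benchmark_rows)
    return max(0.0, operational_score - penalty_per_extra_row * excess)
```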
[0083] The method 500 further may comprise allocating a task to the user based on at least one of a trust score, a work experience score, one or more revisions in the trust score, one or more revisions in the operational privacy score and the final privacy score. In one implementation, the task may be allocated to the user by the task allocation module 224.
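For illustration, a component such as the task allocation module 224 might select among candidate users as sketched below; the 0.6/0.4 blend of trust and work-experience scores and the eligibility threshold are hypothetical assumptions.

```python
def allocate_task(candidates: list[dict], min_final_score: float = 50.0):
    """Pick the eligible user with the highest composite score.

    The blend weights and the threshold are illustrative assumptions.
    """
    eligible = [u for u in candidates if u["final_score"] >= min_final_score]
    if not eligible:
        return None  # no user currently qualifies for the task
    return max(eligible,
               key=lambda u: 0.6 * u["trust"] + 0.4 * u["experience"])
```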
[0084] The method 500 further may comprise enforcing the final privacy score for the user while accessing the sensitive data. The enforcing of the final privacy score for the user may comprise prohibiting access for the user to the sensitive data on exhaustion of the final privacy score. The enforcing of the final privacy score for the user may comprise blocking one or more database queries of the user while accessing the sensitive data, when the user does not have a required privacy score to execute the one or more database queries.
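A minimal sketch of such an enforcement gate, assuming a simple threshold check per query; the return strings and the per-query required score are illustrative assumptions.

```python
def enforce(final_score: float, required_score: float, query: str) -> str:
    """Gate a query on the user's final privacy score (illustrative).

    Access is prohibited once the score is exhausted, and individual
    queries are blocked when the score falls short of what they require.
    """
    if final_score <= 0.0:
        return "ACCESS PROHIBITED: privacy score exhausted"
    if final_score < required_score:
        return f"QUERY BLOCKED: {query!r} requires score {required_score}"
    return "EXECUTE"
```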
[0085] Although implementations for methods and systems for controlling exposure of sensitive data to a user have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for controlling exposure of sensitive data to a user.
CLAIMS:
WE CLAIM:
1. A system for controlling exposure of sensitive data to a user, the system comprising:
a processor; and
a memory coupled to the processor, wherein the processor is capable of executing a plurality of modules stored in the memory, and wherein the plurality of modules comprise:
a privacy score calculator to:
receive an initial privacy score and an operational privacy score for the user, wherein the initial privacy score indicates an amount of access initially available to the user associated with sensitive data, and the operational privacy score indicates an amount of access dynamically available to the user associated with the sensitive data, and wherein the sensitive data comprises a plurality of privacy units;
dynamically revise the operational privacy score for the user based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score; and
a privacy score enforcing module to:
dynamically determine a final privacy score for the user based on the initial privacy score and the revised operational privacy score, wherein the final privacy score defines an amount of restriction to be imposed on actions of the user while accessing the sensitive data; and
enforce the final privacy score for the user while accessing the sensitive data to control exposure of sensitive data to the user.
2. The system of claim 1 further comprises a learning module to determine the initial privacy score for the user based on a plurality of parameters associated with the actions of the user related to the sensitive data.
3. The system of claim 2, wherein the plurality of parameters associated with the actions of the user comprises a number of records accessed, an unauthorized access to a database, one or more successful executions of one or more database queries, one or more failed executions of one or more database queries, a number of attempts made for executing a database query, a syntax of a database query in each attempt, a time taken to execute a database query, and a type of a query executed.
4. The system of claim 1 further comprises a restriction specifying module to define an access specification for the user associated with the sensitive data based on the final privacy score.
5. The system of claim 1 further comprises a privacy protection module to mask the sensitive data based on the final privacy score and the access specification associated with the sensitive data for the user, to control the exposure of the sensitive data to the user.
6. The system of claim 1 further comprising a feedback module to record information of the actions of the user related to the sensitive data, wherein the information comprises a database query, a type of attributes accessed, a size of a result set, a status of an operation performed by the user, restrictions of an operating table, restrictions of an operating column, restrictions of an operating schema, restrictions of an operating catalog, a number of columns accessed, a number of records accessed, a start time of the task, an end time of the task.
7. The system of claim 1 further comprises a task allocation module to allocate a task to the user based on at least one of a trust score, a work experience score, one or more revisions in the trust score, one or more revisions in the operational privacy score, and the final privacy score.
8. A method for controlling exposure of sensitive data to a user, the method comprising:
receiving, by a processor, an initial privacy score and an operational privacy score for the user, wherein the initial privacy score indicates an amount of access initially available to the user associated with sensitive data, and the operational privacy score indicates an amount of access dynamically available to the user associated with the sensitive data, and wherein the sensitive data comprises a plurality of privacy units;
dynamically revising, by the processor, the operational privacy score for the user based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score;
dynamically determining, by the processor, a final privacy score for the user, based on the initial privacy score and the revised operational privacy score, wherein the final privacy score defines an amount of restriction to be imposed on actions of the user while accessing the sensitive data; and
enforcing, by the processor, the final privacy score for the user while accessing the sensitive data to control exposure of sensitive data to the user.
9. The method of claim 8 further comprises determining the initial privacy score for the user based on a plurality of parameters associated with the actions of the user related to the sensitive data.
10. The method of claim 9, wherein the plurality of parameters associated with the actions of the user comprises a number of records accessed, an unauthorized access to a database, one or more successful executions of one or more database queries, one or more failed executions of one or more database queries, a number of attempts made for executing a database query, a syntax of a database query in each attempt, a time taken to execute a query, and a type of query executed.
11. The method of claim 8 further comprises defining an access specification for the user associated with the sensitive data based on the final privacy score.
12. The method of claim 8 further comprises masking the sensitive data based on the final privacy score and the access specification associated with the sensitive data for the user to control the exposure of the sensitive data to the user.
13. The method of claim 8 further comprises recording information of actions performed by the user related to the sensitive data, wherein the information comprises a database query, a type of attributes accessed, a size of a result set, a status of an operation performed by the user, restrictions of an operating table, restrictions of an operating column, restrictions of an operating schema, restrictions of an operating catalog, a number of columns accessed, a number of records accessed, a start time of the task, an end time of the task.
14. The method of claim 8 further comprises reducing the operational privacy score of the user based on a number of privacy units accessed by the user while accessing the sensitive data to execute the task.
15. The method of claim 8 further comprises allocating a task to the user based on at least one of a trust score, a work experience score, one or more revisions in the trust score, one or more revisions in the operational privacy score, and the final privacy score.
16. The method of claim 8, wherein enforcing the final privacy score for the user comprises blocking one or more database queries of the user while accessing the sensitive data when the user does not have a required final privacy score to execute the one or more database queries, and prohibiting access to the user for the sensitive data on exhaustion of the final privacy score.
17. A computer program product having embodied thereon a computer program for controlling exposure of sensitive data to a user, the computer program product comprising:
a program code for receiving an initial privacy score and an operational privacy score for the user, wherein the initial privacy score indicates an amount of access initially available to the user associated with sensitive data, and the operational privacy score indicates an amount of access dynamically available to the user associated with the sensitive data, and wherein the sensitive data comprises a plurality of privacy units;
a program code for dynamically revising the operational privacy score for the user based on a number of privacy units accessed by the user while executing a task, to generate a revised operational privacy score;
a program code for dynamically determining a final privacy score for the user based on the initial privacy score and the revised operational privacy score, wherein the final privacy score defines an amount of restriction to be imposed on actions of the user while accessing the sensitive data; and
a program code for enforcing the final privacy score for the user while accessing the sensitive data to control exposure of sensitive data to the user.