Abstract: A method and a privacy protection device (101) for generating privacy protected data for data analytics systems is disclosed. The method includes determining sensitive and non-sensitive features from user records associated with each of a plurality of users. The user records are clustered into a plurality of clusters based on the sensitive features, which comprise quasi identifiers and non-quasi sensitive identifiers. Each quasi identifier in each cluster is mapped with a calculated centroid of the respective cluster and a randomly generated value, and each non-quasi sensitive identifier is mapped with its corresponding value and a randomly generated value. Each of the privacy protection devices (101) provides uniformly sampled user privacy protected records to corresponding machine learning models for generating data models. The amount of randomly generated noise added to user records is reduced based on privacy requirements of domain repositories, sensitive features and the cohesion level of records of different users, leading to higher accuracy of data models and better performance of data driven applications. Fig.1A
Claims:1. A method of generating privacy protected data for data analytics systems, the method comprising:
determining, by one or more privacy protection devices (101), sensitive and non-sensitive features from user records associated with each of a plurality of users received from one or more sources (105);
clustering, by the one or more privacy protection devices (101), the user records through the sensitive features comprising quasi identifiers and non-quasi sensitive identifiers into a plurality of clusters using a predefined clustering technique;
mapping, by the one or more privacy protection devices (101), each quasi identifier in each cluster of the plurality of clusters with a calculated centroid of respective cluster and a randomly generated value, and each non-quasi sensitive identifier with corresponding mapped value and a randomly generated value, wherein each of the mapped quasi identifiers, mapped non-quasi sensitive identifiers and the non-sensitive features are used to generate user privacy protected records; and
providing, by each of the one or more privacy protection devices (101), uniformly sampled user privacy protected records to corresponding one or more machine learning models for generating one or more data models.
2. The method as claimed in claim 1, wherein the one or more sources (105) comprise public and private platforms, IoT gateway devices and repositories.
3. The method as claimed in claim 1, wherein the sensitive features comprise direct identifiers that enable identification of a user from the user records, wherein the direct identifiers are eliminated before subjecting the user records to the clustering.
4. The method as claimed in claim 1, wherein the randomly generated value is determined based on cluster size, level of sensitivity of repository domain and degree of sensitivity of features in the user records.
5. The method as claimed in claim 1, wherein the user privacy protected records are uniformly sampled using a predefined aggregation technique.
6. A method for generating privacy protected data for data analytics systems, the method comprising:
receiving, by one or more analytical models (106), uniformly sampled user privacy protected records from corresponding one or more privacy protection devices (101), wherein the uniformly sampled user privacy protected records are generated from received user records using the steps of claim 1;
processing, by each of the one or more analytical models (106), associated uniformly sampled user privacy protected records to generate a data model; and
transmitting, by each of the one or more analytical models (106), the generated data model to a server (115), wherein the server (115) combines the data models received from each of the one or more analytical models (106) to generate a final data model that is used for one of machine learning predictions and recommendations.
7. A privacy protection device (101) for generating privacy protected data in data analytics systems, comprising:
a processor (113); and
a memory (111) communicatively coupled to the processor (113), wherein the memory (111) stores processor instructions, which, on execution, cause the processor (113) to:
determine sensitive and non-sensitive features from received user records associated with each of a plurality of users from one or more sources (105);
cluster the user records through the sensitive features comprising quasi identifiers and non-quasi sensitive identifiers into a plurality of clusters using a predefined clustering technique;
map each quasi identifier in each cluster of the plurality of clusters with a calculated centroid of respective cluster and a randomly generated value, and each non-quasi sensitive identifier with corresponding mapped value and a randomly generated value, wherein each of the mapped quasi identifiers, mapped non-quasi sensitive identifiers and the non-sensitive features are used to generate user privacy protected records; and
provide uniformly sampled user privacy protected records to corresponding one or more machine learning models for generating one or more data models.
8. The privacy protection device (101) as claimed in claim 7, wherein the one or more sources (105) comprise public and private platforms, IoT gateway devices and repositories.
9. The privacy protection device (101) as claimed in claim 7, wherein the sensitive features comprise direct identifiers that enable identification of a user from the user records, wherein the processor eliminates the direct identifiers before subjecting the user records to the clustering.
10. The privacy protection device (101) as claimed in claim 7, wherein the processor (113) determines the randomly generated value based on cluster size, level of sensitivity of repository domain and degree of sensitivity of features in the user records.
11. The privacy protection device (101) as claimed in claim 7, wherein the processor (113) uniformly samples the user privacy protected records using a predefined aggregation technique.
Description: TECHNICAL FIELD
The present subject matter relates in general to data protection and, more particularly, but not exclusively, to a method and system for generating privacy protected data for data analytics systems.
BACKGROUND
With the affordable availability of computer technology, storage, and network connectivity, there has been an exponential growth in the volume and variety of data due to diverse applications of computers in all domain areas. Large scale data, which includes user specific private and sensitive data, is being stored in public domains and repositories. The data holder can release this data to a third-party data analyst to gain deeper insights and identify hidden patterns which are useful in making important decisions.
In recent times, governments across the globe are supporting creation of smart cities that leverage huge amounts of digital user data generated continuously by IoT devices in homes, vehicles, offices, health monitoring systems and the like. Smart city planning and operations also consume data from various public and private repositories/platforms storing demographic user records on financial, health, investment, insurance, social media, telecom, service provider quality and judiciary information. Typically, data analytics may utilize digital records from a large number of users, perform analysis by creating demographic based machine learning models and build data-driven applications for planning, monitoring, resource utilization and proactive risk management. User records consumed by these analytics contain user sensitive information, and allowing direct computations on such user sensitive data raises privacy issues. Thus, with the exponential growth of data, privacy preservation remains a crucial aspect.
Currently, many techniques and mechanisms exist which concentrate on the privacy preserving issue of data. For instance, anonymizing user records is one option; however, anonymization alone does not resolve privacy concerns. Even a small set of anonymized fields within such data records can be combined with similar fields in external repositories to re-identify users present in such external data sources and reveal user sensitive information to untrusted entities. One conventional technique uses differential privacy algorithms to avoid user reidentification and preserve privacy. However, such differential privacy algorithms used for transforming user data utilize the same privacy parameters for different domain repositories, sensitive features, and all user records, although they vary in desired privacy requirements. As only a single level of privacy is accommodated, minimum privacy loss is considered for all repositories, features, and user records. As a result, minimum privacy loss leads to high noise addition to user data. The high amounts of noise cause low prediction accuracy of machine learning models trained on such data and degrade performance of data-driven applications utilizing such models.
Further, transformed user data from such differential privacy algorithms is sent to a cloud server for executing machine learning analytics. Collecting all user information at a central server causes security issues (as attackers need to breach only the central server) and bandwidth issues for transferring huge amounts of user data. Conventional systems try to resolve this issue by compressing data; however, even after compression, the data volume remains high, especially when features of a user record are independent of each other. This forces a reduction of data dimensions, which causes information loss and leads to low model accuracy.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY
In an embodiment, the present disclosure may relate to a method for generating privacy protected data for data analytics systems. The method comprises determining sensitive and non-sensitive features from user records associated with each of a plurality of users received from one or more sources. The user records are clustered through the sensitive features comprising quasi identifiers and non-quasi sensitive identifiers into a plurality of clusters using a predefined clustering technique. Each quasi identifier in each cluster of the plurality of clusters is mapped with a calculated centroid of the respective cluster and a randomly generated value, and each non-quasi sensitive identifier with a corresponding mapped value and a randomly generated value. Each of the mapped quasi identifiers, mapped non-quasi sensitive identifiers and the non-sensitive features are used to generate user privacy protected records. Thereafter, uniformly sampled user privacy protected records are provided to corresponding one or more machine learning models for generating one or more data models.
In an embodiment, the present disclosure may relate to a privacy protection device for generating privacy protected data for data analytics systems. The privacy protection device may comprise a plurality of processors and a memory communicatively coupled to the processors, wherein the memory stores processor-executable instructions, which, on execution, may cause the privacy protection device to determine sensitive and non-sensitive features from user records associated with each of a plurality of users received from one or more sources. The user records are clustered through the sensitive features comprising quasi identifiers and non-quasi sensitive identifiers into a plurality of clusters using a predefined clustering technique. The privacy protection device maps each quasi identifier in each cluster of the plurality of clusters with a calculated centroid of the respective cluster and a randomly generated value, and each non-quasi sensitive identifier with a corresponding mapped value and a randomly generated value. Each of the mapped quasi identifiers, mapped non-quasi sensitive identifiers and the non-sensitive features are used to generate user privacy protected records. Thereafter, the privacy protection device provides uniformly sampled user privacy protected records to corresponding one or more machine learning models for generating one or more data models.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
Fig.1A illustrates an exemplary environment for generating privacy protected data for data analytics systems in accordance with some embodiments of the present disclosure;
Fig.1B illustrates an exemplary environment for transmitting generated data model to a server in accordance with some embodiments of the present disclosure;
Fig.2 illustrates a detailed block diagram of a privacy protection device in accordance with some embodiments of the present disclosure;
Fig.3 shows an exemplary embodiment for demographic based data analytics in accordance with some embodiments of the present disclosure;
Fig.4A illustrates a flowchart showing a method for generating privacy protected data for data analytics systems in accordance with some embodiments of present disclosure;
Fig.4B illustrates a flowchart showing a method for transmitting generated data model to a server in accordance with some embodiments of present disclosure; and
Fig.5 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and shows by way of illustration, specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
Embodiments of the present disclosure relate to a method and a privacy protection device for generating privacy protected data in data analytics systems. In an embodiment, data analytics may refer to a process of examining data sets in order to draw conclusions and insights about the information they contain, and to make decisions with the aid of specialized systems. Generally, data analytics may utilize digital records from a large number of users, perform analysis by creating demographic based machine learning models and build data-driven applications for planning, monitoring, resource utilization and proactive risk management. The datasets consumed by these analytics contain user sensitive information. Typically, any dataset used for computations is exposed to privacy issues. Currently, many existing systems provide privacy protection of user sensitive data. However, these systems utilize the same privacy parameters for different domain repositories, sensitive features, and user records, although they vary in desired privacy requirements. As these systems accommodate only a single level of privacy, minimum privacy loss is considered for all repositories, features, and user records. As a result, minimum privacy loss introduces high amounts of noise in the dataset, which causes low prediction accuracy of trained machine learning models and degrades performance of data-driven applications utilizing such models. In addition, the existing systems transfer protected user data to a central server for executing machine learning analytics. However, collecting all user information at the central server causes security and bandwidth issues.
To address this, the present disclosure determines sensitive and non-sensitive features from user records and clusters the user records, based on the sensitive features, into a plurality of clusters. The sensitive features of the user records are mapped to either a calculated centroid of the respective cluster or a corresponding value, together with a randomly generated value, depending on the type of sensitive feature. The randomly generated value refers to a differentially private noise value. These noise additions are based on privacy requirements of domain repositories, sensitive features, and user records. User privacy protected records are generated based on the mapped sensitive features and the non-sensitive features. Thereafter, uniformly sampled user privacy protected records are provided to corresponding one or more machine learning models for generating one or more data models. As a result, the present disclosure reduces noise additions to user records, thus increasing prediction accuracy of trained data models and improving performance of data driven applications, while preserving user privacy and avoiding user reidentification. Also, the present disclosure reduces bandwidth requirements with high data security, since only the data models are provided to the server for analytics and not entire user records.
Fig.1A illustrates an exemplary environment for generating privacy protected data for data analytics systems in accordance with some embodiments of the present disclosure.
As shown in Fig.1A, an environment 100 includes one or more privacy protection devices 101 (such as a privacy protection device 101-1, a privacy protection device 101-2, …, and a privacy protection device 101-N) connected through a peer to peer network (not shown explicitly in Fig.1A) to corresponding one or more analytical models 106 (such as an analytical model 106-1, an analytical model 106-2, …, and an analytical model 106-N). The one or more analytical models 106 may be any type of analytical system which may generate one or more data models based on different user records received from the corresponding privacy protection device. In an embodiment, each of the one or more analytical models 106 may be trained for a specific type of user record. Further, each of the one or more privacy protection devices 101 is connected to corresponding one or more sources 105 (through a communication network, not shown explicitly), such as the privacy protection device 101-1 connected to one or more sources 105-1, the privacy protection device 101-2 connected to one or more sources 105-2, and the like. Typically, most user records are continuously collected and stored in different sources. The one or more sources 105 may include, but are not limited to, public and private platforms, IoT gateway devices and repositories. For example, an entity such as a smart hospital may maintain a repository or an IoT gateway to collect and maintain records of all patients. The records of the patients may be utilised by different data analytics systems for generating data models. In such situations, the one or more privacy protection devices 101 generate privacy protected data. The one or more privacy protection devices 101 may be any computing device such as a laptop, a desktop computer, a notebook, a smartphone, a tablet, a server, or any other computing device. A person skilled in the art would understand that any other devices, not mentioned explicitly, may also be used as the one or more privacy protection devices 101 in the present disclosure. In an embodiment, the one or more privacy protection devices 101 may be standalone devices. Alternatively, the one or more privacy protection devices 101 may be configured within the corresponding one or more sources 105.
Further, the privacy protection device 101-1 includes an I/O interface 109-1, a memory 111-1 and a processor 113-1. Likewise, each of the one or more privacy protection devices 101 may include a corresponding I/O interface, memory, and processor (collectively referred to as I/O interface 109, memory 111 and processor 113). The I/O interface 109 may be configured to receive user records from the respective one or more sources 105. The user records received through the I/O interface 109 may be stored in the memory 111. The memory 111 may be communicatively coupled to the processor 113 of the respective one or more privacy protection devices 101. The memory 111 may also store processor instructions which may cause the processor 113 to execute the instructions for generating the privacy protected data.
Generally, user records of multiple users are required for performing any type of data analytics. When user records of a plurality of users are requested from any of the one or more sources 105 for data analytics, the user records are passed through the corresponding one or more privacy protection devices 101. In an embodiment, the number of sources, and accordingly the number of corresponding privacy protection devices, may vary depending on the type of data analytics and the request. On receiving the user records from the respective one or more sources 105, the one or more privacy protection devices 101 may process the user records and determine sensitive and non-sensitive features associated with the user records. In an embodiment, any existing known technique may be applied on the user records to determine the sensitive and non-sensitive features. The sensitive features may include direct identifiers, quasi identifiers, and non-quasi sensitive identifiers. The direct identifiers enable identification of a user directly from the user records. For example, in a company, direct identifiers include customer name, customer ID, bank account number and the like.
The quasi identifiers are features which are not directly associated with users but which can be correlated with a user, such that they can be combined with other quasi identifiers to identify the user. For example, identifiers such as age, gender, and address of the customer. The non-quasi sensitive identifiers are features which are not directly associated with users but with the domain/field, and can be used along with quasi identifiers to identify the user. For instance, identifiers such as new/old customer, item purchased by the customer, and the like. Upon determining the sensitive and non-sensitive features, the one or more privacy protection devices 101 may eliminate the direct identifiers from the user records. Further, the quasi identifiers and non-quasi sensitive identifiers are clustered into a plurality of clusters. The clustering may be performed using a predefined clustering technique such as Agglomerative Hierarchical Clustering (AHC). A person skilled in the art would understand that any other clustering technique, not mentioned explicitly, may also be used for clustering the sensitive features in the present disclosure. Upon clustering, the one or more privacy protection devices 101 calculate a centroid for each cluster using any existing technique. In addition, a randomly generated value is determined for each quasi identifier and non-quasi sensitive identifier. In an embodiment, the randomly generated value is a differentially private noise value which is selected from a Gaussian distribution. The randomly generated value is determined based on the cluster size and the level of sensitivity of the repository domain, such as critical, high, medium or low. For instance, health domain records are considered to be more critical than recommendation domain records. The cluster size may define the number of data points in each cluster. In an embodiment, a dense cluster may require a low noise scale while outliers require a high noise scale. Further, the randomly generated value is determined based on the degree of sensitivity of features in the user records, such as critical, high, medium or low. For instance, a location information feature is more sensitive than a shopping information feature.
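By way of a non-limiting illustration only, the following Python sketch shows one way such a noise value could be derived. The numeric sensitivity weights, the base_scale parameter and the function names are assumptions made for this sketch and are not prescribed by the present disclosure.

import numpy as np

# Illustrative mapping from sensitivity labels to numeric weights (an assumption).
SENSITIVITY_WEIGHT = {"low": 0.5, "medium": 1.0, "high": 1.5, "critical": 2.0}

def noise_scale(cluster_size, domain_sensitivity, feature_sensitivity, base_scale=1.0):
    # Dense clusters (many members) lower the scale; more sensitive domains
    # and features raise it, mirroring the behaviour described above.
    density_factor = 1.0 / max(cluster_size, 1)
    return (base_scale * density_factor
            * SENSITIVITY_WEIGHT[domain_sensitivity]
            * SENSITIVITY_WEIGHT[feature_sensitivity])

def random_noise(cluster_size, domain_sensitivity, feature_sensitivity, rng=None):
    # One differentially private noise value drawn from a zero-mean Gaussian.
    rng = rng or np.random.default_rng()
    return rng.normal(0.0, noise_scale(cluster_size, domain_sensitivity, feature_sensitivity))

# Example: noise for a record in a dense cluster of a critical health domain.
print(random_noise(cluster_size=120, domain_sensitivity="critical", feature_sensitivity="high"))

In this sketch a larger cluster lowers the noise scale while a more sensitive domain or feature raises it, consistent with the privacy requirements described above.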
The one or more privacy protection devices 101 map each quasi identifier in each cluster with the calculated centroid of the respective cluster and the randomly generated value, and each non-quasi sensitive identifier with the corresponding mapped value and the randomly generated value associated with the respective identifier. Once the sensitive features are mapped, the one or more privacy protection devices 101 may generate uniformly sampled user privacy protected records using each of the mapped quasi identifiers, the mapped non-quasi sensitive identifiers and the non-sensitive features. The user privacy protected records are uniformly sampled with the mapped values so that the total number of records remains the same. The user privacy protected records are uniformly sampled using a predefined aggregation technique such as the Bootstrap Aggregation (BT) technique. Thereafter, the uniformly sampled user privacy protected records are provided by each of the one or more privacy protection devices 101 to the corresponding one or more analytical models 106 to generate one or more data models. Fig.1B illustrates an exemplary environment for transmitting a generated data model to a server in accordance with some embodiments of the present disclosure. The one or more analytical models 106 may process the user privacy protected records using a neural network to generate the data model. The neural network may include a predefined architecture in terms of number of hidden layers, number of neurons in each layer, initial weights, predefined hyperparameters, activation functions and the like. Thus, each of the one or more analytical models 106 transmits the generated data model parameters to a server 115. In addition, each of the one or more analytical models 106 transmits architecture details, hyperparameters and activation functions of the neural network to the server 115 so that the data model can be reconstructed. The server 115 combines the data models received from each of the one or more analytical models to generate the final data model that is used for one of machine learning predictions and recommendations.
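Purely as an illustrative sketch of the mapping step described above, and not a definitive implementation, the records may be held as a numeric array in which the quasi identifier and non-quasi sensitive identifier columns are known. The column index lists and the noise_fn helper, which is assumed to take the cluster size and the feature index, are illustrative assumptions.

import numpy as np

def map_sensitive_features(records, quasi_cols, non_quasi_cols, cluster_labels, noise_fn):
    # Replace quasi identifiers with (cluster centroid + noise) and non-quasi
    # sensitive identifiers with (original value + noise).
    protected = records.astype(float)
    for label in np.unique(cluster_labels):
        members = np.where(cluster_labels == label)[0]
        centroid = records[members][:, quasi_cols].mean(axis=0)   # centroid of this cluster
        for i in members:
            for k, j in enumerate(quasi_cols):
                protected[i, j] = centroid[k] + noise_fn(len(members), j)
            for j in non_quasi_cols:
                protected[i, j] = records[i, j] + noise_fn(len(members), j)
    return protected

The non-sensitive feature columns are left untouched, and the resulting protected records can then be uniformly sampled before being supplied to the corresponding analytical model 106.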
Fig.2 illustrates a detailed block diagram of a privacy protection device in accordance with some embodiments of the present disclosure.
As shown in Fig.2, one or more privacy protection devices 101 may include data 200 and one or more modules 211 which are described herein in detail. In an embodiment, data 200 may be stored within the memory 111. The data 200 may include, for example, user data 201, sensitive data 203, non-sensitive data 205, privacy protected record 207 and other data 209.
The user data 201 may include the user records received from the one or more sources 105. The user records are associated with the plurality of users. The user records may include sensitive and non-sensitive information associated with each of the plurality of users. The user data 201 may be associated with any domain. For example, for emergency services and health insurance planning, user records from a smart city hospital may be collected and provided to the one or more privacy protection devices 101. The user records in such case may include, demographic level user health records (e.g. National Health Stack), live accident data, ambulance emergency records, health gadget monitoring (IoT Gateways) of users and the like.
Likewise, in another example, for analysing advertisements based on demographic user preferences, the user records from the one or more sources 105 may include, e-commerce location wise user purchases, current shopping trends which may be collected from IoT sensor deployment in malls, and the like.
The sensitive data 203 may include direct identifiers, quasi identifiers, and non-quasi sensitive identifiers. The direct identifiers enable identification of a user directly from the user records. For example, in a company, direct identifiers include customer name, customer ID, bank account number and the like.
The quasi identifiers are features which are not directly associated with users but which can be correlated with a user, such that they can be combined with other quasi-identifiers to identify the user. For example, considering the above example, identifiers such as age, gender, and address of the customer. The non-quasi sensitive identifiers are features which are not directly associated with users but with the domain/field, and can be used along with quasi identifiers to identify the user. For instance, considering the above example, identifiers such as new/old customer, item purchased by the customer and the like.
The non-sensitive data 205 may include the features of the user records which cannot be correlated to identify the user. For example, considering the above example, features such as amount paid by the customer, feedback for a purchased item, item details and the like.
The privacy protected record 207 may include the uniformly sampled user privacy protected records generated for the received user records.
The other data 209 may store data, including temporary data and temporary files, generated by one or more modules 211 for performing the various functions of the one or more privacy protection devices 101.
In an embodiment, the data 200 in the memory 111 are processed by the one or more modules 211 present within the memory 111 of the one or more privacy protection devices 101. In an embodiment, the one or more modules 211 may be implemented as dedicated units. As used herein, the term module refers to an application specific integrated circuit (ASIC), an electronic circuit, a field-programmable gate array (FPGA), a Programmable System-on-Chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality. In some implementations, the one or more modules 211 may be communicatively coupled to the processor 113 for performing one or more functions of the one or more privacy protection devices 101. The one or more modules 211, when configured with the functionality defined in the present disclosure, will result in novel hardware.
In one implementation, the one or more modules 211 may include, but are not limited to a communication module 213, a feature determination module 215, a clustering module 217, a value generation module 219, a mapping module 221 and a protected data providing module 223. The one or more modules 211 may also include other modules 225 to perform various miscellaneous functionalities of the one or more privacy protection devices 101. In an embodiment, the other modules 225 may include a centroid determination module which may calculate a centroid for each cluster of the plurality of clusters.
The communication module 213 may receive the user records from corresponding one or more sources 105. Further, the communication module 213 may transmit the uniformly sampled user privacy protected records to corresponding one or more analytical models 106 for processing and generating one or more data models. The communication module 213 transmits the received user records to the feature determination module 215.
The feature determination module 215 may determine features of the user records by processing the user records. The feature determination module 215 may determine the sensitive and non-sensitive features of the user records. In an embodiment, the feature determination module 215 may process the user records using any existing techniques to determine the sensitive and the non-sensitive features of the user records. The sensitive features of the user records may further be processed to determine the direct identifiers, the quasi identifiers, and the non-quasi sensitive identifiers. Upon determining the sensitive features, the feature determination module 215 may eliminate the direct identifiers from the user records.
The clustering module 217 may cluster the user records through the quasi identifiers and the non-quasi sensitive identifiers into the plurality of clusters. The clustering module 217 may perform the clustering using the Agglomerative Hierarchical Clustering (AHC) technique.
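A minimal sketch of this clustering step, assuming scikit-learn's AgglomerativeClustering is used and with a toy feature matrix chosen purely for illustration, could look as follows.

import numpy as np
from sklearn.cluster import AgglomerativeClustering

# Assumed toy matrix of sensitive feature values (quasi and non-quasi columns),
# one row per user record.
sensitive_features = np.array([
    [34, 1, 52], [36, 1, 50],            # two similar records
    [62, 0, 18], [65, 0, 20], [63, 0, 19],
    [22, 1, 95],                          # an outlier record
])

# Agglomerative Hierarchical Clustering into an assumed number of clusters.
ahc = AgglomerativeClustering(n_clusters=3, linkage="ward")
cluster_labels = ahc.fit_predict(sensitive_features)
print(cluster_labels)   # one cluster index per user record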
The value generation module 219 may generate a random noise value for each of the quasi identifiers and the non-quasi sensitive identifiers. In an embodiment, the randomly generated value is a differentially private noise value which is selected from a Gaussian distribution. The value generation module 219 may generate the randomly generated value based on the cluster size, the level of sensitivity of the repository domain and the degree of sensitivity of features in the user records. The cluster size may define the number of data points in each cluster. In an embodiment, a dense cluster may require a low noise scale while outliers require a high noise scale. The sensitivity of the repository domain may be defined as critical, high, medium or low. For instance, health domain records are critical records. The degree of sensitivity of features in the user records may be defined as critical, high, medium or low. For instance, location information is a critical feature. In an embodiment, the randomly generated value may be a numerical value.
The mapping module 221 may receive the centroid value for each cluster from the centroid determination module and the randomly generated value from the value generation module 219. Upon receiving this information, the mapping module 221 may map each quasi identifier in each cluster with the centroid of the respective cluster and the associated randomly generated value. For instance, consider that the centroid value for a cluster is “70” and the randomly generated value is “2.5”. In such a case, the value for the corresponding quasi identifier is mapped as “72.5”. Also, the mapping module 221 maps each non-quasi sensitive identifier with the corresponding mapped value and the corresponding randomly generated value. For instance, consider that the value for a non-quasi sensitive identifier such as “medicines purchased by customer” is “45” and the randomly generated value is “-3”. In such a case, the value for the non-quasi sensitive identifier is mapped as “42”.
The protected data providing module 223 generates the uniformly sampled user privacy protected records using the mapped quasi identifiers, the mapped non-quasi sensitive identifiers, and the non-sensitive features. The protected data providing module 223 may generate the uniformly sampled user privacy protected records by using a predefined aggregation technique such as the Bootstrap Aggregation (BT) technique. An example of the aggregation of user records is provided with reference to Fig.3 subsequently.
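A minimal sketch of the uniform sampling step, assuming the protected records are held in a list and NumPy's uniform random choice with replacement stands in for a full bootstrap aggregation library, is shown below.

import numpy as np

def uniform_bootstrap_sample(protected_records, seed=None):
    # Resample the protected records uniformly with replacement so the total
    # number of records stays the same; some records may repeat, others drop out.
    rng = np.random.default_rng(seed)
    n = len(protected_records)
    idx = rng.choice(n, size=n, replace=True)
    return [protected_records[i] for i in idx]

Sampling with replacement keeps the total record count unchanged while individual records may be repeated or omitted, as in the sampling example described with reference to Fig.3.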
Fig.3 shows an exemplary embodiment for demographic based data analytics in accordance with some embodiments of the present disclosure.
Fig.3 shows an exemplary representation for generating smart city advertisements based on demographic user preferences. In particular, the example relates to building a smart city data-driven application which publishes relevant regional advertisements based on machine learning model predictions computed from historical data and current trends in preferences of users in the region. As the user records are sensitive, the data in such a case is provided to the one or more privacy protection devices 101 for generating privacy protected data before being provided to the data driven applications. Consider that the user records accessed are historical data and current shopping records of users, and current trends from IoT sensors installed in departmental stores. The historical data and current shopping records may be accessed from grocery E-commerce repositories 301, medicine E-commerce repositories 303, travel bookings repositories 305 and the like. The current trends from departmental stores may be accessed through IoT gateways 307 for household items, medicine stocks and the like.
The user records received from each of the repositories may be processed to determine the sensitive features such as, direct identifiers, quasi identifiers and non-quasi sensitive identifiers, and non-sensitive features, associated sensitivity levels of features along with sensitivity levels of associated domains of the user records. For instance, consider the below determination of sensitive and non-sensitive features.
For medicine E-commerce repository:
1) Customer ID.
2) Customer Name.
3) Age.
4) Gender.
5) Address.
6) Credit/Debit Card.
7) New/Old Customer.
8) Medicine Purchased.
9) Amount of Medicine.
10) Date and Time.
11) Amount Paid.
12) EMI Service Opted.
13) Registered for Mediclaim Policy.
14) Claimed any E-commerce Vouchers.
15) Feedback of E-commerce service.
For grocery E-commerce repository:
1) Customer ID.
2) Customer Name.
3) Age.
4) Gender.
5) Address.
6) Credit/Debit Card.
7) Grocery Item 1.
8) Amount of Item 1.
9) Grocery Item 2.
10) Amount of Item 2.
11) Grocery Item 3.
12) Amount of Item 3.
13) Claimed any E-commerce Vouchers.
14) Feedback of E-commerce service.
For travel bookings repository: …
For medicine stocks from Departmental Stores: …
For household items from Departmental Stores: …
Thereafter, the one or more privacy protection devices 101 may eliminate the direct identifiers as mentioned above. Table 1 below shows an exemplary distribution of quasi identifiers, non-quasi sensitive identifiers and non-sensitive features as identified above.
QC QH QM NQH NQH NQC NSF NSF NSF NSF NSF NSF NSF NSF NSF
Table 1
where, Domain Sensitivity: Critical/High/Medium/Low;
QC/QH/QM/QL: Quasi Identifier with Critical/High/Medium/Low feature sensitivity;
NQC/NQH/NQM/NQL: Non-Quasi Sensitive Identifier with Critical/High/Medium/Low feature sensitivity;
NSF: Non-Sensitive Feature; and
each table row is a user record and each table column is a feature.
Further, the one or more privacy protection devices 101 may cluster the quasi identifiers and non-quasi sensitive identifiers into the plurality of clusters. For instance, in the current scenario, three clusters are identified, namely, row 1 and 2 corresponding to first cluster, row 3, 4 and 5 corresponding to second cluster and row 6 corresponding to third cluster.
Table 2 below shows the different clusters in different shades.
QC QH QM NQH NQH NQC NSF NSF NSF NSF NSF NSF NSF NSF NSF
V1,1 V1,2 V1,3 V1,4 V1,5 V1,6 V1,7 V1,8 V1,9 V1,10 V1,11 V1,12 … … V1, F
V2,1 V2,2 V2,3 V2,4 V2,5 V2,6 V2,7 V2,8 V2,9 V2,10 V2,11 V2,12 … … V2, F
V3,1 V3,2 V3,3 V3,4 V3,5 V3,6 V3,7 V3,8 V3,9 V3,10 V3,11 V3,12 … … V3, F
V4,1 V4,2 V4,3 V4,4 V4,5 V4,6 V4,7 V4,8 V4,9 V4,10 V4,11 V4,12 … … V4, F
V5,1 V5,2 V5,3 V5,4 V5,5 V5,6 V5,7 V5,8 V5,9 V5,10 V5,11 V5,12 … … V5, F
V6,1 V6,2 V6,3 V6,4 V6,5 V6,6 V6,7 V6,8 V6,9 V6,10 V6,11 V6,12 … … V6, F
Table 2
where, N: Number of sensitive features
M: Number of non-sensitive features
F: Total number of features (Sensitive + Non-Sensitive) = N + M
Va,b: Value of the b-th feature of the a-th user record.
Further, the centroid value is calculated for each cluster 1, 2 and 3 (for example, as the mean of the quasi identifier values of the records belonging to that cluster):
Cluster C1: C1,j = (V1,j + V2,j) / 2 ……………… (1)
Cluster C2: C2,j = (V3,j + V4,j + V5,j) / 3 ……………… (2)
Cluster C3: C3,j = V6,j ……………… (3)
where j indexes the quasi identifier features.
The one or more privacy protection devices 101 map each quasi identifier and non-quasi sensitive identifier as below:
For each quasi identifier:
Replace Vi,j with Ct,j + Ni,j;
where t is the cluster to which the i-th user record belongs and Ni,j is a randomly generated value selected from a Gaussian distribution.
In an embodiment, the Gaussian distribution scale = f (parameters of the selected differential privacy algorithm {epsilon ε, delta δ, differential diameter R} and privacy requirements {cluster size, domain sensitivity, feature sensitivity}). The higher the sensitivity level, the higher the noise scale.
For each non-quasi sensitive identifier:
Replace Vi,j with Vi,j + Ni,j;
where Ni,j is a randomly generated value selected from a Gaussian distribution.
In an embodiment, the Gaussian distribution scale = f (parameters of the selected differential privacy algorithm {epsilon ε, delta δ, differential diameter R} and privacy requirements {cluster size, domain sensitivity, feature sensitivity}). The higher the sensitivity level, the higher the noise scale.
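For illustration only, the two replacement rules above can be applied to the six records of this example (records 1 and 2 in cluster 1, records 3 to 5 in cluster 2, and record 6 in cluster 3) as in the following sketch. The toy feature values, the fixed noise scale and the helper names are assumptions and are not part of the example itself.

import numpy as np

rng = np.random.default_rng(0)

# V: assumed toy values for six sensitive-feature columns (3 quasi, 3 non-quasi)
# of the six user records of Table 2.
V = np.array([
    [70.0, 1.0, 12.0, 45.0, 3.0, 1.0],
    [68.0, 1.0, 11.0, 40.0, 2.0, 0.0],
    [30.0, 0.0,  5.0, 20.0, 1.0, 1.0],
    [32.0, 0.0,  6.0, 22.0, 1.0, 0.0],
    [29.0, 0.0,  4.0, 25.0, 2.0, 1.0],
    [55.0, 1.0,  9.0, 60.0, 4.0, 0.0],
])
quasi_cols, non_quasi_cols = [0, 1, 2], [3, 4, 5]
clusters = {1: [0, 1], 2: [2, 3, 4], 3: [5]}   # cluster assignment from this example

def gaussian_noise(scale=1.0):
    # Placeholder for the scale function f(epsilon, delta, R, privacy requirements).
    return rng.normal(0.0, scale)

protected = V.copy()
for members in clusters.values():
    centroid = V[members][:, quasi_cols].mean(axis=0)          # Ct,j for this cluster
    for i in members:
        for k, j in enumerate(quasi_cols):
            protected[i, j] = centroid[k] + gaussian_noise()   # Vi,j -> Ct,j + Ni,j
        for j in non_quasi_cols:
            protected[i, j] = V[i, j] + gaussian_noise()       # Vi,j -> Vi,j + Ni,j
print(protected)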
Table 3 below shows a representation of the user records after mapping.
QC QH QM NQH NQH NQC NSF NSF NSF NSF NSF NSF NSF NSF NSF
C1,1+N1,1 C1,2+N1,2 C1,3+N1,3 V1,4+N1,4 V1,5+N1,5 V1,6+N1,6 V1,7 V1,8 V1,9 V1,10 V1,11 V1,12 … … V1, F
C1,1+N2,1 C1,2+N2,2 C1,3+N2,3 V2,4+N2,4 V2,5+N2,5 V2,6+N2,6 V2,7 V2,8 V2,9 V2,10 V2,11 V2,12 … … V2, F
C2,1+N3,1 C2,2+N3,2 C2,3+N3,3 V3,4+N3,4 V3,5+N3,5 V3,6+N3,6 V3,7 V3,8 V3,9 V3,10 V3,11 V3,12 … … V3, F
C2,1+N4,1 C2,2+N4,2 C2,3+N4,3 V4,4+N4,4 V4,5+N4,5 V4,6+N4,6 V4,7 V4,8 V4,9 V4,10 V4,11 V4,12 … … V4, F
C2,1+N5,1 C2,2+N5,2 C2,3+N5,3 V5,4+N5,4 V5,5+N5,5 V5,6+N5,6 V5,7 V5,8 V5,9 V5,10 V5,11 V5,12 … … V5, F
C3,1+N6,1 C3,2+N6,2 C3,3+N6,3 V6,4+N6,4 V6,5+N6,5 V6,6+N6,6 V6,7 V6,8 V6,9 V6,10 V6,11 V6,12 … … V6, F
Table 3
Thereafter, the user records with the added randomly generated values are uniformly sampled using the bootstrap aggregation technique and transmitted to the one or more analytical models 106 for generating models. For instance, the user records before and after sampling are represented as below.
Before: …
After: …, such that U4 is not selected and U7 is selected two times.
Thus, the one or more privacy protection devices 101 provide the uniformly sampled user privacy protected records to the one or more analytical models 106, which may generate the data models and transmit them to the server 115. The server 115 may receive the data models from each of the one or more analytical models 106 to generate the final model, which helps to provide suggestions for advertisements at a regional level without disclosing details of individual users, avoiding reidentification and thus maintaining user privacy.
Fig.4A illustrates a flowchart showing a method for generating privacy protected data for data analytics systems in accordance with some embodiments of present disclosure.
As illustrated in Fig.4A, the method 400 includes one or more blocks for generating privacy protected data for data analytics systems. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 401, the sensitive and the non-sensitive features from the user records are determined by the feature determination module 215. The user records are received for the plurality of users from the one or more sources 105.
At block 403, the user records through the quasi identifiers and non-quasi sensitive identifiers are clustered by the clustering module 217 into the plurality of clusters using the predefined clustering technique such as, Agglomerative Hierarchical Clustering (AHC).
At block 405, each quasi identifier in each cluster of the plurality of clusters is mapped by the mapping module 221 with the calculated centroid of respective cluster and the randomly generated value. Also, each non-quasi sensitive identifier is mapped with corresponding mapped value and the randomly generated value. The randomly generated value is determined based on the cluster size, the level of sensitivity of repository domain and the degree of sensitivity of features in the user records. Subsequently, each of the mapped quasi identifiers, the mapped non-quasi sensitive identifiers and the non-sensitive features are used to generate user privacy protected records.
At block 407, the uniformly sampled user privacy protected records are generated and provided by the protected data providing module 223 to corresponding one or more analytical models 106 for generating one or more data models. The uniformly sampled user privacy protected records are generated by using the predefined aggregation technique.
Fig.4B illustrates a flowchart showing a method for transmitting generated data model to a server in accordance with some embodiments of present disclosure.
At block 409, the uniformly sampled user privacy protected records are received by the one or more analytical models 106 from corresponding one or more privacy protection devices 101. The uniformly sampled user privacy protected records are generated from received user records using steps as disclosed above in Fig.4A.
At block 411, the uniformly sampled user privacy protected records are processed by the one or more analytical models 106 to generate the data model.
At block 413, each of the generated data models are transmitted by the one or more analytical models 106 to the server 115. The server 115 combines the data models received from each of the one or more analytical models 106 to generate the final data model that is used for one of machine learning predictions and recommendations.
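As a non-limiting sketch of one way the server 115 could combine the received data models, assuming every analytical model 106 uses the same predefined architecture so that the transmitted weight arrays have matching shapes, layer-wise averaging of the parameters may be used; the disclosure does not mandate a particular combination rule.

import numpy as np

def combine_data_models(model_weight_sets):
    # Average layer-wise weights received from each analytical model to build
    # the final data model (simple federated averaging).
    n_models = len(model_weight_sets)
    n_layers = len(model_weight_sets[0])
    return [
        sum(weights[layer] for weights in model_weight_sets) / n_models
        for layer in range(n_layers)
    ]

# Usage: each element is the list of layer weight arrays from one analytical model.
final_weights = combine_data_models([
    [np.ones((4, 3)), np.zeros(3)],       # parameters from analytical model 106-1
    [np.full((4, 3), 3.0), np.ones(3)],   # parameters from analytical model 106-2
])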
Computing System
Fig.5 illustrates a block diagram of an exemplary computer system 500 for implementing embodiments consistent with the present disclosure. In an embodiment, the computer system 500 may be one or more privacy protection devices 101. The computer system 500 may include a central processing unit (“CPU” or “processor”) 502. The processor 502 may include at least one data processor for generating privacy protected data for data analytics systems. The processor 502 may include specialized processing units such as, integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
The processor 502 may be disposed in communication with one or more input/output (I/O) devices (not shown) via I/O interface 501. The I/O interface 501 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.n /b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
Using the I/O interface 501, the computer system 500 may communicate with one or more I/O devices such as input devices 512 and output devices 513. For example, the input devices 512 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, stylus, scanner, storage device, transceiver, video device/source, etc. The output devices 513 may be a printer, fax machine, video display (e.g., Cathode Ray Tube (CRT), Liquid Crystal Display (LCD), Light-Emitting Diode (LED), plasma, Plasma Display Panel (PDP), Organic Light-Emitting Diode display (OLED) or the like), audio speaker, etc.
In some embodiments, the processor 502 may be disposed in communication with a peer network via a network interface 503. The network interface 503 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The computer system 500 may communicate with corresponding one or more sources 514 and one or more analytical models 515 through the peer to peer network.
In some embodiments, the processor 502 may be disposed in communication with a memory 505 (e.g., RAM, ROM, etc. not shown in fig.5) via a storage interface 504. The storage interface 504 may connect to memory 505 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as, serial advanced technology attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), fiber channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
The memory 505 may store a collection of program or database components, including, without limitation, user interface 506, an operating system 507 etc. In some embodiments, computer system 500 may store user/application data, such as, the data, variables, records, etc., as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
The operating system 507 may facilitate resource management and operation of the computer system 500. Examples of operating systems include, without limitation, APPLE MACINTOSH® OS X, UNIX®, UNIX-like system distributions (e.g., BERKELEY SOFTWARE DISTRIBUTION™ (BSD), FREEBSD™, NETBSD™, OPENBSD™, etc.), LINUX DISTRIBUTIONS™ (e.g., RED HAT™, UBUNTU™, KUBUNTU™, etc.), IBM™ OS/2, MICROSOFT™ WINDOWS™ (XP™, VISTA™/7/8, 10 etc.), APPLE® IOS™, GOOGLE® ANDROID™, BLACKBERRY® OS, or the like.
In some embodiments, the computer system 500 may implement a web browser 508 stored program component. The web browser 508 may be a hypertext viewing application, for example MICROSOFT® INTERNET EXPLORER™, GOOGLE® CHROME™, MOZILLA® FIREFOX™, APPLE® SAFARI™, etc. Secure web browsing may be provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS), etc. The web browser 508 may utilize facilities such as AJAX™, DHTML™, ADOBE® FLASH™, JAVASCRIPT™, JAVA™, Application Programming Interfaces (APIs), etc. In some embodiments, the computer system 500 may implement a mail server stored program component. The mail server may be an Internet mail server such as Microsoft Exchange, or the like. The mail server may utilize facilities such as ASP™, ACTIVEX™, ANSI™ C++/C#, MICROSOFT® .NET™, CGI SCRIPTS™, JAVA™, JAVASCRIPT™, PERL™, PHP™, PYTHON™, WEBOBJECTS™, etc. The mail server may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. In some embodiments, the computer system 500 may implement a mail client stored program component. The mail client may be a mail viewing application, such as APPLE® MAIL™, MICROSOFT® ENTOURAGE™, MICROSOFT® OUTLOOK™, MOZILLA® THUNDERBIRD™, etc.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
An embodiment of the present disclosure improves performance of data driven applications, for instance, increases prediction/recommendation accuracy, while preserving user privacy, with high data security and low bandwidth requirements.
An embodiment of the present disclosure reduces bandwidth requirement, since only the data models are provided to the server for analytics, and not entire user records.
An embodiment of the present disclosure maintains user privacy by transforming data using differentially private noise additions.
In an embodiment, amount of randomly generated noise which is added to user records is reduced based on privacy requirements, leading to higher accuracy of machine learning models and better performance of data driven applications.
An embodiment of the present disclosure pushes computations towards data sources, thus increasing data security and reducing bandwidth requirements.
The described operations may be implemented as a method, system or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. The described operations may be implemented as code maintained in a “non-transitory computer readable medium”, where a processor may read and execute the code from the computer readable medium. The processor is at least one of a microprocessor and a processor capable of processing and executing the queries. A non-transitory computer readable medium may include media such as magnetic storage medium (e.g., hard disk drives, floppy disks, tape, etc.), optical storage (CD-ROMs, DVDs, optical disks, etc.), volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, Flash Memory, firmware, programmable logic, etc.), etc. Further, non-transitory computer-readable media include all computer-readable media except for transitory, propagating signals. The code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, Programmable Gate Array (PGA), Application Specific Integrated Circuit (ASIC), etc.).
Still further, the code implementing the described operations may be implemented in “transmission signals”, where transmission signals may propagate through space or through a transmission media, such as, an optical fibre, copper wire, etc. The transmission signals in which the code or logic is encoded may further include a wireless signal, satellite transmission, radio waves, infrared signals, Bluetooth, etc. The transmission signals in which the code or logic is encoded is capable of being transmitted by a transmitting station and received by a receiving station, where the code or logic encoded in the transmission signal may be decoded and stored in hardware or a non-transitory computer readable medium at the receiving and transmitting stations or devices. An “article of manufacture” includes non-transitory computer readable medium, hardware logic, and/or transmission signals in which code may be implemented. A device in which the code implementing the described embodiments of operations is encoded may include a computer readable medium or hardware logic. Of course, those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope of the invention, and that the article of manufacture may include suitable information bearing medium known in the art.
The terms “an embodiment”, “embodiment”, “embodiments”, “the embodiment”, “the embodiments”, “one or more embodiments”, “some embodiments”, and “one embodiment” mean “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise.
The terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
The terms “a”, “an” and “the” mean “one or more”, unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
The illustrated operations of Fig.5 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified, or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based here on. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
REFERRAL NUMERALS:
Reference number Description
100 Environment
101 One or more privacy protection devices
105 One or more sources
106 One or more analytical models
109 I/O interface
111 Memory
113 Processor
115 Server
200 Data
201 User data
203 Sensitive data
205 Non-sensitive data
207 Privacy protected record
209 Other data
211 Modules
213 Communication module
215 Feature determination module
217 Clustering module
219 Value generation module
221 Mapping module
223 Protected data providing module
225 Other modules
500 Computer system
501 I/O interface
502 Processor
503 Network interface
504 Storage interface
505 Memory
506 User interface
507 Operating system
508 Web browser
512 Input devices
513 Output devices
514 One or more sources
515 One or more analytical devices
| # | Name | Date |
|---|---|---|
| 1 | 202041037547-Correspondence to notify the Controller [11-11-2024(online)].pdf | 2024-11-11 |
| 2 | 202041037547-US(14)-HearingNotice-(HearingDate-19-11-2024).pdf | 2024-11-05 |
| 3 | 202041037547-CLAIMS [11-07-2022(online)].pdf | 2022-07-11 |
| 4 | 202041037547-COMPLETE SPECIFICATION [11-07-2022(online)].pdf | 2022-07-11 |
| 5 | 202041037547-CORRESPONDENCE [11-07-2022(online)].pdf | 2022-07-11 |
| 6 | 202041037547-Correspondence_Form-1 And POA_11-07-2022.pdf | 2022-07-11 |
| 7 | 202041037547-DRAWING [11-07-2022(online)].pdf | 2022-07-11 |
| 8 | 202041037547-FER_SER_REPLY [11-07-2022(online)].pdf | 2022-07-11 |
| 9 | 202041037547-FORM 3 [08-07-2022(online)].pdf | 2022-07-08 |
| 10 | 202041037547-PETITION UNDER RULE 137 [08-07-2022(online)].pdf | 2022-07-08 |
| 11 | 202041037547-RELEVANT DOCUMENTS [08-07-2022(online)].pdf | 2022-07-08 |
| 12 | 202041037547-FER.pdf | 2022-04-12 |
| 13 | 202041037547-Abstract.jpg | 2021-10-18 |
| 14 | 202041037547-CERTIFIED COPIES TRANSMISSION TO IB [09-06-2021(online)].pdf | 2021-06-09 |
| 15 | 202041037547-Covering Letter [09-06-2021(online)].pdf | 2021-06-09 |
| 16 | 202041037547-Form 1 (Submitted on date of filing) [09-06-2021(online)].pdf | 2021-06-09 |
| 17 | 202041037547-Power of Attorney [09-06-2021(online)].pdf | 2021-06-09 |
| 18 | 202041037547-Request Letter-Correspondence [09-06-2021(online)].pdf | 2021-06-09 |
| 19 | 202041037547-Proof of Right [07-10-2020(online)].pdf | 2020-10-07 |
| 20 | 202041037547-COMPLETE SPECIFICATION [31-08-2020(online)].pdf | 2020-08-31 |
| 21 | 202041037547-DECLARATION OF INVENTORSHIP (FORM 5) [31-08-2020(online)].pdf | 2020-08-31 |
| 22 | 202041037547-DRAWINGS [31-08-2020(online)].pdf | 2020-08-31 |
| 23 | 202041037547-FORM 1 [31-08-2020(online)].pdf | 2020-08-31 |
| 24 | 202041037547-FORM 18 [31-08-2020(online)].pdf | 2020-08-31 |
| 25 | 202041037547-POWER OF AUTHORITY [31-08-2020(online)].pdf | 2020-08-31 |
| 26 | 202041037547-REQUEST FOR EXAMINATION (FORM-18) [31-08-2020(online)].pdf | 2020-08-31 |
| 27 | 202041037547-STATEMENT OF UNDERTAKING (FORM 3) [31-08-2020(online)].pdf | 2020-08-31 |
| 28 | 202041037547-SearchStrategyE_01-04-2022.pdf | |