
A System And Method For Implementing A Privacy Setting Providing A Defense Against Identifying A User

Abstract: Disclosed is a method and system for implementing a privacy setting providing a defense against identifying a user in real-time, wherein the defense is against a particular type of identification usually performed by an analysis of features of data entered by the user through a user interface. The system comprises a receiving module, an extraction module, and a modification module. The extraction module extracts a time related parameter with respect to one or more activities performed by the user via the user interface. The modification module is configured to modify the time using a modification technique in order to change the time related parameter to a modified time related parameter, wherein the modified time related parameter is used for performing the analysis for identifying the user, thereby providing a defense against identifying the user.


Patent Information

Filing Date: 26 June 2013
Publication Number: 27/2015
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 2021-05-18

Applicants

TATA CONSULTANCY SERVICES LIMITED
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT, MUMBAI 400021, MAHARASHTRA, INDIA

Inventors

1. SRINIVASAN, IYENGAR VENKATACHARY
TATA RESEARCH DEVELOPMENT & DESIGN CENTRE, 54/B, HADAPSAR INDUSTRIAL ESTATE, HADAPSAR, PUNE - 411013, MAHARASHTRA, INDIA
2. NAIR, VIKRAM
TATA RESEARCH DEVELOPMENT & DESIGN CENTRE, 54/B, HADAPSAR INDUSTRIAL ESTATE, HADAPSAR, PUNE - 411013, MAHARASHTRA, INDIA
3. LODHA, SACHIN
TATA RESEARCH DEVELOPMENT & DESIGN CENTRE, 54/B, HADAPSAR INDUSTRIAL ESTATE, HADAPSAR, PUNE - 411013, MAHARASHTRA, INDIA
4. VIDHANI, KUMAR MANSUKHLAL
TATA RESEARCH DEVELOPMENT & DESIGN CENTRE, 54/B, HADAPSAR INDUSTRIAL ESTATE, HADAPSAR, PUNE - 411013, MAHARASHTRA, INDIA

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
A SYSTEM AND METHOD FOR IMPLEMENTING A PRIVACY SETTING PROVIDING A DEFENSE AGAINST IDENTIFYING A USER
Applicant
TATA Consultancy Services Limited, a Company Incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021.
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is
to be performed.

TECHNICAL FIELD
[001] The present subject matter described herein, in general, relates to systems and
methods implementing privacy settings, and more particularly to a system and method for implementing the privacy setting providing a defense against identifying a user.
BACKGROUND
[002] Keystroke dynamics is increasingly used for identification of an individual by
analyzing a keystroke pattern generated by the individual while typing on a keyboard. Alternatively, patterns generated through mouse clicks, gestures or touch may also be used for the same purpose.
[003] Identification of the individual by analysis of keystroke dynamics may lead to
a breach of the personal privacy of the individual. Many websites currently use keystroke dynamics to identify the individual in order to target them with advertisements at various instances while accessing the internet. Further, various key logging software used by websites may track the personal browsing history of the individual while analyzing the information entered by the user. Also, personal information of the individual may be accessed by such websites, thereby raising a serious privacy risk.
[004] Various methods exist for securing the data entered by the individual through
any user interface, such as a keyboard, mouse clicks, gestures in the case of gesture recognition devices, or a touch in the case of devices with a touch pad, in order to prevent malicious keyboard login attempts or to prevent a key logger from hacking the data entered. The existing methods mask the data entered by the individual by adding random data to the data entered. However, the existing methods fail to provide a fine tradeoff between privacy and utility. Masking the data by generating random data which is added to the data entered decreases the utility of the application to a large extent.

SUMMARY
[005] This summary is provided to introduce aspects related to systems and methods
for implementing a privacy setting providing a defense against identifying a user and the aspects are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a system for implementing a privacy setting providing
a defense against identifying a user in real-time is disclosed. The defense is against a particular type of identification usually performed by an analysis of features of data entered by the user through a user interface. The system comprises of a processor and a memory. The memory is coupled to the processor. The processor is capable of executing a plurality of modules stored in the memory. The plurality of modules comprise of a receiving module configured to receive the data entered through the user interface by the user and an extraction module configured to extract a time related parameter with respect to one or more activities performed by the user via the user interface. The time related parameter is usually the feature used for performing the analysis for identifying the user, and is computed by using time associated with one or more activities. The plurality of modules further comprises of a modification module configured to modify the time using a modification technique in order to change the time related parameter to a modified time related parameter, wherein the modified time related parameter is used for performing the analysis for identifying the user, thereby providing a defense against identifying the user.
[007] In one implementation, a method for implementing a privacy setting providing
a defense against identifying a user in real-time is disclosed. The defense is against a particular type of identification usually performed by an analysis of features of data entered by the user through a user interface. The method comprises of receiving the data entered through the user interface by the user and extracting a time related parameter with respect to one or more activities performed by the user via the user interface. The time related parameter is usually the feature used for performing the analysis for identifying the user, and is computed by using time associated with one or more activities. The method further comprises of modifying the time using a modification technique in order to change the time related

parameter to a modified time related parameter, wherein the modified time related parameter is used for performing the analysis for identifying the user, thereby providing a defense against identifying the user. The receiving, the extracting and the modifying are performed by a processor.
[008] In one implementation a computer program product having embodied thereon a
computer program for implementing a privacy setting providing a defense against identifying a user in real-time is disclosed. The defense is against a particular type of identification usually performed by an analysis of features of data entered by the user through a user interface. The computer program product comprises of a program code for receiving the data entered through the user interface by the user and a program code for extracting a time related parameter with respect to one or more activities performed by the user via the user interface. The time related parameter is usually the feature used for performing the analysis for identifying the user, and is computed by using time associated with one or more activities. The computer program code further comprises of a program code for modifying the time using a modification technique in order to change the time related parameter to a modified time related parameter, wherein the modified time related parameter is used for performing the analysis for identifying the user, thereby providing a defense against identifying the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0010] Figures 1(a) and 1(b) illustrate an implementation (on a server and without
a server) of a system for implementing a privacy setting providing a defense against identifying a user in real-time, in accordance with an embodiment of the present subject matter.
[0011] Figure 2 illustrates the system, in accordance with an embodiment of the
present subject matter.

[0012] Figure 3 in an exemplary embodiment of the present subject matter illustrates
the features used to identify people by recording the time for one or more activities performed through the keyboard interface.
[0013] Figure 4 in an embodiment of the present subject matter illustrates the
implementation of the system using a non-server based method.
[0014] Figure 5 in an embodiment of the present subject matter illustrates the
implementation of the system using a server based method.
[0015] Figure 6 illustrates a method for implementing a privacy setting providing a
defense against identifying a user in real-time, in accordance with another embodiment of the present subject matter.
[0016] Figure 7 illustrates a pattern of analysis performed on the data entered by the
user, in accordance with an exemplary embodiment of the present subject matter.
[0017] Figure 8 illustrates a pattern of analysis performed on the data entered by the
user after uniform noise addition, in accordance with an exemplary embodiment of the present subject matter.
[0018] Figure 9 illustrates quantization techniques, in accordance with an exemplary
embodiment of the present subject matter.
DETAILED DESCRIPTION
[0019] Systems and methods for implementing a privacy setting providing a defense
against identifying a user in real-time are described. In order to implement the privacy setting, at first, the data entered by the user through a user interface is received by the system. Further, time for activities such as press-to-press, press-to-release, release-to-press and release-to-release through a keyboard, a mouse click, or gestures in case of gesture recognition devices or a touch in case of devices with touch pad, is used to extract a time related parameter. The time related parameter may be a time difference between the activities or quantized values of the time for the activities performed through the user interface.
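The four timing activities named above can be sketched in a few lines of code. The function below is illustrative, not part of the specification; it assumes an ordered stream of (key, event_type, timestamp_ms) tuples as a hypothetical input format.

```python
# Illustrative sketch: extracting the four timing intervals described
# above from an ordered stream of (key, event_type, timestamp_ms) tuples.
# All names and the input format are assumptions for this example.

def timing_features(events):
    """Return press-to-press, press-to-release, release-to-press and
    release-to-release intervals (in ms) from an ordered event stream."""
    presses = [t for _, kind, t in events if kind == "press"]
    releases = [t for _, kind, t in events if kind == "release"]
    return {
        "press_to_press": [b - a for a, b in zip(presses, presses[1:])],
        "press_to_release": [r - p for p, r in zip(presses, releases)],
        "release_to_press": [p - r for r, p in zip(releases, presses[1:])],
        "release_to_release": [b - a for a, b in zip(releases, releases[1:])],
    }
```

Any of these interval lists, or quantized values derived from them, may then serve as the time related parameter.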
[0020] Subsequent to the extraction of the time related parameter, the time associated
with one or more activities performed through the user interface while receiving the data, is

modified by implementing various modification techniques. The modification techniques include noise addition from a distribution such as a uniform, Laplace or Gaussian distribution, or quantization from a group of values, in order to disguise the data entered through the user interface 204.
[0021] While aspects of described system and method for implementing a privacy
setting providing a defense against identifying a user in real-time may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0022] Referring now to Figure 1(a) and 1(b), an implementation 100 (on server and
without server) of a system 102 for implementing a privacy setting providing a defense against identifying a user in real-time is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 provides for implementing a privacy setting providing a defense against identifying a user in real-time, wherein the defense is against a particular type of identification usually performed by an analysis of features of data entered by the user through a user interface 204. The system 102 further extracts a time related parameter by using time associated with one or more activities performed through the user interface 204 while receiving the data. Further, the system 102 modifies the time associated with one or more activities performed through the user interface 204 while receiving the data by implementing a modification technique in order to change the time related parameter to a modified time related parameter.
[0023] Although, the present subject matter is explained considering that the system
102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. In case of non-server implementation, the instance of the system 102 will be running in the user machine itself. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2... 104-N, collectively referred to as user 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital

assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106.
[0024] In one implementation, the network 106 may be a wireless network, a wired
network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0025] Referring now to Figure 2, the system 102 is illustrated in accordance with an
embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, a user interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[0026] The user interface 204 may include a variety of software and hardware
interfaces, for example, a web interface, a graphical user interface, and the like. The user interface 204 may allow the system 102 to interact with a user directly or through the user devices 104. Further, the user interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The user interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The user interface 204 may include one or more ports for connecting a number of devices to one another or to another server.

[0027] The memory 206 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0028] The modules 208 include routines, programs, objects, components, data
structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a receiving module 212, an extraction module 214, a modification module 216, a data suggestion module 218, a feedback module 226, a selection module 228 and other modules 220. The other modules 220 may include programs or coded instructions that supplement applications and functions of the system 102.
[0029] The data 210, amongst other things, serves as a repository for storing data
processed, received, and generated by one or more of the modules 208. The data 210 may also include a system database 222, and other data 224. The other data 224 may include data generated as a result of the execution of one or more modules in the other modules 220.
[0030] In one implementation, at first, a user may use the user device 104 to access
the system 102 via the user interface 204. The user may register themselves using the user interface 204 in order to use the system 102. The working of the system 102 is explained in detail in Figures 3, 4 and 5 below. The system 102 may be used for implementing a privacy setting providing a defense against identifying a user in real-time, wherein the defense is against a particular type of identification usually performed by an analysis of features of data (for example, timing information extracted from data received) entered by the user through a user interface 204. In order to implement a privacy setting providing a defense against identifying a user in real-time, the system 102, at first, receives the data entered through the user interface 204 by the user. Specifically, in the present implementation, the data is received by the receiving module 212. The data received by the receiving module 212 may be the data generated when the input given through the user interface 204 is in the form of keyboard strokes, mouse clicks, gestures or touch.

[0031] Further, the system 102 comprises of the extraction module 214 configured to
extract time for one or more activities performed through the user interface 204 while receiving the data. The one or more activities further comprise press-to-press activity, release-to-release activity, release-to-press activity and press-to-release activity, and may relate to keyboard strokes, mouse clicks or gestures in the case of gesture recognition devices, or a touch in the case of devices with a touch pad. Further, the extraction module 214 is configured to extract a time related parameter with respect to one or more activities performed by the user via the user interface 204, wherein the time related parameter is usually the feature used for performing the analysis for identifying the user, and wherein the time related parameter is computed by using time associated with one or more activities.
[0032] Further, the system 102 comprises of the modification module 216 configured
to modify the time associated with activities performed through the user interface 204 while receiving the data, by using a modification technique in order to change the time related parameter to a modified time related parameter. The modification technique is selected from a group of noise addition or quantization techniques, or a combination thereof. The modified time related parameter is used for performing the analysis for identifying the user, thereby providing a defense against identifying the user.
[0033] Referring to figure 3, in an exemplary embodiment of the system 102, specific
to the keyboard and mouse press or release scenario, Kdown is the activity of a key press through a keyboard and Kup is the activity of a key release through the keyboard. Further, the time for the activities of key press and key release is recorded as KFT and KHT, wherein KFT is the Key Flight Time and KHT is the Key Hold Time. The time related parameter KPP (Key Press-Press, a di-graph feature) is calculated as the time difference between two consecutive key presses, or equivalently as the addition of the Key Flight Time (KFT) and the Key Hold Time (KHT). Further, the time related parameter (information extracted by using time associated with the activities) may be analyzed using various commonly known statistical methods, machine learning algorithms such as Naive Bayes and Support Vector Machine (SVM) algorithms, neural networks such as the Artificial Neural Network (ANN), and Markov models in order to accurately identify the user.
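The relation between KHT, KFT and KPP described above can be checked with a few lines of illustrative arithmetic; the timestamps below are made up, and only the names KHT, KFT and KPP come from the text.

```python
def digraph_times(press1_ms, release1_ms, press2_ms):
    """Compute KHT, KFT and KPP for two consecutive key presses."""
    kht = release1_ms - press1_ms   # Key Hold Time: first key held down
    kft = press2_ms - release1_ms   # Key Flight Time: release to next press
    kpp = kht + kft                 # equals the press-to-press difference
    return kht, kft, kpp
```

For example, a key pressed at 0 ms, released at 90 ms, followed by a press at 150 ms gives KHT = 90 ms, KFT = 60 ms, and KPP = 150 ms, the gap between the two presses.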

[0034] In one implementation, the noise addition techniques used for modifying the
time recorded for one or more activities may include, but are not limited to, Gaussian noise addition, Laplace noise addition and uniform noise addition. By way of a specific example, any one of the noise addition techniques may be used to add noise of a few milliseconds to the time extracted for which the key is pressed or the time for which the key is released, thereby modifying the time recorded for the activity. By way of a specific example, figures 7, 8 and 9 illustrate a histogram of a keystroke pattern, wherein the keystroke corresponds to the hold time of a key. The data set consists of keystroke-timing information of 20,400 samples from 51 subjects (typists), each typing a password (.tie5Roanl) 400 times. The x-axis represents the time in seconds and the y-axis represents the number of samples. Figure 7 illustrates a pattern of analysis performed on the data entered by the user, i.e. the histogram of the keystroke pattern of the hold time of the key without any modification for one specific character in the password ".tie5Roanl". Figure 8 illustrates the pattern of analysis performed on the data entered by the user after modification, i.e. after uniform noise addition.
[0035] In another implementation, figure 9 illustrates the keystroke patterns after
modification using two quantization techniques. The techniques used for quantization include fixed width quantization (as shown in figure 9(a)) and fixed area quantization (as shown in figure 9(b)). In one embodiment, at the time of implementing fixed width quantization, the technique of quantization may be uniform such that the time interval between quantization values is constant. In another embodiment, when implementing fixed area quantization, the technique of quantization may be such that each quantized value is represented by an equal number of users. The time extracted for the activity may be modified using the quantization technique, in which actual values of the time recorded for one or more activities are constrained to a smaller set of values. By way of a specific example, consider that the time recorded for one or more activities lies in a range of 0 - 600 milliseconds (ms). For every activity performed, only one of the values from a set of values, e.g. four values (450 ms, 350 ms, 300 ms, 150 ms), may be selected. The value of time greater than or equal to the current time is selected from the set. With the above mentioned set of values, when the time for which the key is pressed is 153 ms, the modification module 216 will modify it to 300 ms.
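This quantization step can be sketched as follows, using the example set of allowed values from the text; the function name and signature are illustrative.

```python
def quantize(t_ms, allowed=(150, 300, 350, 450)):
    """Constrain a recorded time to the smallest allowed value that is
    greater than or equal to it; a time beyond the largest allowed value
    is left unchanged (per the rule described above)."""
    candidates = [v for v in allowed if v >= t_ms]
    return min(candidates) if candidates else t_ms
```

With this set, a 153 ms key press is reported as 300 ms, matching the worked example in the text, and every recorded time collapses onto one of only four values.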

[0036] In an exemplary embodiment, consider that the values the features may take
belong to the set of real numbers R. Let one such value for the feature be x_old, and let x_new be the new value for the feature after implementing the quantization technique. In the case of implementation of the uniformly distributed quantization, the quantization rule may be as follows:

x_new = α_min + ⌈(x_old − α_min) / w⌉ · w

where α_min is the least value allowed for the feature under consideration and w is the constant time interval between quantization values.
In the case of generic partition quantization, the quantization rule may be as follows:
Let Q be the set of values allowed for the feature under consideration. Then we have
Q = {x_1, x_2, ..., x_k}
x_new = x_old, if x_old > sup(Q)
x_new = inf{x ∈ Q : x ≥ x_old}, if x_old ≤ sup(Q)
In the case of implementation of the technique for noise addition, the noise addition rule may be as follows:

x_new = x_old + |φ(·, ·, ...)|

where |φ(·, ·, ...)| is the positive noise added based on some set of parameters. For example, in the case of Gaussian noise, there may be two parameters: the mean (μ) and the standard deviation (σ).
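A minimal sketch of this noise-addition rule, assuming Gaussian noise; the default values of μ and σ below are illustrative, not from the specification.

```python
import random

def add_positive_noise(t_ms, mu=5.0, sigma=2.0, rng=None):
    """Apply x_new = x_old + |phi(mu, sigma)| with Gaussian phi.
    The absolute value keeps the added noise positive, so the
    modified time never precedes the recorded time."""
    rng = rng or random.Random()
    return t_ms + abs(rng.gauss(mu, sigma))
```

Uniform or Laplace noise would follow the same pattern with `rng.uniform` or a Laplace sampler in place of `rng.gauss`.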
[0037] The time associated with one or more activities is thus modified using the
noise addition technique or the quantization technique. The modified time related parameter may be further created using the time associated with the activities, wherein the time associated with the activities may be modified by either of the modification techniques.
[0038] Further, in one implementation, the modification technique may be
implemented as a server based method or a non-server based method or a combination

thereof. In an exemplary embodiment of the system 102, referring to figure 4, the non-server based method may be implemented as a browser plug-in or as an OS implemented service. By way of a specific example, consider that the user interface 204 is a keyboard and the data entered through the user interface 204 and received by the receiving module 212 is keystroke data.
[0039] Referring to figure 4(a), at step 402(a), the receiving module 212 receives
keystroke as the data entered by the user through the keyboard. At step 404(a), the data in the form of keystrokes is recorded by the operating system. The extraction module 214 extracts the time associated with one or more activities performed by the user while receiving the data through the user interface 204. In the next step, 406(a), the data is forwarded to the browser, wherein, at step 408(a), through the browser helper object plug-in, the modification module 216 modifies the time related parameter by using either of the modification techniques.
[0040] The browser plug-in method works as a plug-in for browser based applications.
The browser plug-in method is an application specific method and may be implemented only for that application. The browser plug-in method provides maximum security and may be installed on a browser without rights to access the server. As the browser plug-in method is application specific, the identification risk is removed through the anonymity provided by the Browser Helper Object (BHO), but applications other than the browser do not have any form of privacy check and receive the actual keystroke timings.
[0041] In one implementation of the system 102, referring to figure 4(b), at step
402(b), the receiving module 212 receives keystroke as the data entered by the user through the keyboard. At step 404(b), the data in the form of keystrokes is generated by the operating system. The extraction module 214 extracts the time for the activities performed through the user interface 204 while receiving the data. In the next step, 406(b), the Windows based service captures the keystrokes received by the extraction module 214. The modification module 216 then modifies the time related parameter captured by the Windows based service by implementing either of the modification techniques.
[0042] The OS implemented service method is a machine dependent and an
application independent method. The OS implemented service method modifies the time related parameter before the keystroke is recorded or analyzed by any other application, and thereby provides keystroke anonymity across all applications.

[0043] In an exemplary embodiment of the system 102, referring to figure 5, the
server based method can be implemented by a Protocol based JavaScript insertion or a Filter based JavaScript insertion. By way of a specific example, consider that the user interface 204 is a keyboard and the data entered through the user interface 204 and received by the receiving module 212 is keystroke data.
[0044] Referring to figure 5(a), at step 502(a), the receiving module 212 receives
keystroke as the data entered by the user through the keyboard. At step 504(a), the data in the form of keystrokes is captured by the operating system. In the next step, 506(a), the data is given to the browser, wherein, at step 508(a), a protocol is queried through a local server before the actually called server is sent an HTTP request. The protocol based JavaScript insertion through the extraction module 214 extracts the time for the activities performed through the user interface 204. The modification module 216 modifies the time related parameter by implementing either of the modification techniques.
[0045] Referring to figure 5(b), at step 502(b), the receiving module 212 receives
keystroke as the data entered by the user through the keyboard. At step 504(b), the data in the form of keystrokes is captured by the operating system. In the next step, 506(b), the data is forwarded to the browser, wherein, at step 508(b), the filter based JavaScript insertion method through the extraction module 214 extracts the time for the activities performed through the user interface 204. The modification module 216 further modifies the time and the time related parameter by implementing either of the modification techniques. The filter based JavaScript insertion method warps the keystroke details of the user for the specific application without affecting the rest of the implementation. In the next step, 510(b), the modified time related parameter is further sent to the server analyzing the data entered through the keyboard.
[0046] The filter based JavaScript insertion method is capable of trapping HTTP
requests from the machine before the request is transferred to the recording server. The keystroke timings contained in the trapped HTTP request are modified, and the request is then sent onward in order to prevent browser overloading.
[0047] In one implementation, the system 102 further comprises of the feedback
module 226, wherein the feedback module 226 is configured to perform an analysis of the time extracted for one or more activities for one or more users; the results of the analysis are further provided as feedback to the user. In one embodiment, the user may be shown the average of the flight time, hold time or press-to-press times for different keystrokes.
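The kind of per-key summary the feedback module might show can be sketched as below; the input format of (key, hold_time_ms) pairs is an assumption for this example.

```python
from collections import defaultdict

def average_hold_times(samples):
    """samples: iterable of (key, hold_time_ms) pairs.
    Returns the mean hold time per key, the sort of summary
    a feedback view could present to the user."""
    totals = defaultdict(lambda: [0.0, 0])
    for key, t in samples:
        totals[key][0] += t
        totals[key][1] += 1
    return {k: s / n for k, (s, n) in totals.items()}
```

The same pattern applies to flight times or press-to-press times by swapping in the corresponding intervals.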
[0048] In one implementation, the system 102 further comprises of the selection
module 228 configured to allow the user to select the modification technique, the parameters for implementing the selected modification technique, and the method for implementing the modification technique, i.e. the server based method or the non-server based method. The parameters may include, but are not limited to, the number of buckets for quantization or the quantization intervals, and a set of parameters describing the distribution of the noise added while implementing techniques for noise addition. For example, the parameters for uniform noise addition may include a range with minimum and maximum values, and for Gaussian noise, the parameters may include the mean and the standard deviation.
[0049] In one implementation, the system 102 further comprises of a data suggestion
module 218, which suggests the data entered through the user interface 204 by implementing a technique. The technique may include, but is not limited to, an auto-suggestion technique using an exhaustive dictionary. By way of a specific example, in the case of keystrokes, consider that the user has typed "Mahe". The phrase may be completed with possible options of "Mahesh", "Mahendra", "Mahesh Babu", "Mahesh Bhatt", "Mahesh tutorials" and "Mahendra Singh Dhoni" being the popular options which are auto-suggested by accessing the dictionary stored in the system database 222. Therefore, instead of accessing the dictionary provided by the server of a search engine, the data suggestion module 218 uses a dictionary stored in the system database 222 to suggest options with respect to the data entered through the user interface 204. The data suggestion module 218 may be implemented using both non-server and server based methods.
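The local-dictionary lookup described above reduces to a prefix match served entirely from the user's machine; the sketch below is illustrative and assumes the dictionary is available as a list of strings.

```python
def suggest(prefix, local_dictionary):
    """Return entries from a locally stored dictionary that start with
    the typed prefix, so partially typed text never leaves the machine."""
    return [word for word in local_dictionary if word.startswith(prefix)]
```

For instance, with a local dictionary containing "Mahesh", "Mahendra" and "Mumbai", the prefix "Mahe" yields the first two entries without any request to a search engine's server.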
Advantages
[0050] The system 102 provides a fine tradeoff, offering privacy without
major degradation in utility.
[0051] The system 102 provides a defense against keystroke dynamics based
identification through modification of keystroke features.

[0052] The system 102 provides a high degree of protection to new users visiting a
website implementing the system 102, as the pattern generated for identification of the user by the website is a pattern generated after noise addition or after application of a quantization technique.
[0053] The system 102 also provides protection to users regularly visiting the
website implementing the system 102, as application of any quantization technique will group the user into a group of many users having a similar keystroke pattern. Also, noise addition to the pattern of analysis will place the user into a different group of users. The effect of a quantization technique on the various algorithms used for identification of the user is that fewer groups may be formed, with a large number of users in each group. The effect of noise addition is that the pattern generated for every user will show a large variation in the recorded times, thereby placing users into a different, or wrong, group.
[0054] Referring now to Figure 6, a method 600 for implementing a privacy setting to
provide defense against identification of a user is shown, in accordance with an embodiment of the present subject matter. The method 600 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 600 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0055] The order in which the method 600 is described is not intended to be construed
as a limitation, and any number of the described method blocks can be combined in any order to implement the method 600 or alternate methods. Additionally, individual blocks may be deleted from the method 600 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 600 can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the

embodiments described below, the method 600 may be considered to be implemented in the above described system 102.
[0056] At block 602, the data entered through the user interface 204 by the user is
received. In one implementation, the data entered through the user interface 204 by the user may be received by the receiving module 212.
[0057] At block 604, a time related parameter with respect to one or more activities
performed by the user through the user interface is extracted. In one implementation, the time related parameter with respect to one or more activities may be extracted by the extraction module 214.
[0058] At block 606, the time associated with one or more activities performed
through the user interface while receiving the data is modified, by implementing a modification technique to change the time related parameter to a modified time related parameter. In one implementation, the time associated with one or more activities is modified by the modification module 216.
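The three blocks can be read as a small pipeline: receive timestamped input (block 602), extract a time related parameter (block 604; press-to-press intervals shown), and modify it (block 606) before it is exposed for analysis. A minimal sketch under those assumptions, with illustrative names and units:

```python
def press_to_press(press_times_ms):
    # Block 604: extract press-to-press intervals from key-press timestamps.
    return [b - a for a, b in zip(press_times_ms, press_times_ms[1:])]

def modify_intervals(intervals, modifier):
    # Block 606: apply the chosen modification technique to each interval.
    return [modifier(t) for t in intervals]

# Block 602 would supply the raw timestamps captured from the user interface.
raw = [0, 120, 260, 395]
modified = modify_intervals(press_to_press(raw),
                            lambda t: round(t / 50.0) * 50)
```

Only the modified intervals would be visible to any identification analysis downstream.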
[0059] Traditionally, IP address monitoring and third-party HTTP cookies have been
used to serve this purpose. With the increased use of IP address anonymization services and strict cookie management policies, these methods can be countered with protective mechanisms. Moreover, such protective mechanisms do not affect the functionality offered at any website. On the contrary, blocking keystroke dynamics could reduce the utility offered by a site. We show this using the following scenarios -
[0060] Search utility - Based on past search history, search engines are able to smartly
suggest options that complete the query typed by a user. This saves the user a few valuable seconds and also reduces his or her typing effort. This is a very useful feature whose utility will be drastically affected if inappropriate noise or quantization is added.
[0061] Form filling - While filling forms, users might have to mention details
such as date of birth, city, etc. that can act as filters on people. When combined with keystroke data, this may serve to uniquely identify people, thus raising a non-trivial privacy challenge that has to be overcome.

[0062] Although implementations for methods and systems implementing a privacy
setting against identification of a user have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for implementing a privacy setting providing a defense against identifying a user in real-time.

WE CLAIM:
1. A method for implementing a privacy setting providing a defense against identifying a
user in real-time, wherein the defense is against a particular type of identification usually
performed by an analysis of features of data entered by the user through a user interface, the
method comprising:
receiving the data entered through the user interface by the user;
extracting a time related parameter with respect to one or more activities performed by the user via the user interface, wherein the time related parameter is usually the feature used for performing the analysis for identifying the user, and wherein the time related parameter is computed by using time associated with one or more activities; and
modifying the time using a modification technique in order to change the time related parameter to a modified time related parameter, wherein the modified time related parameter is used for performing the analysis for identifying the user, thereby providing a defense against identifying the user;
wherein the receiving, the extracting, and the modifying are performed by a processor using programmed instructions stored in a memory.
2. The method of claim 1, wherein the one or more activities further comprise a keyboard stroke, a mouse click, a touch, a gesture or a combination thereof.
3. The method of claim 2, wherein the one or more activities related to the keyboard stroke and the mouse click further comprise press-to-press activity, release-to-release activity, release-to-press activity and press-to-release activity.
4. The method of claim 1, wherein the modification technique is selected from a group of noise addition techniques and quantization techniques of a group of values, or a combination thereof.
5. The method of claim 4, wherein the noise addition techniques include Laplace, Uniform and Gaussian techniques.
6. The method of claim 4, wherein the quantization techniques include fixed area quantization and fixed width quantization.

7. The method of claim 1, wherein the modification technique may be implemented as a server based method or a non-server based method or a combination thereof, the server based method may be implemented by a protocol based javascript insertion or a filter based javascript insertion, and the non-server based method may be implemented as a browser plug-in or as an OS implemented service.
8. The method of claim 1, wherein the parameters consist of the number of buckets for quantization or the quantization intervals, and a set of parameters used to describe a distribution, to be added while implementing noise addition techniques.
9. A system for implementing a privacy setting providing a defense against identifying a user in real-time, wherein the defense is against a particular type of identification usually performed by an analysis of features of data entered by the user through a user interface, the system comprising:
a processor; and
a memory coupled to the processor, wherein the processor is capable of executing a plurality of modules stored in the memory, and wherein the plurality of modules comprises:
a receiving module configured to receive the data entered through the user interface by the user;
an extraction module configured to extract a time related parameter with respect to one or more activities performed by the user via the user interface, wherein the time related parameter is usually the feature used for performing the analysis for identifying the user, and wherein the time related parameter is computed by using time associated with one or more activities; and
a modification module configured to modify the time using a modification technique in order to change the time related parameter to a modified time related parameter, wherein the modified time related parameter is used for performing the analysis for identifying the user, thereby providing a defense against identifying the user.
10. The system of claim 9, wherein the one or more activities further comprise a keyboard
stroke, a mouse click, a touch, a gesture or a combination thereof.

11. The system of claim 10, wherein the one or more activities related to the keyboard stroke and the mouse click further comprise press-to-press activity, release-to-release activity, release-to-press activity and press-to-release activity.
12. The system of claim 9, wherein the modification technique is selected from a group of noise addition techniques and quantization techniques of a group of values, or a combination thereof.
13. The system of claim 12, wherein the noise addition techniques are Laplace, Uniform and Gaussian techniques.
14. The system of claim 12, wherein the quantization techniques include fixed area quantization and fixed width quantization.
15. The system of claim 9, wherein the system further comprises a data suggestion module, the data suggestion module suggesting modification in the data entered through the user interface by implementing an auto-suggestion technique using a dictionary, wherein the user interface may be a keyboard, a mouse or a touch pad, and the data received through the user interface is in the form of keyboard strokes, mouse clicks, touch inputs or gestures.
16. The system of claim 9, wherein the modification technique may be implemented as a server based method or a non-server based method, the server based method can be implemented by a protocol based javascript insertion or a filter based javascript insertion, and the non-server based method can be implemented as a browser plug-in or as an OS implemented service.
17. The system of claim 9, wherein the parameters comprise the number of buckets for quantization or bucketization, or the quantization intervals, and a set of parameters used to describe a distribution, to be added while implementing noise addition techniques.
18. A computer program product having embodied thereon a computer program for implementing a privacy setting providing a defense against identifying a user in real-time, wherein the defense is against a particular type of identification usually performed by an analysis of features of data entered by the user through a user interface, the computer program product comprising:
a program code for receiving the data entered through the user interface by the user;

a program code for extracting a time related parameter with respect to one or more activities performed by the user via the user interface, wherein the time related parameter is usually the feature used for performing the analysis for identifying the user, and wherein the time related parameter is computed by using time associated with one or more activities; and
a program code for modifying the time using a modification technique in order to change the time related parameter to a modified time related parameter, wherein the modified time related parameter is used for performing the analysis for identifying the user, thereby providing a defense against identifying the user.

Documents

Application Documents

# Name Date
1 2169-MUM-2013-RELEVANT DOCUMENTS [30-09-2023(online)].pdf 2023-09-30
2 ABSTRACT.jpg 2018-08-11
3 2169-MUM-2013-FORM 3.pdf 2018-08-11
4 2169-MUM-2013-US(14)-HearingNotice-(HearingDate-16-04-2021).pdf 2021-10-03
5 2169-MUM-2013-IntimationOfGrant18-05-2021.pdf 2021-05-18
6 2169-MUM-2013-FORM 26(6-9-2013).pdf 2018-08-11
7 2169-MUM-2013-PatentCertificate18-05-2021.pdf 2021-05-18
8 2169-MUM-2013-FORM 2.pdf 2018-08-11
9 2169-MUM-2013-Written submissions and relevant documents [30-04-2021(online)].pdf 2021-04-30
10 2169-MUM-2013-FORM 2(TITLE PAGE).pdf 2018-08-11
11 2169-MUM-2013-Response to office action [12-04-2021(online)].pdf 2021-04-12
12 2169-MUM-2013-FORM 18.pdf 2018-08-11
13 2169-MUM-2013-FORM 1.pdf 2018-08-11
14 2169-MUM-2013-Correspondence to notify the Controller [08-04-2021(online)].pdf 2021-04-08
15 2169-MUM-2013-FORM-26 [08-04-2021(online)].pdf 2021-04-08
16 2169-MUM-2013-FORM 1(16-7-2013).pdf 2018-08-11
17 2169-MUM-2013-ABSTRACT [21-11-2019(online)].pdf 2019-11-21
18 2169-MUM-2013-DRAWING.pdf 2018-08-11
19 2169-MUM-2013-CLAIMS [21-11-2019(online)].pdf 2019-11-21
20 2169-MUM-2013-DESCRIPTION(COMPLETE).pdf 2018-08-11
21 2169-MUM-2013-COMPLETE SPECIFICATION [21-11-2019(online)].pdf 2019-11-21
22 2169-MUM-2013-CORRESPONDENCE.pdf 2018-08-11
23 2169-MUM-2013-CORRESPONDENCE(6-9-2013).pdf 2018-08-11
24 2169-MUM-2013-FER_SER_REPLY [21-11-2019(online)].pdf 2019-11-21
25 2169-MUM-2013-CORRESPONDENCE(16-7-2013).pdf 2018-08-11
26 2169-MUM-2013-OTHERS [21-11-2019(online)].pdf 2019-11-21
27 2169-MUM-2013-CLAIMS.pdf 2018-08-11
28 2169-MUM-2013-FER.pdf 2019-05-22
29 2169-MUM-2013-ABSTRACT.pdf 2018-08-11

Search Strategy

1 2019-05-1712-00-03_17-05-2019.pdf

ERegister / Renewals

3rd: 26 Jun 2021 (From 26/06/2015 - To 26/06/2016)
4th: 26 Jun 2021 (From 26/06/2016 - To 26/06/2017)
5th: 26 Jun 2021 (From 26/06/2017 - To 26/06/2018)
6th: 26 Jun 2021 (From 26/06/2018 - To 26/06/2019)
7th: 26 Jun 2021 (From 26/06/2019 - To 26/06/2020)
8th: 26 Jun 2021 (From 26/06/2020 - To 26/06/2021)
9th: 26 Jun 2021 (From 26/06/2021 - To 26/06/2022)
10th: 15 Jun 2022 (From 26/06/2022 - To 26/06/2023)
11th: 24 Jun 2023 (From 26/06/2023 - To 26/06/2024)
12th: 25 Jun 2024 (From 26/06/2024 - To 26/06/2025)
13th: 19 Jun 2025 (From 26/06/2025 - To 26/06/2026)