Abstract: The present invention relates to a system (200) and a method (300) for determining a relationship type between users. The system (200) may include various units such as a monitoring unit (216) for monitoring one or more interactions between the users (112); a processing unit (218) for extracting one or more keywords and/or strings from the one or more interactions; and a comparing unit (220) for comparing each of the extracted keywords and/or strings with a plurality of clusters. The system (200) may further include a determining unit (222) for determining the relationship type between the users (112) by identifying a cluster from the plurality of clusters based on highest similarity, where each of the plurality of clusters corresponds to a particular relationship type.
TECHNICAL FIELD
[0001] The present disclosure generally relates to network communication. More
specifically, the present disclosure relates to a method and a system for determining
relationship types between users.
BACKGROUND
[0002] Messaging services/applications allow users to communicate without being physically present at the same location. The messaging services allow the users to communicate via a number of communication mechanisms such as telephony, email, text or SMS messaging, and instant messaging. These messaging services ease the communication between the users. However, the nature of the relationship between two users communicating with each other cannot be reliably determined in the existing messaging services.
[0003] The messaging services/applications provide users with various emojis, stickers, animations, and audio-visual media to allow the users to express their emotions, feelings, and sentiments. It is very important to provide relevant emojis, stickers, animations, and audio-visual media for use in the messaging applications according to the nature of the relationships between the users. Also, the nature of a relationship may be dynamic and may vary according to various scenarios on a day-to-day basis.
[0004] Hence, it would be advantageous to identify the nature of the relationships between users so as to enhance the communication experience of the users. However, conventional technologies do not provide techniques that efficiently, accurately, and dynamically determine the relationships between users. In view of the foregoing, there exists a need in the art for a solution which overcomes the above-mentioned problems.
SUMMARY
[0005] Exemplary aspects are directed to a system and method for efficiently and accurately determining a relationship type between users. In an exemplary embodiment, the system and method dynamically enhance the communication experience of the users based on the nature of the relationships between the users. In an exemplary embodiment, based on the type of relationship between the users, the system and method provide personalized media such as, but not limited to, emojis, stickers, animations, and audio-visual media to use in messaging, which allow the users to express their emotions, feelings, and sentiments more efficiently, realistically, and accurately.
[0006] In an exemplary aspect, the present disclosure is directed to a method of determining a relationship type between users. The method comprises monitoring one or more interactions between the users and extracting one or more keywords and/or strings from the one or more interactions. The method further comprises comparing each of the extracted keywords and/or strings with a plurality of clusters and determining the relationship type between the users by identifying a cluster from the plurality of clusters based on highest similarity, where each of the plurality of clusters corresponds to a particular relationship type.
[0007] In another aspect, the present disclosure describes a system for determining a relationship type between users. The system comprises at least a monitoring unit, a processing unit, a comparing unit, and a determining unit. The monitoring unit is configured to monitor one or more interactions between the users. The processing unit is configured to extract one or more keywords and/or strings from the one or more interactions. The comparing unit is configured to compare each of the extracted keywords and/or strings with a plurality of clusters, and the determining unit is configured to determine the relationship type between the users by identifying a cluster from the plurality of clusters based on highest similarity. Each of the plurality of clusters corresponds to a particular relationship type.
[0008] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
OBJECTIVES OF THE INVENTION
[0009] An object of the present invention is to provide a system and method for determining a nature or a type of a relationship between users.

[0010] Another object of the present invention is to enhance the communication experience of the users based on the nature of the relationship between them.
[0011] Yet another object of the present invention is to provide personalized emojis, stickers, animations, and audio-visual media based on the determined relationship type to allow the users to express their emotions, feelings, and sentiments more efficiently and accurately.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed embodiments. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
[0013] Figure 1 illustrates an exemplary environment for determining relationship types
between users, according to an aspect of the present invention.
[0014] Figure 2 is a block diagram illustrating a system for determining relationship types between the users, according to an aspect of the present invention.
[0015] Figure 3 depicts a flowchart of a method for determining a relationship type between users, according to an aspect of the present invention.
[0016] Figure 4 illustrates a block diagram of an exemplary computer system for
implementing embodiments consistent with the present disclosure.
[0017] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in a computer-readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0018] In the present document, the word “exemplary” is used herein to mean “serving
as an example, instance, or illustration.” Any embodiment or implementation of the
present subject-matter described herein as “exemplary” is not necessarily to be construed
as preferred or advantageous over other embodiments.
[0019] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0020] The terms “comprise(s)”, “comprising”, “include(s)”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, system, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup, system, or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other or additional elements in the system or apparatus.
[0021] In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown, by way of illustration, specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0022] The present invention will be described herein below with reference to the accompanying drawings. In the following description, well-known functions or constructions are not described in detail since they would obscure the description with unnecessary detail.
[0023] The present invention relates to a system and method for determining relationship types between two users. The system may include a web-based or cloud-based platform that accesses user devices over a network. The system and method in the present disclosure enhance the communication experience of the users based on the nature of the relationships between the users. In another embodiment, the system and method in the present disclosure provide personalized media such as, but not limited to, emojis, stickers, animations, and audio-visual media to use in messaging, which allow the users to express their emotions, feelings, and sentiments more efficiently, realistically, and accurately.
[0024] The present disclosure addresses the above-identified problems and provides an
improved technique of enhancing user experience. Particularly, the present disclosure
provides an improved technique for accurately and efficiently determining a relationship
type between users based on their chat history.
[0025] Referring to figure 1, an exemplary communication environment 100 is disclosed for determining relationship types between users based on communications exchanged between them. The communication environment 100 comprises a plurality of user devices 102-1, 102-2, … 102-n associated with a plurality of users 112-1, 112-2, … 112-n. The plurality of user devices may be individually or collectively represented by reference numeral 102 and the plurality of users may be individually or collectively represented by reference numeral 112. The communication environment 100 may further comprise a server 104 communicably coupled with the user devices 102 via a communication network 106. The server 104 may comprise one or more processors 108 and one or more memories 110 communicably coupled to each other to implement one or more functionalities of the server 104. The user devices 102 may represent desktop computers, laptop computers, mobile devices (e.g., smartphones or personal digital assistants), tablet devices, or other types of computing devices which have computing and networking capabilities. A user device 102 may be equipped with one or more computer storage devices (e.g., RAM, ROM, PROM, SRAM, etc.), a communication unit, and one or more processing devices (e.g., central processing units) that are capable of executing computer program instructions.
[0026] According to an exemplary embodiment, the communication between the users may be in the form of an exchange of one or more of, but not limited to, text, audio, video, emojis, stickers, animations, images, and audio-visual media. A single user may be associated with one or more user devices. Alternatively, a single device 102 may be associated with a plurality of users. The communication network 106 may include one or more of, but not limited to, the Internet, a local area network, a wide area network, an intranet, a peer-to-peer network, and/or other similar technologies for connecting the various entities as discussed below.
[0027] According to an exemplary embodiment of the present disclosure, the memory 110 may store various types of information. The information stored in the memory 110 may represent historical communications such as chat logs of at least one user and may include the following information parameters, but is not limited to: one or more types of relationships, one or more keywords and strings, one or more sets of interactions, and similarity scores assigned to the one or more sets of interactions. Each set of interactions may comprise a number of messages exchanged between the users. The one or more types of relationships may be any of, but not limited to, romantic, friendship, family relationship, professional or formal relationship, private, or any other social circles (such as neighbours), etc.
[0028] Turning now to figure 2, which shows a block diagram illustrating a system 200 for determining a relationship type between the users. The system may be a part of the server 104 illustrated in figure 1. According to an embodiment of the present disclosure, the system 200 may comprise an input/output (I/O) interface 202, a memory 204, and various units 206. The I/O interface 202 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, an input device, an output device, and the like. The I/O interface 202 may allow the system 200 to interact with the user devices 102 directly or through other devices. The memory 204 is communicatively coupled to the various units 206. Further, the memory 204 comprises various data such as interaction data 208, one or more datasets 210 comprising one or more interactions, a plurality of clusters 214, and a plurality of keywords and/or strings 212. The interaction data 208 may include, but is not limited to, chat logs of the users and emoji, voice, text, image, or video communications with other users. The datasets 210 may comprise a plurality of sets of data, each set comprising a predefined number of interactions. Keywords and strings 212 may comprise a plurality of keywords and strings extracted from the interactions. Each of the plurality of clusters 214 may be associated with a relationship type and may comprise information in association with the relationship type. The information may comprise, but is not limited to, one or more interactions; keywords and strings extracted from the one or more interactions; a similarity matrix corresponding to the one or more interactions; interactions, keywords, and/or strings which are contextually similar to the extracted keywords and strings; synonyms of the extracted keywords and strings; etc.
[0029] Further, the units 206 may comprise a monitoring unit 216, a processing unit 218, a comparing unit 220, a determining unit 222, and other units 224. Further, the units 216-224 may be dedicated hardware units capable of performing various operations of the system 200. However, according to other embodiments, the units 216-224 may be implemented by a processor (such as the processor 108), an application-specific integrated circuit (ASIC), or any circuitry capable of executing instructions stored in the memory 204 of the system 200. For the sake of illustration, it is shown here that the user devices 102 are external to the system 200. However, in an alternative embodiment, the user devices 102 may be a part of the system 200.
[0030] The system 200 may receive, via the I/O interface 202, data from the one or more user devices 102 and may store the received data in the memory 204. In one embodiment, the system 200 may request data from the user devices 102 and store the requested data in the memory 204. The data may be stored based on association information indicating an association of the stored data with a corresponding user or user device. The association information may comprise one or more of, but not limited to, a phone number, device ID, email ID, username, and authentication credentials. The system 200 may use machine learning and neural network processing to generate/form a plurality of clusters using the data stored in the memory. The system 200 may then use the formed clusters to determine a relationship type between two users.
[0031] Initially, the system 200 receives interaction data 208 corresponding to a plurality of users 112 and stores the interaction data in the memory 204. The interaction data may comprise text messages, emojis, audios, videos, audio-visual contents, etc. exchanged between the users 112 along with the time of interaction, i.e., the interaction data is time-stamped. The processing unit 218 may generate a plurality of interactions by processing the interaction data 208 stored in the memory 204. In the context of the present disclosure, an interaction may be considered as a collection of continuous messages sent from one user to another user. Consider an example where two users A and B are communicating with each other and during a communication session a total of 14 messages are exchanged between them. Suppose initially user A sends 3 messages (a, b, c) to user B, then user A receives 4 messages (d, e, f, g) from user B. Thereafter, user A sends 2 messages (h, i) and receives 5 messages (j, k, l, m, n) from user B. The processing unit 218 may combine the messages a, b, c to form one interaction. Similarly, the processing unit may combine messages d, e, f, g to form another interaction. In this way, the processing unit 218 may generate a plurality of interactions from the interaction data 208 of the plurality of users 112.
[0032] The processing unit 218 may split the generated interactions into one or more datasets 210. The splitting may be performed in such a way that each dataset comprises a predefined number (N) of interactions. The value of N may be predefined by a user. For instance, in the above-described example there are a total of four interactions (I1, I2, I3, and I4). Consider that the value of N is 2. The processing unit 218 may divide these four interactions into 2 datasets, where a first dataset may comprise the first two interactions (i.e., I1 and I2) and the second dataset may comprise the remaining two interactions (i.e., I3 and I4).
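By way of illustration only, the grouping of continuous messages into interactions ([0031]) and the splitting of interactions into datasets of N interactions ([0032]) may be sketched as follows; the Python representation, function names, and message encoding below are assumptions made for illustration, not part of the disclosed system:

```python
from itertools import groupby

def form_interactions(messages):
    """Group a time-ordered list of (sender, text) pairs into interactions:
    an interaction is a run of continuous messages from the same sender."""
    return [[text for _, text in run]
            for _, run in groupby(messages, key=lambda m: m[0])]

def split_into_datasets(interactions, n):
    """Split the interactions into datasets of N interactions each ([0032])."""
    return [interactions[i:i + n] for i in range(0, len(interactions), n)]

# The 14-message example from [0031]: A sends a-c, B replies with d-g,
# A sends h-i, B replies with j-n.
messages = ([("A", m) for m in "abc"] + [("B", m) for m in "defg"]
            + [("A", m) for m in "hi"] + [("B", m) for m in "jklmn"])
interactions = form_interactions(messages)
# -> [['a','b','c'], ['d','e','f','g'], ['h','i'], ['j','k','l','m','n']]
datasets = split_into_datasets(interactions, n=2)  # -> [[I1, I2], [I3, I4]]
```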
[0033] In a non-limiting embodiment of the present invention, before generating the plurality of interactions, the processing unit 218 may perform pre-processing on the interaction data 208. The interaction data is in natural form, i.e., in the format of text, sentences, media, paragraphs, etc. Before such input can be passed to the system 200, the data needs some clean-up or pre-processing so that the system 200 may focus on important words/features instead of words/features which add minimal or no value. The pre-processing may include one or more of removing numbers, removing special characters, expanding contractions, removing punctuation, stemming, lemmatization, removing stop words, removing extra white spaces, etc., but is not limited thereto. The pre-processing improves the accuracy of the relationship determination technique. The pre-processed data may be stored in the memory 204.
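As a non-limiting illustration of the clean-up steps named above, a minimal pre-processing sketch is given below; the stop-word list and contraction map are illustrative assumptions, and stemming/lemmatization (which would require an NLP library such as NLTK or spaCy) are omitted:

```python
import re

# Small illustrative lists; a production system would likely use fuller ones.
STOP_WORDS = {"a", "an", "the", "is", "are", "to", "of", "and", "or", "in"}
CONTRACTIONS = {"don't": "do not", "i'm": "i am", "you're": "you are"}

def preprocess(text):
    """Clean-up steps from [0033]: expand contractions, remove numbers,
    special characters and punctuation, drop stop words, collapse whitespace."""
    text = text.lower()
    for contraction, expanded in CONTRACTIONS.items():
        text = text.replace(contraction, expanded)
    text = re.sub(r"\d+", " ", text)       # remove numbers
    text = re.sub(r"[^a-z\s]", " ", text)  # remove special characters/punctuation
    tokens = [t for t in text.split() if t not in STOP_WORDS]
    return " ".join(tokens)                # split/join collapses extra white space

print(preprocess("I'm counting the 1000 ways I love you!"))
# -> "i am counting ways i love you"
```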
[0034] In a non-limiting embodiment of the present invention, the processing unit 218 may perform clustering on the generated datasets 210 to generate the plurality of clusters 214. Clustering is the process of grouping similar objects into groups called clusters. The processing unit 218 may use neural networks and machine learning to form the clusters.
[0035] In an embodiment of the present invention, the processing unit 218 may extract key features from the generated datasets 210. The feature extraction may comprise extracting relevant information from the datasets such as, but not limited to, keywords and strings. The keywords and strings may be stored in the memory 204. The processing unit 218 may compare the generated datasets with each other using the extracted features and may generate a multi-dimensional value (or a similarity matrix) for each dataset. The similarity matrix indicates the degree of similarity among various interactions, keywords, and/or strings. The processing unit 218 may then group similar objects (interactions, keywords, strings) into clusters based on the degree of similarity. In a non-limiting embodiment, the similarity matrix may be generated for each interaction of the dataset.
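The disclosure leaves the exact feature-extraction and clustering algorithms open. As one possible realization, a TF-IDF representation with cosine similarity and k-means clustering could produce the similarity matrix and the grouping described above; the sketch below, using scikit-learn, rests on those assumptions:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.cluster import KMeans

# Pre-processed interactions (one string each) of three relationship flavours.
interactions = [
    "love you miss you kiss lovely",
    "miss your kisses soulmate valentine",
    "do not dare excuse me hate",
    "how dare you hate this",
    "hey what is up how is it going",
    "nice to meet you how do you do",
]

# Feature extraction: each interaction becomes a multi-dimensional TF-IDF vector.
vectorizer = TfidfVectorizer()
features = vectorizer.fit_transform(interactions)

# Pairwise similarity matrix: degree of similarity among the interactions.
similarity_matrix = cosine_similarity(features)

# Group similar interactions into clusters; k=3 matches the three flavours.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(features)
print(labels)  # interactions with the same label fall in the same cluster
```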
[0036] Thereafter, each of the plurality of clusters 214 may be annotated in human-understandable language. For example, the plurality of clusters may be annotated with a type of relationship such as romantic, friendship, formal, etc., but not limited thereto. Each cluster comprises information such as, but not limited to, a nature or a type of relationship, one or more interactions corresponding to the relationship type, one or more keywords and/or strings, similarity matrices, etc. The plurality of clusters may be stored in the memory 204.
[0037] In one non-limiting embodiment of the present invention, the processing unit 218 may continuously monitor the interactions among the plurality of users 112 and may continuously update or add clusters in real-time based on the monitored interactions.
[0038] Consider a first example where a user is having a romantic conversation with another user. The processing unit 218 may store the conversation or interaction data in the memory 204 and may process the interaction data to form a plurality of interactions and a plurality of datasets. The extracted keywords and/or strings may comprise one or more of, but not limited to, “love you”, “miss you”, “kiss”, “lovely”, etc.
[0039] Consider a second example where a user is having a heated/aggressive conversation with another user. The processing unit 218 may store the conversation/interaction data in the memory 204 and may process the interaction data to form a plurality of interactions and a plurality of datasets. The extracted keywords and/or strings may comprise one or more of, but not limited to, “do not”, “dare”, “excuse me”, “blown”, “hate”, etc.
[0040] Consider a third example where a user is having a casual conversation with another user. The processing unit 218 may store the conversation/interaction data in the memory 204 and may process the interaction data to form a plurality of interactions and a plurality of datasets. The extracted keywords and/or strings may comprise one or more of, but not limited to, “how do you do”, “nice to meet you”, “hey, what’s up”, and “how is it going”, etc.
[0041] In the exemplary embodiments, the processing unit 218 may process the data of all three examples using neural networks and machine learning to form three clusters, i.e., for a romantic relationship, an enemy/angry relationship, and a casual relationship, respectively. One or more information parameters may be stored in the memory 204 in association with each cluster. The information parameters may include, but are not limited to, a type of relationship, one or more sets of interactions, keywords and/or strings, and similarity matrices assigned to the interactions, keywords, and/or strings.
[0042] In this manner, different information parameters may be stored in the memory 204 in association with the type of relationship depending on the different types of scenarios as described above. It may be worth noting here that the relationship type between the users may change with time. Thus, to capture the type of relationship more accurately, the system may be trained on limited chat history, i.e., instead of using the entire chat history, the latest M interactions of each user may be selected for cluster formation, where M is a predefined number.
[0043] Once the clusters are formed, any new set of interactions between any two users may be passed to the system 200 to determine the type of relationship between the two users. The technique to determine a relationship type is described below using the system 200.
[0044] In an embodiment, the monitoring unit 216 monitors one or more interactions between a plurality of users. In a preferred embodiment, the monitoring unit 216 monitors one or more interactions between any two desired users. The interactions may comprise one or more of, but not limited to, text, audio, video, emojis, stickers, animations, images, and audio-visual media.
[0045] The processing unit 218 may extract key features from the one or more interactions. The feature extraction may comprise extracting relevant information from the interactions such as, but not limited to, keywords and/or strings. In one embodiment, the one or more interactions may be received from a user. In another embodiment, the monitoring unit 216 and/or the processing unit 218 may select the one or more interactions based on one or more parameters. The one or more parameters may be specified by the user and may comprise, but are not limited to, a number of interactions, a type of interaction, a time frame, and a size of interaction. For example, the user may specify one or more of: a number of interactions to select for determining the relationship type; a type of interactions, such as text, emojis, audio content, or video content, to select for determining the relationship type; a time frame, such that the messages exchanged during that time frame/duration are selected for determining the relationship type; or a total size of interactions or a maximum/minimum size of each interaction.
[0046] The comparing unit 220 may compare each of the extracted features (i.e., keywords and/or strings) with a plurality of clusters, where the plurality of clusters is formed based on previous interactions and each of the plurality of clusters corresponds to a particular relationship type. In another embodiment, the plurality of clusters may be formed using pre-fed keywords and/or strings and may be continuously updated based on interactions among the users.
[0047] The determining unit 222 may determine the relationship type between the users by identifying a cluster from the plurality of clusters based on the highest similarity between the plurality of clusters and the extracted keywords and/or strings. In one embodiment, the determining unit 222 may determine a similarity matrix based on the comparison of the extracted keywords and/or strings with the plurality of clusters 214. As described earlier, each cluster corresponds to a particular relationship type. The determining unit 222 may then determine the relationship type between the users by identifying the cluster corresponding to the highest value in the similarity matrix.
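As a concrete, non-limiting illustration of this comparison-and-selection step, the sketch below measures the similarity between the extracted keywords and each annotated cluster and returns the relationship type of the most similar cluster; cosine similarity over TF-IDF vectors and the cluster contents are assumptions, since the disclosure does not fix the similarity measure:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Annotated clusters: relationship type -> keywords/strings associated with it.
clusters = {
    "romantic": "love you miss you kiss lovely soulmate valentine",
    "enemy":    "do not dare excuse me blown hate",
    "casual":   "how do you do nice to meet you hey what's up how is it going",
}

def determine_relationship(extracted_keywords, clusters):
    """Compare the extracted keywords with each cluster and return the
    relationship type of the most similar cluster ([0047])."""
    labels = list(clusters)
    vectorizer = TfidfVectorizer().fit(list(clusters.values()) + [extracted_keywords])
    query = vectorizer.transform([extracted_keywords])
    corpus = vectorizer.transform([clusters[label] for label in labels])
    scores = cosine_similarity(query, corpus)[0]  # one similarity value per cluster
    return labels[scores.argmax()]                # highest similarity wins

print(determine_relationship("love miss kiss soulmate valentine", clusters))
# -> "romantic"
```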
[0048] In this way, the system 200 determines a relationship type between the users. The system 200 may use the identified relationship type to enhance the communication experience of the users.
[0049] In another non-limiting aspect of the present invention, before extracting the
features, the processing unit 218 may perform pre-processing on the monitored
interactions to improve accuracy of the relationship determining technique.
[0050] In another non-limiting aspect of the present invention, the processing unit 218 may form one or more groups from the extracted keywords and/or strings based on predefined factors. The predefined factors may include synonyms, context, tone, and expressions, but are not limited thereto. For example, the processing unit 218 may include similar keywords in one group and may additionally include synonyms of the keywords in the group. The processing unit 218 may process the monitored interactions to identify the context of the conversation between the users, the expressions of the users during the conversation, and the tones of the users during the conversation. The processing unit 218 may process the texts, emojis, stickers, animations, and audio-visual media for identifying the context, expressions, and tones. The processing unit 218 may include additional keywords and/or strings and their synonyms in the groups based on the identified context, expressions, and tones. This embodiment may be useful when it is difficult to determine the relationship type merely based on the keywords and/or strings extracted from the interactions or when it is desired to increase the accuracy of the relationship type determining technique. In such a scenario, the processing unit 218 may determine the relationship type by analysing the context of the conversation and the expressions and tones of the users during the conversation.
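A minimal sketch of such synonym-aware grouping is given below; the synonym map is a hypothetical stand-in (a real system might draw on WordNet or word embeddings), and the grouping rule is one assumed interpretation of the paragraph above:

```python
# Hypothetical synonym map, used purely for illustration.
SYNONYMS = {
    "love": {"adore", "cherish"},
    "hate": {"despise", "loathe"},
    "miss": {"long for"},
}

def form_groups(extracted_keywords):
    """Group similar keywords together and fold in their synonyms ([0050])."""
    groups = []
    for keyword in extracted_keywords:
        # Map the keyword to its base form if it is a known synonym.
        base = next((k for k, syns in SYNONYMS.items()
                     if keyword == k or keyword in syns), keyword)
        for group in groups:
            if base in group:       # keyword (or its synonym) already grouped
                group.add(keyword)
                break
        else:
            groups.append({base, keyword} | SYNONYMS.get(base, set()))
    return groups

print(form_groups(["love", "adore", "hate", "kiss"]))
# -> [{'love', 'adore', 'cherish'}, {'hate', 'despise', 'loathe'}, {'kiss'}]
```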
[0051] In another non-limiting aspect of the present invention, the processing unit 218 may rank the formed groups based on predetermined conditions. The predetermined conditions may comprise a frequency of occurrence of keywords and/or strings, a weight assigned to the keywords and/or strings, a total number of keywords/strings in a group, a size of the group, a number of media items present in the group, etc., but are not limited thereto. For example, the processing unit 218 may give higher priority to a group having multiple occurrences of keywords/strings over a group having single occurrences of keywords and/or strings, or may give higher priority to a group having a higher number of keywords and/or strings over a group having a smaller number of keywords/strings. In one embodiment, certain keywords and strings may be assigned weights or scores and the groups having such keywords and strings may be assigned higher priority.
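The ranking step may be illustrated as follows; the scoring formula (occurrence frequency plus assigned weights plus group size) and the keyword weights are assumptions chosen to reflect the conditions listed above:

```python
# Illustrative weights for certain keywords ([0051]); the values are assumptions.
KEYWORD_WEIGHTS = {"love": 3.0, "hate": 3.0, "soulmate": 5.0}

def rank_groups(groups, occurrences):
    """Rank keyword groups by occurrence frequency, assigned weights, and
    group size, with the highest-priority group first."""
    def score(group):
        frequency = sum(occurrences.get(k, 0) for k in group)
        weight = sum(KEYWORD_WEIGHTS.get(k, 1.0) for k in group)
        return frequency + weight + len(group)
    return sorted(groups, key=score, reverse=True)

groups = [{"love", "adore", "cherish"}, {"hate"}, {"kiss"}]
occurrences = {"love": 4, "kiss": 1, "hate": 1}  # counts in the monitored chat
ranked = rank_groups(groups, occurrences)
print(ranked[0])  # the first-ranked group, here {'love', 'adore', 'cherish'}
```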
[0052] The processing unit 218 may then compare the first-ranked group with the plurality of clusters 214 to form a similarity matrix. The processing unit 218 may determine the relationship type between the users by identifying the cluster corresponding to the highest value in the similarity matrix. In another embodiment, the processing unit may use a predefined number of groups from the ranked list of groups for determining the relationship type between the users.
[0053] It may be noted here that the relationship type determined between the users may change with time. Thus, in one non-limiting aspect of the present invention, the processing unit 218 may continuously monitor interactions between the users and dynamically redefine/update the relationship type between the users based on any deviation in the extracted keywords and/or strings.
[0054] The embodiments of the present invention are now described with the help of an exemplary scenario, where the interaction data between two users may comprise one or more of the following messages, but is not limited to: “I love you”, “I love holding you/being in your arms”, “You are my soulmate”, “You make every day feel like Valentine’s Day”, “I hate you”, “I miss your kisses”. According to an exemplary embodiment, the processing unit 218 may extract keywords and/or strings from the interaction data such as “love”, “miss”, “kiss”, “holding in arms”, “soulmate”, “hate”, and “Valentine’s day”, but not limited thereto. Consider that the processing unit forms three clusters: C1 for a romantic relationship, C2 for friendship, and C3 for an enemy/rivalry relationship. The comparing unit 220 may compare the extracted keywords and/or strings with the plurality of clusters (C1, C2, C3) and may determine a similarity matrix based on the comparison.
[0055] Now the similarity matrix may indicate that most of the extracted keywords and/or strings match cluster C1 and only a few keywords and/or strings match clusters C2 and C3, i.e., the highest similarity is with cluster C1 (which corresponds to a romantic relationship). Thus, the determining unit 222 determines the relationship type to be “romantic.”
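Fed through the determine_relationship sketch given earlier, this scenario would play out as follows (the cluster contents remain illustrative assumptions):

```python
clusters = {
    "C1_romantic":   "love miss kiss holding in arms soulmate valentine",
    "C2_friendship": "buddy hang out movie game fun",
    "C3_enemy":      "hate dare excuse me blown",
}
extracted = "love miss kiss holding in arms soulmate hate valentine"
# Most extracted keywords match C1; only "hate" matches C3.
print(determine_relationship(extracted, clusters))  # -> "C1_romantic"
```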
[0056] Thus, the system may determine the accurate and realistic nature of the relationship between the users based on the interactions between the users. Further, the system may take into account the identified nature of the relationship between the users and may personalize and recommend one or more media items such as, but not limited to, text, emojis, stickers, emoticons, animations, and audio-visual media. The user may use the personalized and recommended media in the conversation to realistically and accurately express emotions, feelings, and sentiments during the ongoing communications. In this way, the user experience is enhanced.
[0057] Figure 3 is a flow chart representing an exemplary method 300 for determining a relationship type between users based on interaction data, according to an embodiment of the present disclosure. The method may be performed by the system 200 in conjunction with the various units described in figure 2. The method 300 is merely provided for exemplary purposes, and embodiments are intended to include or otherwise cover any methods or procedures for determining a relationship type between users.
[0058] As illustrated in figure 3, the method 300 includes one or more blocks illustrating a method to dynamically determine a relationship type between users.
[0059] The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the method without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0060] At block 302, a monitoring unit 216 may monitor one or more interactions between the users. In a preferred embodiment, the monitoring unit 216 monitors interactions between two users. The interactions may comprise one or more of, but not limited to, text, audio, video, emojis, stickers, animations, images, and audio-visual media.
[0061] At block 304, a processing unit 218 may extract key features from the one or more interactions. The feature extraction may comprise extracting relevant information from the interactions such as, but not limited to, keywords and/or strings. In one embodiment, the one or more interactions for feature extraction may be selected based on one or more parameters. The one or more parameters may include, but are not limited to, a number of interactions, a type of interaction, a time frame, and a size of interaction.
[0062] At block 306, a comparing unit 220 may compare each of the extracted features (i.e., keywords and/or strings) with a plurality of clusters 214, where the plurality of clusters is formed based on previous interactions and each of the plurality of clusters corresponds to a particular relationship type.
[0063] At block 308, a determining unit 222 may determine the relationship type between the users by identifying a cluster from the plurality of clusters based on the highest similarity between the plurality of clusters and the extracted keywords and/or strings. In one embodiment, the determining unit 222 may determine a similarity matrix based on the comparison of the extracted keywords and/or strings with the plurality of clusters. The determining unit 222 may then determine the relationship type between the users by identifying a cluster corresponding to the highest value in the similarity matrix.
[0064] In an embodiment, the method 300 may comprise forming one or more groups from the extracted keywords and/or strings based on predefined factors such as, but not limited to, synonyms, context, tone, and expressions, and ranking the formed groups based on predetermined conditions. These groups may then be used for determining the relationship type between the users.
[0065] In another embodiment, the method 300 may comprise continuously monitoring
interactions between the users and dynamically redefining/updating the relationship type
based on any deviation in the extracted keywords and/or strings.
[0066] In some embodiments, the plurality of clusters 214 may be formed using pre-fed keywords and/or strings and may be continuously updated based on interactions among the users. In some embodiments, the processing unit 218 may generate a plurality of interactions by processing interaction data of a plurality of users and may split the plurality of interactions into one or more datasets, where each dataset may comprise a predefined number of interactions. The processing unit 218 may then generate the plurality of clusters by processing the one or more datasets such that similar keywords and/or strings are grouped together. Each of the plurality of clusters may be annotated in human-understandable language. For example, the plurality of clusters may be annotated with a type of relationship such as romantic, friendship, formal, etc., but not limited thereto. Each cluster may comprise information such as, but not limited to, a nature or type of relationship, one or more interactions corresponding to the relationship type, one or more keywords and/or strings, similarity matrices, etc.
[0067] Accordingly, from the above disclosure, it may be worth noting that the present disclosure provides an accurate, convenient, and efficient technique for determining a relationship type between users based on interaction data.
Computer System
[0068] Figure 4 illustrates a block diagram of an exemplary computer system 400 for implementing embodiments consistent with the present invention. In an embodiment, the computer system 400 can be the system 200 which is used for determining a relationship type between users. According to an embodiment, the computer system 400 may receive interaction data 410 which may include, for example, message/chat logs, emojis, and audio and video data. The computer system 400 may comprise a central processing unit (“CPU” or “processor”) 402. The processor 402 may comprise at least one data processor for executing program components for executing user- or system-generated business processes. The processor 402 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
[0069] The processor 402 may be disposed in communication with one or more input/output (I/O) devices (411 and 412) via I/O interface 401. The I/O interface 401 may employ communication protocols/methods such as, without limitation, audio, analog, digital, stereo, IEEE-1394, serial bus, Universal Serial Bus (USB), infrared, PS/2, BNC, coaxial, component, composite, Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System for Mobile Communications (GSM), Long-Term Evolution (LTE)), or the like. Using the I/O interface 401, the computer system 400 may communicate with the one or more I/O devices (411 and 412).
[0070] In some embodiments, the processor 402 may be disposed in communication with a communication network 409 via a network interface 403. The network interface 403 may communicate with the communication network 409. The network interface 403 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The communication network 409 can be implemented as one of the different types of networks, such as an intranet or a Local Area Network (LAN) within the organization. The communication network 409 may either be a dedicated network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other. Further, the communication network 409 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc.
[0071] In some embodiments, the processor 402 may be disposed in communication with a memory 405 (e.g., RAM 413, ROM 414, etc. as shown in FIG. 4) via a storage interface 404. The storage interface 404 may connect to memory 405 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), fiber channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
[0072] The memory 405 may store a collection of program or database components, including, without limitation, user/application data 406, an operating system 407, a web browser 408, etc. In some embodiments, the computer system 400 may store user/application data 406, such as the data, variables, records, etc. as described in this invention. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
[0073] The operating system 407 may facilitate resource management and operation of the computer system 400. The operating system may be any operating system. The I/O interface 401 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, the I/O interface may provide computer interaction interface elements on a display system operatively connected to the computer system 400, such as cursors, icons, check boxes, menus, windows, widgets, etc.
[0074] In some embodiments, the computer system 400 may implement a web browser 408 stored program component. In some embodiments, the computer system 400 may implement a mail server stored program component. The mail server 416 may be an Internet mail server. The mail server 416 may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), Microsoft Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. In some embodiments, the computer system 400 may implement a mail client 415 stored program component. The mail client 415 may be any mail viewing application.
[0075] In some embodiments, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present invention. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., to be non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, non-volatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Discs (DVDs), flash drives, disks, and any other known physical storage media.
[0076] One or more processors may include, by way of example, a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application-Specific Integrated Circuits (ASICs), Field-Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), and/or a state machine.
[0077] In some embodiments, a computer program product may be utilized in implementing embodiments consistent with the present disclosure. The computer program product may comprise instructions which, when the program is executed by a computer system, cause the computer system to implement the embodiments consistent with the present disclosure.
[0078] The terms “an embodiment”, “embodiment”, “embodiments”, “the embodiment”, “the embodiments”, “one or more embodiments”, “some embodiments”, and “one embodiment” mean “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise.
[0079] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
[0080] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
WE CLAIM:
1. A method (300) of determining a relationship type between users (112), the method (300) comprising:
monitoring (302) one or more interactions between the users (112);
extracting (304) one or more keywords and/or strings from the one or more interactions;
comparing (306) each of the extracted keywords and/or strings with a plurality of clusters; and
determining (308) the relationship type between the users (112) by identifying a cluster from the plurality of clusters based on highest similarity, wherein each of the plurality of clusters corresponds to a particular relationship type.
2. The method as claimed in claim 1, wherein determining the relationship type comprises:
determining a similarity matrix based on the comparison of the extracted keywords and/or strings with the plurality of clusters; and
determining the relationship type between the users by identifying the cluster corresponding to the highest value in the similarity matrix.
3. The method as claimed in claim 1, further comprising:
forming one or more groups from the extracted keywords and/or strings based on predefined factors including synonyms, context, tone, and expressions; and
ranking the formed groups based on predetermined conditions.
4. The method as claimed in claim 1, wherein for extracting the keywords and/or strings, the one or more interactions are selected based on one or more parameters including: a number of interactions, a type of interaction, a time frame, and a size of interactions.
5. The method as claimed in claim 1, further comprising continuously monitoring interactions between the users and dynamically redefining the relationship type based on any deviation in the extracted keywords and/or strings.
6. The method as claimed in claim 1, wherein the plurality of clusters is formed based on
keywords and/or strings which are pre-fed, and wherein the plurality of clusters is continuously
updated based on the one or more interactions.
7. The method as claimed in claim 1, wherein the plurality of clusters is generated by:
generating a plurality of interactions by processing interaction data of a plurality of users;
splitting the plurality of interactions into one or more datasets, wherein each dataset comprises a predefined number of interactions; and
generating the plurality of clusters by processing the one or more datasets such that similar keywords and/or strings are clustered together.
8. The method as claimed in claim 7, further comprising assigning a relationship type to each cluster of the generated clusters.
9. A system (200) for determining a relationship type between users (112), the system (200) comprising:
a monitoring unit (216) configured to monitor one or more interactions between the users (112);
a processing unit (218) configured to extract one or more keywords and/or strings from the one or more interactions;
a comparing unit (220) configured to compare each of the extracted keywords and/or strings with a plurality of clusters; and
a determining unit (222) configured to determine the relationship type between the users (112) by identifying a cluster from the plurality of clusters based on highest similarity, wherein each of the plurality of clusters corresponds to a particular relationship type.
10. The system as claimed in claim 9, wherein to determine the relationship type, the determining unit is configured to:
determine a similarity matrix based on the comparison of the extracted keywords and/or strings with the plurality of clusters; and
determine the relationship type between the users by identifying the cluster corresponding to the highest value in the similarity matrix.
11. The system as claimed in claim 9, wherein the processing unit is further configured to:
form one or more groups from the extracted keywords and/or strings based on predefined factors including synonyms, context, tone, and expressions; and
rank the formed groups based on predetermined conditions.
12. The system as claimed in claim 9, wherein for extracting the keywords and/or strings, the processing unit is configured to select the one or more interactions based on one or more parameters including: a number of interactions, a type of interaction, a time frame, and a size of interactions.
13. The system as claimed in claim 9, wherein the processing unit is further configured to continuously monitor interactions between the users and dynamically redefine the relationship type based on any deviation in the extracted keywords and/or strings.
14. The system as claimed in claim 9, wherein the plurality of clusters is formed based on
keywords and/or strings which are pre-fed, and wherein the plurality of clusters is continuously
updated based on the one or more interactions.
15. The system as claimed in claim 9, wherein the processing unit is further configured to generate the plurality of clusters by:
generating a plurality of interactions by processing interaction data of a plurality of users;
splitting the plurality of interactions into one or more datasets, wherein each dataset comprises a predefined number of interactions; and
generating the plurality of clusters by processing the one or more datasets such that similar keywords and/or strings are clustered together.