
A System And Method Of Embedding A Creative Content With An Image

Abstract: Disclosed herein is a system and method for embedding a creative content with an image. The system monitors a current conversation between a pair of users of a plurality of users and determines current conversation data based on the monitoring, the current conversation data comprising messages and corresponding responses captured during the current conversation. It applies a natural language processing technique to the current conversation data to determine a current context of the current conversation between the pair of users, and applies a pre-learnt relationship model to the current context to determine an intent of the current conversation, the pre-learnt relationship model indicating a relationship between one user and another user of the pair. It identifies a creative content based on the intent and embeds the creative content with an image of at least one of the pair of users to generate an emoji.


Patent Information

Application #
202011000466
Filing Date
06 January 2020
Publication Number
36/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ipo@knspartners.com
Parent Application

Applicants

HIKE PRIVATE LIMITED
4th Floor, Indira Gandhi International Airport, Worldmark 1, Northern Access Rd, Aerocity, New Delhi, Delhi 110037, India

Inventors

1. Dipankar Sarkar
4th Floor, Indira Gandhi International Airport, Worldmark 1, Northern Access Rd, Aerocity, New Delhi, Delhi 110037, India
2. Kavin Bharti Mittal
4th Floor, Indira Gandhi International Airport, Worldmark 1, Northern Access Rd, Aerocity, New Delhi, Delhi 110037, India

Specification

TECHNICAL FIELD
[0001] The present disclosure relates to instant messaging, and more particularly to a system and method for understanding human relationships and generating emojis.
BACKGROUND
[0002] Consumer use of messaging has shown remarkable growth over the past several years. At the same time, consumer use of emojis/avatars and other digital media in one-to-one messaging systems has shown the strongest growth in the category. Avatars/emojis help express the real emotional reaction of a user while communicating over instant messaging services. With the advancement of technology, a user is now able to generate avatars/emojis from his selfie image.
[0003] For the generation of avatars/emojis from a selfie image, a user captures an image and sends it to a server. The server uses several techniques to generate avatars/emojis from the received user image.
[0004] However, the existing techniques allow a user, the broadcaster, interacting with an apparatus via a message editor, to select an emoji from a palette consisting of a fixed or pre-existing number of emojis, embed it into the message editor of the messaging client, and transmit the message to a plurality of users, the recipients.
[0005] As used herein, the term "emoji palette" means a segment of a user interface module in messaging and instant messaging client applications that presents a user interface used to display selectable emojis to the user. Characteristically, these emojis have a pre-defined set of attributes such as shape, size and color and are retrieved from a local or remote emoticon object store.
[0006] Also, the existing techniques do not take into consideration the intent of a user having a conversation with another user while creating emojis. It is challenging to create a sticker with additional content that syncs with the intent of the user. Determining the correct intent becomes critical when suggesting content to the users, for example an advertisement. There may be a variety of products whose advertisements may be provided to the users while chatting. However, the challenge lies in deciding "which" advertisement to select and "when" to provide it to the users. And before deciding the advertisement, it is important to correctly determine the intent of the chat between the users. In conventional techniques, the intent may be determined using the message content. However, this technique may still lack correctness because there may be several factors in the background which are not considered. As the chatting happens in real time, the technical challenge lies in how to correlate the current data with various factors associated with the users. The users may have different relationships amongst them, ranging from professional to friend to family and so on, for example father-son, father-daughter, manager-employee, school friends, college friends and the like. Hence, it becomes technically challenging to understand the human relationship before suggesting any advertisement.
OBJECT OF THE INVENTION
[0007] An object of the present disclosure is to correctly understand the relationship between the users before deciding upon a creative content (advertisement).
[0008] Another object of the present disclosure is to embed the creative content with a user's image to generate an emoji.
SUMMARY
[0009] The present disclosure overcomes one or more shortcomings of the prior art and provides additional advantages discussed throughout the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
[0010] In an embodiment of the present disclosure, a method of embedding a creative content with an image is disclosed. The method comprises monitoring a current conversation between a pair of users of a plurality of users. The method further comprises determining current conversation data based on the monitoring of the current conversation. The current conversation data comprises messages and corresponding responses captured during the current conversation. The method further comprises applying a Natural language processing (NLP) technique upon the current conversation data to determine a current context of the current conversation happening between the pair of users. The method further comprises applying a pre-learnt relationship model upon the current context to determine an intent of the current conversation between the pair of users. The pre-learnt relationship model indicates a relationship between one user and another user of the pair of users. The method further comprises identifying a creative content based on the intent. The method further comprises embedding the creative content with an image of at least one of the pair of users to generate an emoji.
[0011] In another embodiment of the present disclosure, a system for embedding a creative content with an image is disclosed. The system comprises a monitoring unit to monitor a current conversation between a pair of users of a plurality of users. The system further comprises a determining unit to determine current conversation data based on the monitoring of the current conversation. The current conversation data comprises messages and corresponding responses captured during the current conversation. The system further comprises an applying unit to apply a Natural language processing (NLP) technique upon the current conversation data to determine a current context of the current conversation happening between the pair of users. The applying unit further applies a pre-learnt relationship model upon the current context to determine an intent of the current conversation between the pair of users, wherein the pre-learnt relationship model indicates a relationship between one user and another user of the pair of users. The system further comprises an identifying unit to identify a creative content based on the intent. The system also comprises a content embedding unit to embed the creative content with an image of at least one of the pair of users to generate an emoji.
[0012] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed embodiments. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
[0014] Figure 1A shows an environment 100 for embedding a creative content with an image, in accordance with an embodiment of the present disclosure;
[0015] Figure 1B shows an example of a chat between two users and the generation of an emoji;
[0016] Figure 2 shows a block diagram 200 illustrating a system for embedding a creative content with an image, in accordance with an embodiment of the present disclosure; and
[0017] Figure 3 shows a method 300 for embedding a creative content with an image, in accordance with an embodiment of the present disclosure.
[0018] The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
[0019] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0020] In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0021] Disclosed herein is a system and method for embedding a creative content with an image. When people chat with each other they share their feelings, emotions and information. The level of such sharing varies from one person to another depending upon various relationship factors: for example, how well they know each other, and whether they share a professional relationship, a friendship or a family relationship. Even within these relationships there may be multiple levels, like school-time friends, college-time friends, a professional relationship turned into friendship and so on. It therefore becomes quite difficult to understand the relationship and intent between two people merely by using their chat details. Correctly understanding the relationship becomes crucial when something has to be implemented based on it. The present disclosure aims to first select a suitable creative content (i.e., advertisement) after correctly understanding the intent, and then create an emoji by embedding that advertisement into the user's image. One benefit of creating such an emoji may be to provide a real-world experience for the users. Another benefit may be to help the user in deciding whether to go with the product related to the displayed advertisement or not. How the above benefits are achieved is explained in the upcoming paragraphs of the specification.
[0022] Figure 1A shows an exemplary environment 100 for embedding a creative content with an image, in accordance with an embodiment of the present disclosure. It must be understood by a person skilled in the art that the system may also be implemented in various environments other than as shown in Fig. 1A.
[0023] The exemplary environment 100 is explained in conjunction with Figure 2, which shows a block diagram 200 of a system 202 for embedding a creative content with an image, in accordance with an embodiment of the present disclosure. Although the present disclosure is explained considering that the system 202 is implemented on a server, it may be understood that the system 202 may be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, or a cloud-based computing environment. It may be understood that the system 202 may be accessed by multiple users through one or more user devices 224 or applications residing on the user devices. In one implementation, the system 202 may comprise the cloud-based computing environment in which a user may operate individual computing systems configured to execute remotely located applications. Examples of the user devices 224 may include, but are not limited to, an IoT device, an IoT gateway, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 224 are communicatively coupled to the system 202 through a network 222.
[0024] In one implementation, the network 222 may be a wireless network, a wired network or a combination thereof. The network 222 can be implemented as one of the different types of networks, such as an intranet, a local area network (LAN), a wide area network (WAN), the internet, and the like. The network 222 may either be a dedicated network or a shared network. A shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Hypertext Transfer Protocol Secure (HTTPS), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 222 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0025] In one implementation, the system 202 may comprise an I/O interface 204, a processor 206, a memory 208 and the units 210. The memory 208 may be communicatively coupled to the processor 206 and the units 210. The processor 206 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 206 is configured to fetch and execute computer-readable instructions stored in the memory 208. The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 202 to interact with the user directly or through the user devices 224. Further, the I/O interface 204 may enable the system 202 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting many devices to one another or to another server.
[0026] In one implementation, the units 210 may comprise a monitoring unit 212, a determination unit 214, a relationship generation unit 216, an identification unit 218, and a content embedding unit 220. According to embodiments of the present disclosure, these units 212-220 may comprise hardware components like processors, microprocessors, microcontrollers, or application-specific integrated circuits for performing various operations of the system 202. It must be understood by a person skilled in the art that the processor 206 may also perform all the functions of the units 212-220 according to various embodiments of the present disclosure.
[0027] Now referring back to figure 1A, it can be observed that a user "John" is interacting with other users "Smith", "Lucy" and "Merry". At the right-hand side of Figure 1A is a relationship barometer with a pointer (an arrow in the downward direction) representing the past relationship of John with Smith, Lucy and Merry. The pointer helps the system 202 understand the bias of the relationship. For example, the first relationship barometer shows that the relationship between John and Smith is more towards family, whereas the third relationship barometer shows that the relationship between John and Merry is more towards a professional relationship. It must be understood by a skilled person that the pointer may shift to either side depending upon the development of the relationship between the users. For ease of understanding, the relationship barometer shows some standard relationships like professional, friends, and family. However, it may be understood by a skilled person that the present disclosure may be implemented on various other types of relationships not shown in figure 1A.
[0028] The following description explains the embodiments of the invention referring to the interaction between John and Merry only. However, it may be understood by a skilled person that the above-mentioned scenario is merely an example, and there may be multiple other scenarios in which the present disclosure may be implemented. For example, the user may be interacting with multiple users in a group chat interface.
[0029] Initially, the system 202 may be trained based on the past conversations of the users to understand the relationships between them. As shown in figure 1A, John is interacting with Smith, Lucy and Merry. User data related to the users John, Smith, Merry and Lucy may be created and stored in the memory (208) of the system (202). The user data may comprise the age, name, gender, demographic and profession associated with each user. The demographic data may comprise location, address, marital status, and interests of the user, such as in sports, movies, social media, vacation spots, etc. The relationship generation unit 216 receives the user data from the memory (208).
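
As an illustration only (the specification does not prescribe a data structure), the user data described above might be represented as a simple record; the field names and example values below are assumptions for the sketch:

```python
# A minimal sketch of the user data described above. Field names and example
# values are illustrative assumptions, not part of the specification.
from dataclasses import dataclass, field

@dataclass
class UserData:
    name: str
    age: int
    gender: str
    profession: str
    demographic: dict = field(default_factory=dict)  # location, marital status, interests, ...

john = UserData("John", 26, "M", "software engineer",
                {"location": "New Delhi", "interests": ["coffee", "movies"]})
merry = UserData("Merry", 27, "F", "software engineer",
                 {"location": "New Delhi", "interests": ["music", "travel"]})
```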
[0030] According to an embodiment, the relationship generation unit 216 may first receive a plurality of user data corresponding to the plurality of users (John, Smith, Lucy, and Merry). This helps the system 202 to have a fair understanding of the background of each user. Then, the relationship generation unit 216 may monitor a plurality of past conversation data based on past conversations that happened between the plurality of users over a predefined time interval. The past conversation data may correspond to at least a pair of users of the plurality of users. For example, the past conversation data may correspond to the past conversations between John and Smith, John and Lucy, or John and Merry. The past conversation data may comprise textual data, audio data, video data, and graphical data. Each past conversation data indicates multiple instances of past conversation between the pair of users during the predefined time interval. The past conversation data may be stored in the memory (208).
[0031] Referring to the John and Merry example, the relationship generation unit 216 may monitor the past conversations between John and Merry over the last one month, six months, or any predefined time interval. It may be understood by a skilled person that the predefined time interval may vary from a few hours to several days, weeks, months or years. Once the past conversation data is collected, in the next step the relationship generation unit 216 may apply a Natural language processing (NLP) technique upon the past conversation data to determine a context of each instance of the past conversation between the pair of users. Further, the context is analyzed relative to the user data of the pair of users. Based on the analysis, the relationship generation unit 216 generates a pre-learnt relationship model for John and Merry. Similarly, a pre-learnt relationship model may be generated by analyzing the past conversations and considering the user data for the other combinations - John and Lucy, or John and Smith. As discussed above, the pre-learnt relationship model generated for each combination/pair of users is shown using a relationship barometer at the right-hand side of Figure 1A. The pre-learnt relationship model may be stored in the memory (208).
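
By way of a hedged sketch, a simple keyword-overlap classifier can stand in for the NLP technique applied to each instance of past conversation (the specification does not fix a particular technique); the keyword sets below are assumptions drawn from the examples in the following paragraphs:

```python
# Stand-in for the NLP step: classify each past-conversation instance into a
# context by keyword overlap. Keyword sets are illustrative assumptions.
CONTEXT_KEYWORDS = {
    "professional": {"meeting", "office", "project", "presentation",
                     "boss", "colleagues", "deadline"},
    "friend":       {"movie", "party", "weekend", "interest", "like"},
    "family":       {"mother", "father", "family", "dinner", "home"},
}

def context_of(message: str) -> str:
    """Return the context whose keyword set overlaps the message the most."""
    words = set(message.lower().split())
    scores = {ctx: len(words & kws) for ctx, kws in CONTEXT_KEYWORDS.items()}
    return max(scores, key=scores.get)

# One context label per instance within the predefined time interval.
past = ["Meeting with the boss about the project deadline",
        "Slides are ready for the presentation"]
contexts = [context_of(m) for m in past]   # ['professional', 'professional']
```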
[0032] Now, let us consider that John and Merry have been interacting with each other for the last one month (the predefined time period). John and Merry are both professionals and work in the same company. The relationship generation unit 216 monitors each conversation between John and Merry in the last one month. While monitoring, the relationship generation unit 216 may also monitor the day, date, and time of the conversation. Further, the relationship generation unit 216 may also monitor the format of the conversation data, such as whether the data is video, audio, text, image or a combination of these. Then, the NLP technique may be applied to determine the context of each instance of the past conversation.
[0033] For example, John and Merry work in the same company and have talked about work in the past. The relationship generation unit 216 applies the NLP technique on each conversation between John and Merry and determines that the context of the conversation was related to their "profession" or "work". In an exemplary embodiment, the relationship generation unit 216 may determine this context based on words/sentences used in the conversation such as meeting, office, project, presentation, boss, colleagues, deadlines, etc. After determining the context, the relationship generation unit 216 may analyze the context relative to the user data of the pair of users. In the exemplary scenario, the relationship generation unit 216 analyses the context, i.e. profession or work, with the user data of John and Merry. The user data of John and Merry shows that both work in the same company and belong to a similar age group, for example 25-28 years. The relationship generation unit 216 may determine a relationship level between John and Merry based on their user data and the context. As shown in figure 1A, the relationship level may have three levels, which may include, but are not limited to, professional, friend and family. Based on the above analysis of context and user data, the relationship generation unit 216 determines that the relationship level between John and Merry is more towards the professional relationship, as indicated by the pointer in figure 1A.
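
Continuing the earlier sketch, the relationship level (the barometer position) could be an aggregate of the per-instance contexts, nudged by the user-data analysis described above (same company, similar age group); the weighting is an assumption:

```python
# Continues the earlier sketch (uses CONTEXT_KEYWORDS, context_of, contexts,
# john, merry). Aggregates per-instance contexts into a relationship level.
from collections import Counter

def relationship_level(contexts: list, user_a, user_b) -> dict:
    """Return a distribution over relationship levels -- the barometer
    position for this pair of users (assumed representation)."""
    counts = Counter(contexts)
    total = sum(counts.values()) or 1
    level = {ctx: counts[ctx] / total for ctx in CONTEXT_KEYWORDS}
    # Illustrative user-data adjustment: a shared profession nudges the
    # pointer towards "professional", as in the John/Merry analysis above.
    if user_a.profession == user_b.profession:
        level["professional"] += 0.1
    norm = sum(level.values())
    return {k: v / norm for k, v in level.items()}

model = relationship_level(contexts, john, merry)
# -> pointer leaning towards 'professional', matching figure 1A
```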
[0034] Once the system 202 is trained and understands the relationship between John and Merry, it may be deployed in real time. The upcoming paragraphs explain in detail how the system 202 utilizes its understanding of the past relationship between the pair of users (John and Merry in this case) to determine the intent, and thereafter generate an emoji using a suitable creative content (advertisement).
[0035] In an exemplary embodiment, the monitoring unit 212 monitors the current conversation between John and Merry. Then, the determining unit 214 determines current conversation data based on the monitoring. In an embodiment, the current conversation data comprises messages and corresponding responses captured during the current conversation. In the example shown in figure 1B, it can be observed that John has asked Merry "Let us meet for a drink today". In response, Merry has said "Yes". The conversation data in this case may be determined as "meet for a drink today" and "yes". It should be apparent to a person skilled in the art that the above is just one example of a possible conversation between users. After determining the conversation data, the relationship generation unit 216 applies the Natural language processing (NLP) technique upon the current conversation data to determine a current context of the current conversation happening between John and Merry. In this case, the relationship generation unit 216 determines that Merry has accepted John's proposal for having a drink. However, the technical challenge here is: what kind of drink are they both referring to? Do they want to go for a coffee, or do they want to go for a beer? Merely using the message content, it may be difficult to predict what kind of drink is being referred to. The present disclosure addresses this technical challenge by using the pre-learnt relationship model (as discussed above) and applying it to the current conversation data.
[0036] For example, the relationship generation unit 216 may apply the pre-learnt relationship model upon the current context to determine an intent of the current conversation between John and Merry. As discussed above, the relationship generation unit 216 has determined that the relationship level between John and Merry is "professional". Considering John and Merry as professionals, the relationship generation unit 216 determines the intent that the kind of drink referred to in their chat may be "coffee" or "tea", but not beer or any hard drink. However, it may so happen that the relationship level between John and Merry shifts towards the "friendship level", and therefore the intent may be determined as "having a beer".
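
In the terms of the earlier sketches, this disambiguation step might look as follows; the intent table mapping a (topic, relationship level) pair to a concrete intent is purely an assumption for illustration:

```python
# Continues the earlier sketch (uses `model`). Maps the current context to a
# concrete intent via the dominant relationship level. Table is illustrative.
INTENT_TABLE = {
    ("drink", "professional"): "having a coffee",
    ("drink", "friend"):       "having a beer",
}

def intent_of(current_context: str, model: dict) -> str:
    """Pick the intent for the current context given the barometer position."""
    dominant = max(model, key=model.get)   # e.g. 'professional'
    return INTENT_TABLE.get((current_context, dominant), "unknown")

intent = intent_of("drink", model)   # 'having a coffee' while the pointer is
                                     # on the professional side; 'having a
                                     # beer' once it shifts towards friend
```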
[0037] The above change in their relationship may happen based on a change in their conversations. For example, John and Merry may now have started to talk about their personal lives, and the conversations between them show this change. For example, John and Merry may have started talking about their families and friends. The conversation data may include words like friend, family, mother, father, interest, like, movie, etc. The relationship generation unit 216 continuously monitors the conversation and determines that the context of the conversation has become personal. Accordingly, the relationship level between John and Merry may shift from professional towards friend. The shift in their relationship is indicated in the bottommost relationship barometer using a dotted arrow in fig. 1A.
[0038] Now, depending upon the scenario, in the next step the identification unit 218 may identify a creative content (advertisement) based on the intent. For example, the creative content may be "Nescafe" when the intent is "having a coffee". In another embodiment, the creative content may be "Tuborg" when the intent is "having a beer". The identification unit 218 identifies the creative content based on an analysis of different parameters. For example, different content providers may participate in bidding to provide their content to the mobile users. The identification unit 218 may select a set of bids amongst a plurality of bids based on the intent of the user in the current conversation. For example, the identification unit 218 may select the set of bids which provide content related to "coffee". After selecting the set of bids, the identification unit 218 analyzes the set of bids based on bidding criteria which may include, but are not limited to, the user's web history and the geographic location of the user. The identification unit 218 then selects at least one bid, amongst the set of bids, satisfying the bidding criteria. For example, the identification unit 218 may check the user's web history and determine that the user, i.e. John, likes "Starbucks™" as he has searched for "Starbucks™" locations. Accordingly, the identification unit 218 may select the bid related to "Starbucks™". In an embodiment, the creative content providers may be different businesses participating in advertising their products/brands.
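
A hedged sketch of this bid-selection step follows; the Bid fields and the substring check against the web history are assumptions, not the specification's method:

```python
# Illustrative bid selection: filter by intent, then apply the bidding
# criteria (user web history, geographic location). All fields are assumed.
from dataclasses import dataclass

@dataclass
class Bid:
    provider: str   # e.g. "Starbucks", "Nescafe", "Tuborg"
    category: str   # content category the bid offers, e.g. "coffee"
    regions: set    # geographic locations the bid covers

def select_bid(bids, intent_category, user_location, web_history):
    """Select the set of bids matching the intent, then pick one satisfying
    the bidding criteria."""
    candidates = [b for b in bids if b.category == intent_category]
    for bid in candidates:
        if user_location in bid.regions and bid.provider.lower() in web_history.lower():
            return bid
    return candidates[0] if candidates else None

bids = [Bid("Nescafe", "coffee", {"New Delhi"}),
        Bid("Starbucks", "coffee", {"New Delhi"}),
        Bid("Tuborg", "beer", {"New Delhi"})]
chosen = select_bid(bids, "coffee", "New Delhi",
                    web_history="Starbucks locations near me")  # -> Starbucks bid
```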
[0039] In an embodiment, the creative content may be an image of the content, a link to the website of the content provider, a link to a discount voucher for accessing/getting the content, or a link to directions/a map to the location where the content is available. It should be apparent to a person skilled in the art that the content could be represented in other forms also, and the above are just a few examples.
[0040] After identifying the content, the content embedding unit 220 embeds the creative content with an image of at least one of the pair of users to generate an emoji. In the above-discussed example, if the creative content is "Starbucks™", then the content embedding unit 220 embeds the image of John having coffee in a Starbucks™ mug, to generate an emoji of John. In an alternative embodiment, the content embedding unit 220 may embed the image of Merry having coffee in a Starbucks™ mug, to generate an emoji of Merry. In another alternative embodiment, the content embedding unit 220 may separately embed the images of both John and Merry having coffee in a Starbucks™ mug, to generate emojis of John and Merry. The embedded image may be displayed on the user devices.
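
The embedding itself is not prescribed by the specification; as one possible sketch, Pillow can alpha-composite a transparent creative-content overlay (e.g., a branded mug) onto the user's image. File names below are hypothetical:

```python
# A minimal compositing sketch using Pillow (assumed tooling). The creative
# content is assumed to be a PNG overlay with transparency.
from PIL import Image

def embed_content(user_image_path: str, content_path: str,
                  position=(0, 0)) -> Image.Image:
    """Paste the creative content onto the user's image to form the emoji."""
    base = Image.open(user_image_path).convert("RGBA")
    overlay = Image.open(content_path).convert("RGBA")
    base.paste(overlay, position, mask=overlay)  # respect the overlay's alpha
    return base

# Hypothetical usage:
# emoji = embed_content("john_selfie.png", "starbucks_mug.png", position=(40, 120))
# emoji.save("john_emoji.png")
```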
[0041] In an embodiment, the creative content may either be stored in a content database in the memory (208) or may be received from a database external to the system 202. In this way, the present disclosure provides a real-world experience to the users by creating an emoji that embeds the image of the user with a creative content related to the intent of the user.
[0042] Figure 3 depicts a method 300 for embedding a creative content with an image, in accordance with an embodiment of the present disclosure. As illustrated in figure 3, the method 300 includes one or more blocks illustrating a method for embedding a creative content with an image. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform specific functions or implement specific abstract data types.
[0043] The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described.
[0044] At block 302, the method 300 may include monitoring a current conversation between a pair of users of a plurality of users.
[0045] At block 304, the method 300 may include determining current conversation data based on the monitoring of the current conversation, wherein the current conversation data comprises messages and corresponding responses captured during the current conversation.
[0046] At block 306, the method 300 may include applying a Natural language processing (NLP) technique upon the current conversation data to determine a current context of the current conversation happening between the pair of users.
[0047] At block 308, the method 300 may include applying the pre-learnt relationship model upon the current context to determine an intent of the current conversation between the pair of users, wherein the pre-learnt relationship model indicates a relationship between one user and another user of the pair of users.
[0048] At block 310, the method 300 may include identifying a creative content based on the intent.
[0049] At block 312, the method 300 may include embedding the creative content with an image of at least one of the pair of users to generate an emoji.
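
Chaining the earlier sketches, blocks 302-312 might be orchestrated as below; monitor_conversation is a hypothetical stub, and the context/category rules are toy stand-ins rather than the specification's NLP step:

```python
# End-to-end sketch of method 300, reusing context_of, intent_of, select_bid,
# bids and model from the earlier sketches. Helpers here are hypothetical.
def monitor_conversation(pair_id: str) -> list:
    """Hypothetical stub for blocks 302-304: captured messages/responses."""
    return ["Let us meet for a drink today", "Yes"]

def method_300(pair_id: str, model: dict):
    data = monitor_conversation(pair_id)                        # blocks 302-304
    text = " ".join(data).lower()
    context = "drink" if "drink" in text else context_of(text)  # block 306 (toy rule)
    intent = intent_of(context, model)                          # block 308
    category = "coffee" if "coffee" in intent else "beer"
    bid = select_bid(bids, category, "New Delhi",
                     web_history="Starbucks locations")         # block 310
    # block 312 would call embed_content(user_selfie, content_image_of(bid))
    return intent, bid

intent, bid = method_300("john-merry", model)  # ('having a coffee', Starbucks bid)
```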
[0050] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
[0051] When a single device or article is described herein, it will be clear that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be clear that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may alternatively be embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
[0052] Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
[0053] While various aspects and embodiments have been disclosed herein, other aspects
and embodiments will be apparent to those skilled in the art. The various aspects and
embodiments disclosed herein are for purposes of illustration and are not intended to be
limiting, with the true scope and spirit being indicated by the following claims.
[0054] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise.
[0055] Advantages of the embodiments of the present disclosure are illustrated herein:
1. Correctly determining the intent of a user in a conversation
2. Providing a real-world experience for the users
3. Helping a user decide whether to go with a particular product

We Claim:
1. A method (300) of embedding a creative content with an image, the method comprising:
monitoring (302) a current conversation between a pair of users of a plurality of users;
determining (304) current conversation data based on the monitoring of the current conversation, wherein the current conversation data comprises messages and corresponding responses captured during the current conversation;
applying (306) a Natural language processing (NLP) technique upon the current conversation data to determine a current context of the current conversation happening between the pair of users;
applying (308) a pre-learnt relationship model upon the current context to determine an intent of the current conversation between the pair of users, wherein the pre-learnt relationship model indicates a relationship between one user and another user of the pair of users;
identifying (310) a creative content based on the intent; and
embedding (312) the creative content with an image of at least one of the pair of users to generate an emoji.
2. The method (300) as claimed in claim 1, wherein the pre-learnt relationship model is generated by:
receiving a plurality of user data corresponding to the plurality of users;
monitoring a plurality of past conversation data for a predefined time interval, such that each past conversation data corresponds to the pair of users of the plurality of users;
applying the Natural language processing (NLP) technique upon the past conversation data to determine a context of each instance of the past conversation between the pair of users; and
analysing the context relative to the user data of the pair of users.
3. The method (300) as claimed in claim 2, wherein each user data comprises at least one of age, name, gender, demographic, and profession associated with each user.
4. The method (300) as claimed in claim 2, wherein each past conversation data comprises at least one of textual data, audio data, video data, and graphical data, and wherein each past conversation data indicates multiple instances of past conversation between the pair of users during the predefined time interval.
5. The method (300) as claimed in claim 1, wherein identifying the creative content further comprises:
selecting a set of bids amongst a plurality of bids based on the intent of the user in the current conversation;
analyzing the set of bids based on bidding criteria comprising at least one of user web history and geographic location of the user; and
selecting at least one bid, amongst the set of bids, satisfying the bidding criteria, wherein the creative content is identified corresponding to the selected at least one bid.
6. A system (202) for embedding a creative content with an image, the system comprising:
a monitoring unit (212) to monitor a current conversation between a pair of users of a plurality of users;
a determining unit (214) to determine current conversation data based on monitoring of the current conversation, wherein the current conversation data comprises messages and corresponding responses captured during the current conversation;
a relationship generating unit (216) to apply a Natural language processing (NLP) technique upon the current conversation data to determine a current context of the current conversation happening between the pair of users;
the relationship generating unit (216) to apply a pre-learnt relationship model upon the current context to determine an intent of the current conversation between the pair of users, wherein the pre-learnt relationship model indicates a relationship between one user and another user of the pair of users;
an identifying unit (218) to identify a creative content based on the intent; and
a content embedding unit (220) to embed the creative content with an image of at least one of the pair of users to generate an emoji.
7. The system (202) as claimed in claim 6, wherein the relationship generating unit (216) generates the pre-learnt relationship model by:
receiving a plurality of user data corresponding to the plurality of users;
monitoring a plurality of past conversation data, for a predefined time interval, such that each past conversation data corresponds to the pair of users of the plurality of users;
applying the Natural language processing (NLP) technique upon the past conversation data to determine a context of each instance of the past conversation between the pair of users; and
analysing the context relative to the user data of the pair of users.
8. The system (202) as claimed in claim 7, wherein each user data comprises at least one of age, name, gender, demographic, and profession associated with each user.
9. The system (202) as claimed in claim 7, wherein each past conversation data comprises at least one of textual data, audio data, video data, and graphical data, and wherein each past conversation data indicates multiple instances of past conversation between the pair of users during the predefined time interval.
10. The system (202) as claimed in claim 6, wherein the identifying unit (218) identifies the creative content by:
selecting a set of bids amongst a plurality of bids based on the intent of the user in the current conversation;
analyzing the set of bids based on bidding criteria comprising at least one of user web history and geographic location of the user; and
selecting at least one bid, amongst the set of bids, satisfying the bidding criteria, wherein the creative content is identified corresponding to the selected at least one bid.

Documents

Application Documents

# Name Date
1 202011000466-STATEMENT OF UNDERTAKING (FORM 3) [06-01-2020(online)].pdf 2020-01-06
2 202011000466-PROVISIONAL SPECIFICATION [06-01-2020(online)].pdf 2020-01-06
3 202011000466-FORM 1 [06-01-2020(online)].pdf 2020-01-06
4 202011000466-DRAWINGS [06-01-2020(online)].pdf 2020-01-06
5 202011000466-DECLARATION OF INVENTORSHIP (FORM 5) [06-01-2020(online)].pdf 2020-01-06
6 202011000466-FORM-26 [08-01-2020(online)].pdf 2020-01-08
7 abstract.jpg 2020-01-17
8 202011000466-Proof of Right [07-02-2020(online)].pdf 2020-02-07
9 202011000466-DRAWING [04-01-2021(online)].pdf 2021-01-04
10 202011000466-CORRESPONDENCE-OTHERS [04-01-2021(online)].pdf 2021-01-04
11 202011000466-COMPLETE SPECIFICATION [04-01-2021(online)].pdf 2021-01-04
12 202011000466-RELEVANT DOCUMENTS [16-03-2021(online)].pdf 2021-03-16
13 202011000466-FORM 13 [16-03-2021(online)].pdf 2021-03-16
14 202011000466-FORM 18 [26-10-2023(online)].pdf 2023-10-26
15 202011000466-FER.pdf 2025-04-01
16 202011000466-FORM 3 [23-05-2025(online)].pdf 2025-05-23
17 202011000466-OTHERS [01-10-2025(online)].pdf 2025-10-01
18 202011000466-FORM-26 [01-10-2025(online)].pdf 2025-10-01
19 202011000466-FER_SER_REPLY [01-10-2025(online)].pdf 2025-10-01
20 202011000466-DRAWING [01-10-2025(online)].pdf 2025-10-01
21 202011000466-CLAIMS [01-10-2025(online)].pdf 2025-10-01

Search Strategy

1 SearchHistoryE_11-06-2024.pdf