
System And Method For Robot Initiated Personalised Conversation With A User

Abstract: There is provided a method that includes (i) detecting, using sensor units (111) of a robot (108), at least one of environmental parameters, an environment event, a period of the environment event, an environment alert, a time of an event related to the user (102), personal events related to the user (102) or to the user's relatives, friends, or acquaintances, an outdoor environment location, an apparel of the user (102), audio events or visual events, or news, based on the surrounding and proclivity of the user (102); and (ii) conversing or interacting with the user (102) based on a conversation topic related to the detected environmental parameters, environment event, period of the environment event, environment alert, time of the event related to the user (102), personal events related to the user (102) or to the user's relatives, friends, or acquaintances, outdoor environment location, apparel of the user (102), audio events or visual events, or news. FIG. 8


Patent Information

Filing Date: 24 March 2021
Publication Number: 39/2022
Publication Type: INA
Invention Field: ELECTRONICS
Email: ipo@myipstrategy.com
Grant Date: 2023-11-14

Applicants

RN CHIDAKASHI TECHNOLOGIES PVT. LTD
#4, STAMBHTIRTH, PLOT #82, R. A. KIDWAI ROAD, WADALA (WEST). MUMBAI – 400031, MAHARASHTRA, INDIA

Inventors

1. Mr. Prashant Iyengar
202, Romell Umiya Grandeur, Vishweshwar Road, Near Udipi Vihar Hotel, Goregaon East, Mumbai, Maharashtra, India - 400063
2. Mr. Hardik Godara
B1-504, Victorian Palace, Kheme-Ka-Kuwa, Near AIIMS Road, Jodhpur, Rajasthan, India - 342008

Specification

environment parameters to determine whether the environment events occurred for a first time or occur abnormally. The anomaly detection may detect environment parameters such as a first instance of rain, snow, hail, spring, fog, or an occurrence of storms, cyclones, or tsunamis. The conversation knowledge base 110 stores and locates the information when required to initiate a conversation based on the environment. For example, during a first instance of rain, the robot 108 may initiate a conversation based on the rainy season. The anomaly detection module 206 detects an anomaly in the event of more than the average rainfall occurring in a stipulated amount of time, and the robot 108 initiates a conversation on the high rainfall. For example, the robot 108 says to the user 102 "Heavy rain this month. It may lead to a flood like the 2016 flood. I am worried about the flood."
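The above-average-rainfall check described above can be sketched as a simple threshold test. This is a minimal illustration only; the function name, the anomaly factor, and the comparison policy are assumptions, not details given in the specification.

```python
def detect_rainfall_anomaly(daily_rainfall_mm, historical_daily_avg_mm, factor=1.5):
    """Flag an anomaly when total rainfall over the stipulated window
    exceeds the historical average by a chosen factor (illustrative policy)."""
    observed = sum(daily_rainfall_mm)
    expected = historical_daily_avg_mm * len(daily_rainfall_mm)
    return observed > factor * expected

# A month averaging 30 mm/day against a 10 mm/day historical average
heavy_month = [30.0] * 30
print(detect_rainfall_anomaly(heavy_month, 10.0))
```

On detecting such an anomaly, the robot would select the "high rainfall" topic from the conversation knowledge base 110.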
[0070] The robot 108 detects that the rainy season has ended by identifying that no rainfall has occurred for a significant duration of time and initiates a conversation with the user 102 on the ending of the rainy season. The warning module 208 issues alerts from the weather tracking unit 112 regarding fog, cyclones, storms, or tsunamis, and the robot 108 initiates a conversation with the user 102 regarding the environmental alert. The environmental alert may include an alert about fog, cyclones, storms, tsunamis, floods, earthquakes, landslides, volcanic activity, extreme temperatures, drought, or wildfires. The identified environmental events may be used as conversational topics in the ontology relating the robot 108 and the user 102. For example, the robot 108 says to the user 102 "Hurry up. I have seen flash news just now regarding an earthquake. Let's prepare ourselves to face the earthquake."
[0071] In some embodiments, the robot 108 detects at least one of the environment parameters, the environment events, the period of the environmental event, an anomaly environment event, or the environment alert that belongs to a location of interest of the user 102, and initiates a conversation with the user 102 related to the detected environment parameters, events, alerts, period, or anomaly event that belongs to the location of interest of the user 102. The location of interest of the user 102 may be any place, state, or country. For example, the robot 108 says to the user 102 "Heavy rain this month in your friend's place. It may lead to a flood like the 2016 flood. I am worried about the flood."
[0072] FIG. 3 is a block diagram that illustrates the robot 108 of FIG. 1 for initiating a personalized conversation with the user 102 based on an event and an event time related to the user 102, according to some embodiments herein. The robot 108 includes the conversation knowledge base 110 that includes conversation topics related to events and event times, an identification module 302, a tracking module 304, and a calendar module 306. The identification module 302 may use identification tools to verify the authenticity of the user 102. In an embodiment, the identification tool may consist of face recognition software, pin code software, fingerprint detection, and so on.
[0073] The identification module 302 may include an imaging unit, or a hearing unit such as a microphone. The tracking module 304 may use at least one of the GPS unit 114, WiFi access points, Bluetooth access points, or any indoor tracking unit to track and identify a location of the user 102. The robot 108 identifies the time of day while the robot 108 is locating the user 102 during the day. For example, if the robot 108 locates the user 102 for a first time during the day, the robot 108 initiates a conversation by greeting the user 102 based on the time of day. For example, based on the time of day in the surroundings of the user 102, the robot 108 may wish the user good morning or good afternoon and so on. For example, the robot 108 says to the user 102 "Good morning" if the robot 108 meets the user 102 for the first time that day in the morning. The robot 108 updates information in the conversation knowledge base 110 based on the input received from the calendar module 306.
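The first-sighting greeting described above can be sketched as a small stateful check. The hour boundaries and the `greeted_today` bookkeeping are illustrative assumptions rather than details of the embodiment.

```python
def greeting_for_hour(hour):
    """Map the hour of day (0-23) to a greeting; the boundaries
    are illustrative assumptions, not specified by the embodiment."""
    if 5 <= hour < 12:
        return "Good morning"
    if 12 <= hour < 17:
        return "Good afternoon"
    if 17 <= hour < 21:
        return "Good evening"
    return "Good night"

greeted_today = set()  # user ids already greeted since midnight

def maybe_greet(user_id, hour):
    """Greet only on the first sighting of the user during the day."""
    if user_id in greeted_today:
        return None
    greeted_today.add(user_id)
    return greeting_for_hour(hour)
```

A real system would reset `greeted_today` at midnight, for example from the calendar module.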
[0074] The calendar module 306 detects at least one of the personal events related to the user 102 on the day or in the upcoming days, or the personal events related to the user's relatives or friends or acquaintances on the day or in the upcoming days. The personal events may include, but are not limited to, birthdays, weddings, anniversaries, family events, holidays, festivals, scheduled meetings or events, public holidays, bank holidays, or preferences of the user 102. The calendar module 306 identifies and stores user 102 information regarding an age, a date of birth, anniversaries, and so on. In an example, the robot 108 may wish or greet the user 102 with birthday wishes and initiate a conversation regarding the user's 102 birthday, if the calendar module 306 identifies that today is the birthday of the user 102. For example, the robot 108 says to the user 102 "Many happy returns of the day. Last year, we celebrated your birthday in the Taj hotel, but we did not celebrate your birthday this year due to the pandemic situation. However, you donated some amount to charities, which is a wonderful act." The calendar module 306 also identifies key milestone birthdays such as 16 years, 18 years, 25 years, and so on. Additionally, the calendar module 306 also identifies and stores information regarding the user's 102 relatives, friends, and acquaintances. The robot 108 may locate the information and send a reminder to the user 102 regarding the events stored in the calendar module 306. In an example, the robot 108 locates and detects the user 102 on or prior to the date of the personal events and initiates a conversation with the user 102 to remind the user 102 about the personal events. The robot 108 may locate the user's personal information regarding any events and send a reminder to the user 102 regarding the events stored in the calendar module 306. In an example, the robot 108 may locate information regarding a friend's age and date of birth and remind the user 102 to send birthday wishes to the friend. In an embodiment, the calendar module 306 stores information regarding dates of festivals, public holidays, bank holidays, and anniversaries of preference of the user 102, and the robot 108 may initiate a conversation regarding the same. For example, the robot 108 says to the user 102 "Hurray, today is a public holiday. So, I can spend more time with you."
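The date-matching step the calendar module performs can be sketched as a simple month/day lookup. The entry names and the dictionary layout here are hypothetical; a real calendar module would also handle recurring festivals and milestone birthdays.

```python
from datetime import date

# Illustrative calendar entries; the structure is an assumption.
personal_events = {
    "user102": {"birthday": (3, 21)},       # (month, day)
    "friend_ravi": {"birthday": (11, 14)},
}

def events_today(today, events=personal_events):
    """Return (person, event) pairs whose month/day match today,
    so the robot can open a conversation or send a reminder."""
    return [
        (person, name)
        for person, entries in events.items()
        for name, (month, day) in entries.items()
        if (month, day) == (today.month, today.day)
    ]

print(events_today(date(2021, 3, 21)))
```

Checking a few days ahead instead of only today would yield the "upcoming days" reminders mentioned above.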
[0075] FIG. 4 is a block diagram that illustrates the robot 108 of FIG. 1 for initiating a personalized conversation with the user 102 based on an outdoor environment location, according to some embodiments herein. The robot 108 includes the conversation knowledge base 110 that stores conversation topics related to the outdoor environment, an outdoor environment sensing module 402, an audio-visual module 404, and a location detection module 406. The outdoor environment sensing module 402 is activated only on the identification of an outdoor location surrounding the user 102. In an embodiment, the outdoor environment sensing module 402 detects an outdoor environment location if the user 102 and the robot 108 are inside an automobile, inside a mass transportation vehicle, at a public location such as a museum, restaurant, amusement park, or religious place, or in open spaces such as beaches, forests, hiking trails, and so on. The location detection module 406 may use a GPS to locate the geographical location of the user 102. The robot 108 identifies the exact outdoor location of the user 102 by extracting relevant information from the conversation knowledge base 110 based on the outdoor environment location. The robot 108 uses a machine learning algorithm to identify and classify the outdoor environment location. The robot 108 may initiate, using a conversation module 506, a conversation related to the identified outdoor environment location. The audio-visual module 404 records any conversations, pictures, videos, or multimedia events in the outdoor environment location for the robot 108 to interact with the user 102. The outdoor environment sensing module 402 detects at least one of an entity, a parameter, a property, or a monument associated with the outdoor environment location. In an example, the robot 108 identifies the user 102 location near the Statue of Liberty and initiates a conversation on the Statue of Liberty using the information gathered from the conversation knowledge base 110. For example, the robot 108 says to the user 102 "Do you know the Statue of Liberty was a gift of the French people to the Americans in commemoration of American independence on July 4, 1776?"
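One simple way to realize the "user near a monument" detection above is a nearest-landmark lookup over GPS coordinates. This sketch is an assumption on my part (the specification only says a machine learning algorithm is used); the landmark table, radius, and function names are hypothetical.

```python
import math

# Hypothetical landmark coordinates for illustration only.
LANDMARKS = {
    "Statue of Liberty": (40.6892, -74.0445),
    "Gateway of India": (18.9220, 72.8347),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def nearby_landmark(position, radius_km=1.0):
    """Return the closest landmark within radius_km of position, else None."""
    best = min(LANDMARKS, key=lambda name: haversine_km(position, LANDMARKS[name]))
    return best if haversine_km(position, LANDMARKS[best]) <= radius_km else None
```

The returned landmark name would then index into the conversation knowledge base 110 for facts to discuss.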
[0076] FIG. 5 is a block diagram that illustrates the robot 108 of FIG. 1 for initiating a personalized conversation with the user 102 based on user apparel, according to some embodiments herein. The robot 108 includes the conversation knowledge base 110 that stores conversation topics related to apparel, an identification module 502, an apparel detection module 504, a recommendation module 506, a sensor module 508, and a social media module 510. The robot 108 performs user detection and identification using the identification module 502. The user 102 is associated with a specific range of apparel features. The apparel detection module 504 (i) tracks an apparel of the user 102 in a home environment of the user 102 daily, (ii) identifies features and properties of the daily apparel of the user 102 based on the tracked apparel, and (iii) detects at least one of a favorite apparel, a favorite color, a favorite apparel type, properties of the favorite apparel, a new apparel, an apparel preference of the user 102, or a pattern of usage of the apparel based on the tracked apparel and the properties of the daily apparel. The robot 108 determines a conversation topic related to at least one of the favorite apparel, the favorite color, the favorite apparel type, the properties of the favorite apparel, the new apparel, the apparel preference of the user 102, or the pattern of usage of the apparel, and initiates the personalized conversation with the user 102 based on the determined conversation topic. For example, the robot 108 says to the user 102 "You are looking beautiful in this outfit." For example, the robot 108 says to the user 102 "You are wearing your favorite dress." The recommendation module 506 suggests to the user 102 an apparel type based on the information previously stored. For example, the robot 108 says "I have seen a few new arrivals in the online shopping market. I feel that the new arrivals will suit you." However, the robot 108 may not limit the suggestions based on the information history. The sensor module 508 identifies a new apparel worn by the user 102, and the robot 108 may initiate a conversation with the user 102 regarding the new apparel. The sensor module 508 may also identify an apparel being worn by the user 102 after a long time, and the robot 108 initiates a conversation with the user 102 regarding the apparel that is worn after a long time. For example, the robot 108 says to the user 102 "You are wearing this after a long time. You wore this dress last year for your sister's birthday."
[0077] The social media module 510 detects and stores shopping habits of the user 102, and the robot 108 may initiate a conversation with the user 102 regarding shopping of apparel. The social media module 510 monitors the shopping habits of the user 102 by accessing the purchase and online e-commerce browsing history of the user 102. For example, the social media module 510 extracts a type of apparel purchased, an amount spent for purchasing, or a number of apparels purchased. The social media module 510 may provide the robot 108 with personalized information regarding the purchase and e-commerce browsing history of the user 102. The robot 108 may also provide personalized recommendations and suggestions for new apparel types and colors of apparel based on the apparel preference of the user 102. The robot 108 may be equipped to showcase tracks of popular personalities, people, and social media handles or links followed by the user 102.
[0078] The robot 108 may associate the features and properties of the favorite apparel and the daily apparel of the user 102 and the shopping habits with one or more apparel recommendations for e-commerce purposes as personalized suggestions to the user 102.
[0079] The recommendation module 506 provides one or more personalized recommendations of apparel based on at least one of the features and properties of the daily apparel, the features and properties of the favorite apparel, the apparel preference, the pattern of usage of the apparel, or the shopping habits. The robot 108 may initiate a conversation with the user 102 based on the one or more apparel recommendations. The recommendation module 506 suggests to the user 102 an apparel type based on the information previously stored. However, the robot 108 may not limit the suggestions based on the information history. The recommendation module 506 may also provide personalized recommendations and suggestions for new apparel types and colors of apparel based on the apparel preference of the user 102.
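The usage-pattern signals feeding these recommendations (favorite apparel, favorite color, new items) can be sketched as frequency counting over the daily tracking log. The log format and the "worn once means new" heuristic are illustrative assumptions.

```python
from collections import Counter

def apparel_insights(daily_log):
    """Derive talking points from a daily apparel log: a list of dicts
    with 'item' and 'color' keys (an assumed, illustrative format)."""
    item_counts = Counter(entry["item"] for entry in daily_log)
    color_counts = Counter(entry["color"] for entry in daily_log)
    return {
        "favorite_item": item_counts.most_common(1)[0][0],
        "favorite_color": color_counts.most_common(1)[0][0],
        # Items seen only once are treated as new (illustrative heuristic)
        "new_items": [item for item, n in item_counts.items() if n == 1],
    }

log = [
    {"item": "blue kurta", "color": "blue"},
    {"item": "blue kurta", "color": "blue"},
    {"item": "grey hoodie", "color": "grey"},
]
print(apparel_insights(log))
```

Each derived field maps directly onto a conversation topic, such as "you are wearing your favorite dress."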
[0080] The social media module 510 (i) tracks at least one of popular personalities, people, and handles or links followed by the user 102, (ii) identifies if anyone else in a social media network of the user 102 or in the handles followed by the user 102 has worn apparel that is similar to the apparel of the user 102, or has worn apparel of the same type, color, or combination as the apparel of the user 102, and (iii) initiates a conversation with the user 102 to provide information that someone else in the social media network of the user 102 or in the handles followed by the user 102 has worn the apparel that is similar to the apparel of the user 102, or has worn the apparel of the same type, color, or combination as the apparel of the user 102. The robot 108 may include information to showcase to the user 102 some tracks of popular personalities, people, and social media handles or links followed by the user 102. For example, the robot 108 says to the user 102 "Your favorite actor also wears a dress that is similar to your dress. That is so cool!"
[0081] FIG. 6 is a block diagram that illustrates the robot 108 of FIG. 1 for initiating a personalized conversation with the user 102 based on audio events or visual events, according to some embodiments herein. The robot 108 includes the conversation knowledge base 110 that stores conversation topics related to audio-visual events, an auditory module 602, a visual module 604, a music detection module 606, and a recommendation module 608. The auditory module 602 and the visual module 604 record and classify audio events and visual events using auditory sensors and an audio-visual sensor. The robot 108 may be equipped with modules to classify if the user 102 is singing and to identify the song being sung by the user 102. The robot 108 may then play the song on a speaker system for the user 102 to sing or dance along. The music detection module 606 may identify and classify the songs based on artists, albums, genres, and so on.
[0082] The recommendation module 608 may determine the user's 102 favorite music genre, artist, or album and suggest music recommendations based on the same. The robot 108 interacts with the user 102 based on the user 102 information received from the auditory module 602, the visual module 604, the music detection module 606, and the recommendation module 608. In the event that no music is played, the robot 108 may initiate a conversation with the user 102 to play music of the user's preference. In an example, the robot 108 may suggest music based on the user's 102 favorite song or artist, songs belonging to the same or a similar album, songs from the same genre, or new songs of a preferred artist or genre. The auditory module 602 detects the user 102 based on any speaking voice of the user 102 by using audio localization and locomotion capabilities. The robot 108 may turn in the direction of the user 102 and initiate a conversation with the user 102. Further, in the auditory module 602, the audio is detected and recorded in a security mode. The security mode performs audio activity detection. On detection of audio activity, a security alert regarding the detection of audio activity of others will be sent to the user 102. In an embodiment, the security alert may be sent via a wireless or wired internet connection or any other telecommunication medium.
[0083] The auditory module 602 may also detect the user 102 playing a musical instrument, upon which the robot 108 may play supportive music using generative deep neural network technologies. The visual module 604 detects and records the user 102 behavior and activity. In an example, the visual module 604 may identify the user 102 performing a workout at home, and the robot 108 initiates a conversation based on fitness; the robot 108 may also suggest and play workout music during the workout routine of the user 102. In another example, the visual module 604 identifies that the user 102 is dancing, and the robot 108 initiates a conversation to dance and plays music. The visual module 604 may identify if the user 102 is entering or leaving the house and initiate a conversation corresponding to the user entry or exit event. The visual module 604 may perform video anomaly detection; if any video anomaly is detected, then the robot 108 initiates a conversation with the user 102 via the internet or another telecommunication medium to raise an alert. The robot 108 may be equipped with a video or image categorization module to identify devices and equipment in the user's home, such as a television (TV), music player, fan, or air conditioner (AC), and initiate a conversation with the user 102 based on the information received from the video or image categorization module.
[0084] FIG. 7 is a block diagram that illustrates the robot 108 of FIG. 1 for initiating a personalized conversation with the user 102 based on news, according to some embodiments herein. The robot 108 includes the conversation knowledge base 110 that includes one or more conversation topics related to news events, a news analysis module 702, an entity identification module 704, a location detection module 706, and a pattern detection module 708. The news analysis module 702 is equipped to fetch and record news from various news sources of interest to the user 102. The entity identification module 704 identifies the key entities from news headlines, and the robot 108 initiates a conversation with the user 102 on the key entities in the news headline. The entity identification module 704 further identifies the entities and classes of entities from the news headlines and news content. In an example, if Mumbai is an entity, city will be the entity class. The robot 108 monitors the entities and entity classes of the user's 102 preference and initiates a conversation with the user 102. The location detection module 706 detects the location of the user 102 for the robot 108 to initiate a conversation based on the same. In an example, the robot 108 may initiate a conversation with the user 102 if the news contains news corresponding to the location of the residence, workspace, sports events, and so on of the user 102. The pattern detection module 708 detects patterns of daily news and events consumed by the user 102. The robot 108 uses the information from the pattern detection module 708 to recommend the type of news to be showcased to the user 102.
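The entity-and-class step above (Mumbai as entity, city as its class) can be sketched with a toy gazetteer lookup. In practice the entity identification module would use a named-entity recognizer; this dictionary and the function name are illustrative assumptions.

```python
# Toy entity gazetteer; the mappings are illustrative assumptions.
ENTITY_CLASSES = {
    "Mumbai": "city",
    "Delhi": "city",
    "India": "country",
    "Asia": "continent",
}

def extract_entities(headline, gazetteer=ENTITY_CLASSES):
    """Return (entity, entity_class) pairs found in a news headline."""
    words = headline.replace(",", " ").split()
    return [(word, gazetteer[word]) for word in words if word in gazetteer]

print(extract_entities("Heavy rain lashes Mumbai as monsoon covers India"))
```

Filtering the extracted pairs against the user's preferred entity classes would give the personalized news topics described here.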
[0085] FIG. 8 is a block diagram that illustrates the robot 108 of FIG. 1 for initiating a personalized conversation with the user 102 based on entity ontology, according to some embodiments herein. The robot 108 includes the conversation knowledge base 110 and an entity ontology detecting module 802. The entity ontology detecting module 802 is divided into a sibling entity detecting module 804, a parent entity detecting module 806, a child entity detecting module 808, a super parent entity detecting module 810, and a super child entity detecting module 812. The entity ontology detecting module 802 is a hierarchical representation of entities, where the entities comprise the sibling entity, the sibling entity class, the child entity, the child entity class, the parent entity, the parent entity class, the super parent entity, the super parent entity class, the super child entity, and the super child entity class. The robot 108, using the entity ontology detecting module 802, (i) identifies at least one of an entity or an entity class in at least one of the detected environmental parameters, the environment event, the period of the environment event, the environment alert, the time of the event related to the user 102, the personal events related to the user 102, the personal events related to the user's relatives or friends or acquaintances, the outdoor environment location, the apparel of the user 102, the audio events surrounding the user environment, the visual events surrounding the user environment, or the news, (ii) determines a conversation topic related to the identified entity or entity class, or another entity, entity instance, or entity class than the identified entity or entity class, from the conversation knowledge base 110 utilizing a conversational personalization engine, and (iii) initiates the personalized conversation with the user 102 based on the determined conversation topic.
[0086] The entity ontology detecting module 802 identifies at least one entity or entity class in a human robot interaction (HRI) during the personalized conversation. The sibling entity detecting module 804 detects a sibling entity or a sibling entity class based on the identified entity or entity class in the HRI. The parent entity detecting module 806 detects a parent entity or a parent entity class based on the identified entity or entity class in the HRI. The child entity detecting module 808 detects a child entity or a child entity class based on the identified entity or entity class in the HRI. The super parent entity detecting module 810 detects a super parent entity or a super parent entity class belonging to the parent entity or the parent entity class. The super child entity detecting module 812 detects a super child entity or a super child entity class belonging to the child entity or the child entity class. The entity ontology detecting module 802 identifies properties of the entity or properties of the entity class. The robot 108 determines at least one conversation topic in the conversation knowledge base 110, utilizing the conversational personalization engine, related to at least one of the sibling entity class, sibling entity, child entity, child entity class, parent entity, parent entity class, super parent entity, super parent entity class, super child entity, super child entity class, properties of the entity, or properties of the entity class, based on one or more parameters of user interest, and initiates, using a conversation module, the personalized conversation with the user 102 based on the determined conversation topic.
[0087] In one exemplary embodiment, let us consider that the entity is a place. Then, in the entity ontology in the conversation knowledge base 110, place is the super parent entity, continent is the parent entity, country is the super child entity, state is the child entity, and city may be a sibling entity. For instance, if the robot 108 initiates conversations, the robot 108 may initiate a conversation based on a current entity class with which it had made the conversation with the user 102.
[0088] For example, if the current sibling entity was Mumbai, the later conversations may be based on other sibling entities like Delhi, Chennai, etc., based on other parameters of interest with respect to the user 102.
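The hierarchy and the sibling lookup described in these paragraphs can be sketched as a small ontology table. The place data and helper names below are illustrative assumptions loosely following the Mumbai/Delhi/Chennai example in the text.

```python
# Minimal place ontology; each entity records its class and parent.
# The data and helper names are illustrative assumptions.
ONTOLOGY = {
    "Mumbai": {"class": "city", "parent": "Maharashtra"},
    "Delhi": {"class": "city", "parent": "Delhi NCT"},
    "Chennai": {"class": "city", "parent": "Tamil Nadu"},
    "Maharashtra": {"class": "state", "parent": "India"},
    "Delhi NCT": {"class": "state", "parent": "India"},
    "Tamil Nadu": {"class": "state", "parent": "India"},
    "India": {"class": "country", "parent": "Asia"},
    "Asia": {"class": "continent", "parent": "place"},
}

def siblings(entity):
    """Entities of the same class as `entity`, excluding itself;
    candidates for steering the next conversation, as in paragraph [0088]."""
    cls = ONTOLOGY[entity]["class"]
    return sorted(e for e, v in ONTOLOGY.items() if v["class"] == cls and e != entity)

def parent(entity):
    """The entity one level up in the hierarchy."""
    return ONTOLOGY[entity]["parent"]

print(siblings("Mumbai"))
print(parent("Mumbai"))
```

After a conversation about Mumbai, the robot could pick any entry of `siblings("Mumbai")` as the next topic, filtered by the user's parameters of interest.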
[0089] Similarly, if the current conversation with the user 102 belongs to the super child entity, wherein the super child entity is country, then the subsequent conversations which are initiated by the robot 108 may belong to the same super child class, also taking into consideration the personal preferences of the user 102.
[0090] In one embodiment, when the robot 108 interacts with the user 102 based on the user's 102 routine of listening to music, the system 100 determines the entity class of the current conversation and then interacts with the user 102 taking into consideration the specific entity class of the previous conversation along with other personal preferences of the user 102.
[0091] In one embodiment, the robot 108 interacts with the user 102 based on the environmental parameters around the user 102 and the entity class of the current conversation, and then interacts with the user 102 taking into consideration the specific entity class of the previous conversation along with other personal preferences of the user 102.
[0092] FIGS. 9A and 9B are flow diagrams that illustrate a method for a robot-initiated personalized conversation with a user 102, based on the surrounding and proclivity of the user, according to some embodiments herein. The robot 108 initiates a personalized conversation with the user 102 based on the surrounding and proclivity of the user 102. At step 902, at least one of environmental parameters (to determine an environment event, a period of the environment event, an environment alert, and an anomaly environment event by processing the detected environmental parameters), a time of an event related to the user 102, personal events related to the user 102, personal events related to the user's relatives or friends or acquaintances, an outdoor environment location, an apparel of the user 102, audio events, visual events, or news is detected using one or more sensor units 111 in the surrounding and proclivity of the user 102. At step 904, at least one conversation topic in a conversation knowledge base 110 is determined by the robot 108, utilizing a conversational personalization engine, that is related to the detected environmental parameters, the environment event, the period of the environment event, the environment alert, the time of the event related to the user 102, the personal events related to the user 102, the personal events related to the user's relatives or friends or acquaintances, the outdoor environment location, the apparel of the user 102, the audio events, the visual events, or the news. At step 906, the robot 108 converses or interacts with the user 102 using an output unit by converting the at least one conversation topic into at least one output action and performing the at least one output action to converse or interact with the user 102. At step 908, an entity or entity class in a human robot interaction (HRI) during the personalized conversation is identified by the robot 108 by processing a conversation content during the personalized conversation that is uttered by the robot 108 or the user 102 and identifying the entity or entity class in the conversation content, the entity being a piece of word in the conversation content. At step 910, the robot 108 converses or interacts with the user 102 using the output unit by converting the at least one conversation topic related to the entity or entity class into at least one output action and performing the at least one output action to converse or interact with the user 102.
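The steps 902 through 910 above can be sketched as one detection-topic-utterance cycle. The callable stand-ins below (detector, topic chooser, speaker, entity extractor) are illustrative assumptions for the sensor units 111, the conversational personalization engine, the output unit, and the entity ontology detecting module 802.

```python
def conversation_cycle(detect, choose_topic, speak, extract_entity):
    """One pass through steps 902-910 of FIGS. 9A and 9B (a sketch)."""
    observations = detect()                 # step 902: sensor detection
    topic = choose_topic(observations)      # step 904: topic selection
    utterance = speak(topic)                # step 906: output action
    entity = extract_entity(utterance)      # step 908: entity in the HRI
    if entity is not None:
        follow_up = choose_topic({"entity": entity})  # step 910: follow-up topic
        speak(follow_up)
    return topic

# Toy stand-ins to exercise the cycle
spoken = []
topic = conversation_cycle(
    detect=lambda: {"weather": "heavy rain"},
    choose_topic=lambda obs: f"talk about {next(iter(obs.values()))}",
    speak=lambda t: spoken.append(t) or t,
    extract_entity=lambda u: "rain" if "rain" in u else None,
)
```

The follow-up branch mirrors how the entity found in the robot's own utterance seeds the next conversation topic.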
[0093] The embodiments herein may include a computer program product configured to include a pre-configured set of instructions, which, when performed, can result in actions as stated in conjunction with the methods described above. In an example, the pre-configured set of instructions can be stored on a tangible non-transitory computer readable medium or a program storage device. In an example, the tangible non-transitory computer readable medium can be configured to include the set of instructions, which, when performed by a device, can cause the device to perform acts similar to the ones described here. Embodiments herein may also include tangible and/or non-transitory computer-readable storage media for carrying or having computer executable instructions or data structures stored thereon.
[0094] Generally, program modules utilized herein include routines, programs, components, data structures, objects, and the functions inherent in the design of special-purpose processors, etc., that perform particular tasks or implement particular abstract data types. Computer executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
[0095] The embodiments herein can include both hardware and software elements. The embodiments that are implemented in software include, but are not limited to, firmware, resident software, microcode, etc.
[0096] A data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
[0097] Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.
[0098] A representative hardware environment for practicing the embodiments herein is depicted in FIG. 10 with reference to FIGS. 1 through 9. This schematic drawing illustrates a hardware configuration of the robot 108/computer system/computing device in accordance with the embodiments herein. The system includes at least one processing device CPU 10 that may be interconnected via system bus 15 to various devices such as a random access memory (RAM) 12, read-only memory (ROM) 16, and an input/output (I/O) adapter 18. The I/O adapter 18 can connect to peripheral devices, such as disk units 58 and program storage devices 50 that are readable by the system. The system can read the inventive instructions on the program storage devices 50 and follow these instructions to execute the methodology of the embodiments herein. The system further includes a user interface adapter 22 that connects a keyboard 28, mouse 50, speaker 52, microphone 55, and/or other user interface devices such as a touch screen device (not shown) to the bus 15 to gather user input. Additionally, a communication adapter 20 connects the bus 15 to a data processing network 52, and a display adapter 25 connects the bus 15 to a display device 26, which provides a graphical user interface (GUI) 56 of the output data in accordance with the embodiments herein, or which may be embodied as an output device such as a monitor, printer, or transmitter, for example.
[0099] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.

CLAIMS:
I/We Claim:
1. A processor (130) implemented method for a robot (108) initiated personalized conversation with a user (102), based on surrounding and proclivity of the user (102), wherein the robot (108) initiates the personalized conversation with the user (102) based on the surrounding and proclivity of the user (102) that comprises the processor (130); and a memory that stores a conversation knowledge base (110) that comprises one or more conversation topic ontologies, and a set of instructions capable of being executed by the processor (130), wherein the method comprising:
detecting, using one or more sensor units (111) of the robot (108), at least one of environmental parameters thereafter to determine an environment event, a period of the environment event, and an anomaly environment event by processing the detected environmental parameters, an environment alert, a time of an event related to the user (102), personal events related to the user (102), personal events related to user’s relatives or friends or acquaintances, an outdoor environment location, an apparel of the user (102), audio events, visual events, or news, in the surrounding and proclivity of the user (102),
characterized in that the method comprises
determining, by the robot (108), at least one conversation topic in the conversation knowledge base (110) utilizing a conversational personalization engine that is related to detected environmental parameters, the environment event, the period of the environment event, the environment alert, the time of the event related to the user (102), the personal events related to the user (102), the personal events related to user’s relatives or friends or acquaintances, the outdoor environment location, the apparel of the user (102), the audio events, the visual events, or the news;
conversing or interacting, using an output unit of the robot (108), with the user (102) by converting the at least one conversation topic into at least one output action and performing the at least one output action to converse or interact with the user (102);
identifying, by the robot (108), at least one entity or entity class in a human robot interaction (HRI) during the personalized conversation by processing a conversation content during the personalized conversation that is uttered by the robot (108) or the user (102) and identifying the at least one entity or entity class in the conversation content, the at least one entity being a piece of word in the conversation content; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by converting the at least one conversation topic related to the at least one entity or entity class that is identified into at least one output action and performing the at least one output action to converse or interact with the user (102).


2. The processor (130) implemented method as claimed in claim 1, wherein the method comprises
identifying, by the robot (108), at least one of a sibling entity, a sibling entity class, a child entity, a child entity class, a parent entity, or a parent entity class corresponding to the at least one entity that is identified in the personalized conversation using an entity ontology in the conversation knowledge base (110);
identifying, by the robot (108), at least one of a super parent entity or a super parent entity class belonging to the parent entity or the parent entity class or a super child entity or a super child entity class belonging to the child entity, or the child entity class, wherein the entity ontology is a hierarchy representation of the entities where entities comprise the sibling entity, the sibling entity class, the child entity, the child entity class, the parent entity, the parent entity class, the super parent entity, the super parent entity class, the super child entity and the super child entity class;
determining, by the robot (108), at least one conversation topic, in the conversation knowledge base (110) utilizing the conversational personalization engine, related to the at least one of sibling entity, sibling entity class, child entity, child entity class, parent entity, parent entity class, super parent entity, super parent entity class, super child entity, or super child entity class based on one or more parameters of user interest; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by converting the at least one conversation topic into at least one output action and performing the at least one output action to converse or interact with the user (102).


3. The processor (130) implemented method as claimed in claim 1, wherein the method comprises initiating the personalized conversation, by the robot (108) based on an environment by:
detecting, using the one or more sensor units (111), the environment parameters in the surrounding of the user (102) or a place of interest to the user (102), wherein the robot (108) uses at least one of a weather tracking unit (112), an environment tracking service, a global positioning system (GPS) unit (114), an audio sensor (116) or a visual sensor (126) to detect the environment parameters;
determining the environment event in the surrounding of the user (102) or the place of interest to the user (102) based on the detected environment parameters;
determining a period of the environmental event by matching a pattern of the environmental event with a historical pattern of the environmental event;
determining whether the environment event occurs for a first time or occurs abnormally by performing anomaly event detection on the detected environment parameters;
determining a conversation topic related to at least one of the environment parameters, the environment event, the period of the environmental event or a first occurrence or an abnormal occurrence of the environment event, from the conversation knowledge base (110) utilizing the conversational personalization engine; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation based on the determined conversation topic.


4. The processor (130) implemented method as claimed in claim 1, wherein the method comprises initiating the personalized conversation, by the robot (108) based on the environment alert by:
determining at least one environment alert related to the surrounding of the user (102) or a place of interest to the user (102) from an environment weather service, wherein the environment alert comprises at least one of fog, cyclones, storms, tsunami, flood, earthquakes, landslides, volcanic activity, extreme temperatures, drought or wildfires;
determining a conversation topic related to the at least one environment alert from the conversation knowledge base (110) utilizing the conversational personalization engine; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.


5. The processor (130) implemented method as claimed in claim 1, wherein the method comprises initiating the personalized conversation, by the robot (108) based on an event and time by:
tracking and identifying the user (102) in a day, wherein the robot (108) uses a global positioning system (GPS) unit (114) or an identification unit (128) to track and identify the user (102);
detecting at least one of a time of the day, the personal events related to the user (102) on the day or in upcoming days, the personal events related to user’s relatives or friends or acquaintances on the day in the upcoming days, or key milestones in the personal events, wherein the personal events comprise at least one of birthdays, wedding, anniversaries, family events, holidays, festivals, scheduled meetings or events or preference of the user (102);
determining a conversation topic related to at least one of the time of the day, the personal events related to the user (102) on the day or in the upcoming days, the personal events related to user’s relatives or friends or acquaintances on the day or in the upcoming days, or key milestones in the personal events, from the conversation knowledge base (110) utilizing the conversational personalization engine; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.


6. The processor (130) implemented method as claimed in claim 1, wherein the method comprises initiating the personalized conversation, by the robot (108), based on the outdoor environment location by:
identifying the outdoor environment location of the user (102);
detecting at least one of an entity, a parameter, a property or a monument associated with the outdoor environment location;
determining a conversation topic related to at least one of the entity, the parameter, the property or the monument associated with the outdoor environment location, from the conversation knowledge base (110) utilizing the conversational personalization engine; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.


7. The processor (130) implemented method as claimed in claim 1, wherein the method comprises initiating the personalized conversation, by the robot (108) based on the apparel of the user (102) by:
tracking an apparel of the user (102) in a home environment of the user (102) daily;
identifying features and properties of daily apparels of the user (102) based on the tracked apparels;
detecting at least one of a favorite apparel, a favorite color, a favorite apparel type, properties of the favorite apparel, a new apparel, an apparel preference of the user (102) or a pattern of usage of the apparels based on the tracked apparels and the properties of daily apparels;
determining a conversation topic related to at least one of the favorite apparel, the favorite color, the favorite apparel type, the properties of the favorite apparel, the new apparel, the apparel preference of the user (102), or the pattern of usage of the apparels; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.


8. The processor (130) implemented method as claimed in claim 7, wherein the method comprises
monitoring shopping habits of the user (102) by accessing purchase and online ecommerce browsing history of the user (102); and
providing a plurality of personalized recommendation of apparels based on at least one of features and properties of daily apparels, features and properties of the favorite apparel, the apparel preference or the pattern of usage of the apparels, or shopping habits of the user (102).


9. The processor (130) implemented method as claimed in claim 1, wherein the method comprises
tracking at least one of popular personalities, people and handles or links followed by the user (102);
identifying if anyone else in a social media network of the user (102) or handles followed by the user (102) has worn an apparel that is similar to the apparel of the user (102) or has worn an apparel that comprises the same type, color, or combination similar to the apparel of the user (102); and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating a conversation with the user (102) to provide information that someone else in the social media network of the user (102) or handles followed by the user (102) has worn the apparel that is similar to the apparel of the user (102) or has worn the apparel that comprises the same type, color, or combination similar to the apparel of the user (102).


10. The processor (130) implemented method as claimed in claim 1, wherein the method comprises initiating the personalized conversation, by the robot (108), based on the news by:
fetching news from a news source of interest of the user (102);
identifying at least one of key entities or entity class in news headlines or news content of the news;
determining a conversation topic related to at least one of identified key entities or entity class in the news headlines or the news content; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.


11. The processor (130) implemented method as claimed in claim 1, wherein the method comprises initiating the personalized conversation, by the robot (108), based on the audio events or the visual events by:
classifying at least one audio event or at least one visual event related to the user (102); and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating an interaction with the user (102) based on the classified at least one audio event or visual event.


12. The processor (130) implemented method as claimed in claim 1, wherein the method comprises
identifying at least one of entity or entity class in at least one of the detected environmental parameters, the environment event, the period of the environment event, the environment alert, the time of the event related to the user (102), the personal events related to the user (102), the personal events related to user’s relatives or friends or acquaintances, the outdoor environment location, the apparel of the user (102), the audio events, the visual events, or the news;
determining a conversation topic related to the identified entity or entity class or other entity or entity instance or other entity class than the identified entity or entity class, from the conversation knowledge base (110) utilizing the conversational personalization engine; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.


13. The processor (130) implemented method as claimed in claim 1, wherein the method comprises
identifying at least one of a sibling entity, a sibling entity class, a child entity, a child entity class, a parent entity, a parent entity class, a super parent entity, a super parent entity class, a super child entity, a super child entity class, properties of the entity, or properties of the entity class, based on the identified entity or entity class using an entity ontology, wherein the conversation knowledge base (110) comprises the entity ontology which is a hierarchy representation of entities;
determining a conversation topic, in the conversation knowledge base (110) utilizing the conversational personalization engine, related to the at least one of sibling entity, sibling entity class, child entity, child entity class, parent entity, parent entity class, super parent entity, super parent entity class, super child entity, super child entity class, properties of the entity, or properties of the entity class based on one or more parameters of user interest; and
conversing or interacting, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.


14. A system (100) for a robot (108) initiated personalized conversation with a user (102), based on surrounding and proclivity of the user (102), the system (100) comprising:
a robot (108) that initiates a personalized conversation with a user (102) based on the surrounding and proclivity of the user (102), wherein the robot (108) comprises
a processor (130); and
a memory that stores a conversation knowledge base (110) that comprises one or more conversation topic ontologies, and a set of instructions capable of being executed by the processor (130) to
detect, using one or more sensor units (111), at least one of environmental parameters thereafter to determine an environment event, a period of the environment event, and an anomaly environment event by processing the detected environmental parameters, an environment alert, a time of an event related to the user (102), personal events related to the user (102), personal events related to user’s relatives or friends or acquaintances, an outdoor environment location, an apparel of the user (102), audio events, visual events, or news, in the surrounding and proclivity of the user (102),
characterized in that the processor (130) is configured to,
determine at least one conversation topic in the conversation knowledge base (110) utilizing a conversational personalization engine that is related to detected environmental parameters, the environment event, the period of the environment event, the environment alert, the time of the event related to the user (102), the personal events related to the user (102), the personal events related to user’s relatives or friends or acquaintances, the outdoor environment location, the apparel of the user (102), the audio events, the visual events, or the news;
converse or interact, using an output unit of the robot (108), with the user (102) by converting the at least one conversation topic into at least one output action and performing the at least one output action to converse or interact with the user (102);
identify at least one entity or entity class in a human robot interaction (HRI) during the personalized conversation by processing a conversation content during the personalized conversation that is uttered by the robot (108) or the user (102) and identifying the at least one entity or entity class in the conversation content, the at least one entity being a piece of word in the conversation content; and
converse or interact, using the output unit of the robot (108), with the user (102) by converting the at least one conversation topic related to the at least one entity or entity class that is identified into at least one output action and performing the at least one output action to converse or interact with the user (102).


15. The system (100) as claimed in claim 14, wherein the processor (130) is configured to
identify at least one of a sibling entity, a sibling entity class, a child entity, a child entity class, a parent entity, or a parent entity class corresponding to the at least one entity that is identified in the personalized conversation using an entity ontology in the conversation knowledge base (110);
identify at least one of a super parent entity or a super parent entity class belonging to the parent entity or the parent entity class or a super child entity or a super child entity class belonging to the child entity, or the child entity class, wherein the entity ontology is a hierarchy representation of the entities where entities comprise the sibling entity, the sibling entity class, the child entity, the child entity class, the parent entity, the parent entity class, the super parent entity, the super parent entity class, the super child entity and the super child entity class;
determine at least one conversation topic, in the conversation knowledge base (110) utilizing the conversational personalization engine, related to the at least one of sibling entity, sibling entity class, child entity, child entity class, parent entity, parent entity class, super parent entity, super parent entity class, super child entity, or super child entity class based on one or more parameters of user interest; and
converse or interact, using the output unit of the robot (108), with the user (102) by converting the at least one conversation topic into at least one output action and performing the at least one output action to converse or interact with the user (102).


16. The system (100) as claimed in claim 14, wherein the processor (130) is configured to
identify at least one of entity or entity class in at least one of the detected environmental parameters, the environment event, the period of the environment event, the environment alert, the time of the event related to the user (102), the personal events related to the user (102), the personal events related to user’s relatives or friends or acquaintances, the outdoor environment location, the apparel of the user (102), the audio events, the visual events, or the news;
determine a conversation topic related to the identified entity or entity class or other entity or entity instance or entity class than the identified entity or entity class, from the conversation knowledge base (110) utilizing the conversational personalization engine; and
converse or interact, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.


17. The system (100) as claimed in claim 14, wherein the processor (130) is configured to
identify at least one of a sibling entity, a sibling entity class, a child entity, a child entity class, a parent entity, a parent entity class, a super parent entity, a super parent entity class, a super child entity, a super child entity class, properties of the entity, or properties of the entity class, corresponding to the identified entity or entity class in an entity ontology, wherein the conversation knowledge base (110) comprises the entity ontology tree which is a hierarchy representation of entities;
determine a conversation topic, in the conversation knowledge base (110) utilizing the conversational personalization engine, related to the at least one of sibling entity, sibling entity class, child entity, child entity class, parent entity, parent entity class, super parent entity, super parent entity class, super child entity, super child entity class, properties of the entity, or properties of the entity class based on one or more parameters of user interest; and
converse or interact, using the output unit of the robot (108), with the user (102) by initiating the personalized conversation with the user (102) based on the determined conversation topic.

Dated this March 15, 2022
Signature of the Agent:
Arjun Karthik Bala
(IN/PA 1021)
Agent for Applicant

Documents

Application Documents

# Name Date
1 202121012886-STATEMENT OF UNDERTAKING (FORM 3) [24-03-2021(online)].pdf 2021-03-24
2 202121012886-PROVISIONAL SPECIFICATION [24-03-2021(online)].pdf 2021-03-24
3 202121012886-PROOF OF RIGHT [24-03-2021(online)].pdf 2021-03-24
4 202121012886-POWER OF AUTHORITY [24-03-2021(online)].pdf 2021-03-24
5 202121012886-FORM FOR STARTUP [24-03-2021(online)].pdf 2021-03-24
6 202121012886-FORM FOR SMALL ENTITY(FORM-28) [24-03-2021(online)].pdf 2021-03-24
7 202121012886-FORM 1 [24-03-2021(online)].pdf 2021-03-24
8 202121012886-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [24-03-2021(online)].pdf 2021-03-24
9 202121012886-EVIDENCE FOR REGISTRATION UNDER SSI [24-03-2021(online)].pdf 2021-03-24
10 202121012886-DRAWINGS [24-03-2021(online)].pdf 2021-03-24
11 202121012886-DRAWING [17-03-2022(online)].pdf 2022-03-17
12 202121012886-CORRESPONDENCE-OTHERS [17-03-2022(online)].pdf 2022-03-17
13 202121012886-COMPLETE SPECIFICATION [17-03-2022(online)].pdf 2022-03-17
14 202121012886-FORM28 [06-04-2022(online)].pdf 2022-04-06
15 202121012886-Covering Letter [06-04-2022(online)].pdf 2022-04-06
16 Abstract1.jpg 2022-05-23
17 202121012886-FORM 3 [31-07-2022(online)].pdf 2022-07-31
18 202121012886-FORM-9 [26-09-2022(online)].pdf 2022-09-26
19 202121012886-STARTUP [29-09-2022(online)].pdf 2022-09-29
20 202121012886-FORM28 [29-09-2022(online)].pdf 2022-09-29
21 202121012886-FORM 18A [29-09-2022(online)].pdf 2022-09-29
22 202121012886-FER.pdf 2022-11-25
23 202121012886-OTHERS [25-05-2023(online)].pdf 2023-05-25
24 202121012886-FER_SER_REPLY [25-05-2023(online)].pdf 2023-05-25
25 202121012886-CORRESPONDENCE [25-05-2023(online)].pdf 2023-05-25
26 202121012886-COMPLETE SPECIFICATION [25-05-2023(online)].pdf 2023-05-25
27 202121012886-CLAIMS [25-05-2023(online)].pdf 2023-05-25
28 202121012886-FORM 3 [28-06-2023(online)].pdf 2023-06-28
29 202121012886-FORM 3 [28-09-2023(online)].pdf 2023-09-28
30 202121012886-US(14)-HearingNotice-(HearingDate-31-10-2023).pdf 2023-10-23
31 202121012886-Correspondence to notify the Controller [25-10-2023(online)].pdf 2023-10-25
32 202121012886-FORM 3 [02-11-2023(online)].pdf 2023-11-02
33 202121012886-Written submissions and relevant documents [10-11-2023(online)].pdf 2023-11-10
34 202121012886-PatentCertificate14-11-2023.pdf 2023-11-14
35 202121012886-IntimationOfGrant14-11-2023.pdf 2023-11-14
36 202121012886-FORM 3 [12-12-2023(online)].pdf 2023-12-12

Search Strategy

1 SearchstrategyE_25-11-2022.pdf

ERegister / Renewals

3rd: 22 Dec 2023 (From 24/03/2023 To 24/03/2024)

4th: 22 Dec 2023 (From 24/03/2024 To 24/03/2025)

5th: 22 Dec 2023 (From 24/03/2025 To 24/03/2026)

6th: 22 Dec 2023 (From 24/03/2026 To 24/03/2027)

7th: 22 Dec 2023 (From 24/03/2027 To 24/03/2028)

8th: 22 Dec 2023 (From 24/03/2028 To 24/03/2029)

9th: 22 Dec 2023 (From 24/03/2029 To 24/03/2030)

10th: 22 Dec 2023 (From 24/03/2030 To 24/03/2031)