Abstract: This disclosure relates generally to human-machine interaction. In one embodiment, an interaction device for providing interaction between a user and an embodied conversational agent (ECA) is disclosed. The interaction device comprises a processor and a memory communicatively coupled to the processor. The memory stores processor instructions, which, on execution, cause the processor to receive conversation data of a user interacting with the ECA, wherein the ECA is presented on an interface of the interaction device. The processor further determines an emotional state of the user based on one or more behavioral parameters associated with the conversation data of the user. The processor identifies a response state for the ECA corresponding to the emotional state of the user, wherein the response state is identified from a plurality of response states based on a pre-defined probability for each response state. The processor further transitions behavior of the ECA based on the response state. FIG. 1
Claims:
We Claim:
1. A method for providing an interaction between a user and an embodied conversational agent (ECA), the method comprising:
receiving, by an interaction device, conversation data of a user interacting with the ECA, wherein the ECA is presented on an interface of the interaction device;
determining, by the interaction device, an emotional state of the user based on one or more behavioral parameters associated with the conversation data of the user;
identifying, by the interaction device, a response state for the ECA corresponding to the emotional state of the user, wherein the response state is identified from a plurality of response states based on a pre-defined probability for each response state; and
transitioning, by the interaction device, behavior of the ECA based on the response state.
2. The method of claim 1, wherein the one or more behavioral parameters comprise facial expression of the user, conversation sentiment, audio sentiment, and historical user behavior data.
3. The method of claim 1, wherein determining the emotional state of the user comprises:
computing a user behavior score based on a weightage assigned to each behavioral parameter of the one or more behavioral parameters; and
determining the emotional state of the user by comparing the user behavior score with a pre-defined user behavior score.
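The scoring step of claim 3 can be sketched as a weighted sum over normalized behavioral-parameter values, compared against pre-defined cut-offs. This is a minimal illustrative sketch only; the parameter names, weights, thresholds, and state labels below are assumptions, not values from the specification.

```python
# Hypothetical sketch of claim 3. Weights, thresholds, and state names
# are illustrative assumptions, not taken from the specification.

def compute_behavior_score(params, weights):
    """Weighted sum of normalized behavioral-parameter values (0..1 each)."""
    return sum(weights[name] * value for name, value in params.items())

def determine_emotional_state(score, thresholds):
    """Compare the score against pre-defined cut-offs, lowest first."""
    for state, cutoff in sorted(thresholds.items(), key=lambda kv: kv[1]):
        if score <= cutoff:
            return state
    return "happy"  # fallback state above all cut-offs (assumed label)

weights = {"facial_expression": 0.4, "conversation_sentiment": 0.3,
           "audio_sentiment": 0.2, "historical_behavior": 0.1}
params = {"facial_expression": 0.2, "conversation_sentiment": 0.2,
          "audio_sentiment": 0.2, "historical_behavior": 0.2}
thresholds = {"angry": 0.3, "neutral": 0.6}  # pre-defined user behavior scores

score = compute_behavior_score(params, weights)
state = determine_emotional_state(score, thresholds)
```

Here a low composite score falls under the lowest cut-off and is mapped to the corresponding emotional state.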
4. The method of claim 1, wherein identifying the response state for the ECA corresponding to the emotional state of the user comprises:
determining the plurality of response states corresponding to the emotional state based on a pre-defined response matrix, wherein each response state of the plurality of response states is assigned the pre-defined probability; and
identifying the response state from the plurality of response states by performing a Monte Carlo sampling on pre-defined probabilities of the plurality of response states.
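The selection step of claim 4 can be sketched as a pre-defined response matrix mapping each emotional state to candidate response states, with one candidate drawn by Monte Carlo sampling over the pre-defined probabilities. The matrix entries and probability values below are illustrative assumptions.

```python
import random

# Hypothetical sketch of claim 4. States and probabilities are
# illustrative assumptions, not taken from the specification.
RESPONSE_MATRIX = {
    "angry":   {"calm": 0.6, "empathetic": 0.3, "neutral": 0.1},
    "neutral": {"neutral": 0.7, "cheerful": 0.3},
    "happy":   {"cheerful": 0.8, "neutral": 0.2},
}

def identify_response_state(emotional_state, rng=random):
    """Draw one response state weighted by its pre-defined probability."""
    candidates = RESPONSE_MATRIX[emotional_state]
    states, probabilities = zip(*candidates.items())
    # Monte Carlo sampling over the pre-defined probabilities.
    return rng.choices(states, weights=probabilities, k=1)[0]

response = identify_response_state("angry")
```

Because selection is probabilistic, the same emotional state can yield different response states across interactions, which keeps the agent's behavior from appearing scripted.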
5. The method of claim 1, wherein transitioning the behavior of the ECA comprises transitioning visual appearance of the ECA.
6. The method of claim 5, wherein transitioning the visual appearance of the ECA comprises modifying at least one of facial expression of the ECA and one or more gestures of the ECA.
7. The method of claim 5, wherein transitioning the visual appearance of the ECA comprises:
determining an appearance transition sequence for the response state by mapping the response state with a pre-defined appearance transition matrix, wherein the appearance transition sequence comprises one or more appearance transitions; and
applying the appearance transition sequence on the ECA to transition the behavior of the ECA.
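The two steps of claim 7 can be sketched as a lookup of an ordered transition sequence in a pre-defined appearance transition matrix, followed by applying each transition in order. The response-state keys and transition names below are illustrative assumptions.

```python
# Hypothetical sketch of claim 7. Response states and transition names
# are illustrative assumptions, not taken from the specification.
APPEARANCE_TRANSITION_MATRIX = {
    "calm":       ["soften_gaze", "relax_posture", "slow_nod"],
    "empathetic": ["tilt_head", "raise_inner_brows", "lean_forward"],
    "cheerful":   ["smile", "raise_brows", "open_gesture"],
}

def apply_appearance_transitions(response_state, renderer):
    """Map the response state to its transition sequence and apply it."""
    sequence = APPEARANCE_TRANSITION_MATRIX.get(response_state, [])
    for transition in sequence:   # apply each appearance transition in order
        renderer(transition)
    return sequence

applied = []                      # stand-in renderer that records each call
sequence = apply_appearance_transitions("calm", applied.append)
```

The `renderer` callback stands in for whatever animation layer drives the ECA's avatar; ordering matters because each transition builds on the previous pose.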
8. The method of claim 1, wherein transitioning the behavior of the ECA comprises modulating voice of the ECA by modifying at least one of pitch, volume of speech, speed of the speech, tone of the speech, and pause between words or statements in the speech.
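The voice modulation of claim 8 can be sketched as a per-response-state prosody profile overriding a baseline speech configuration. The profile keys and numeric values below are illustrative assumptions.

```python
# Hypothetical sketch of claim 8. Response states and prosody values
# are illustrative assumptions, not taken from the specification.
VOICE_PROFILES = {
    "calm":     {"pitch": 0.9, "volume": 0.8, "rate": 0.85, "pause_ms": 350},
    "cheerful": {"pitch": 1.1, "volume": 1.0, "rate": 1.10, "pause_ms": 200},
}

def modulate_voice(response_state, base_profile):
    """Override the ECA's baseline speech parameters for the response state."""
    profile = dict(base_profile)                      # start from the baseline
    profile.update(VOICE_PROFILES.get(response_state, {}))
    return profile

base_profile = {"pitch": 1.0, "volume": 1.0, "rate": 1.0, "pause_ms": 250}
calm_voice = modulate_voice("calm", base_profile)
```

An unknown response state simply falls through to the unmodified baseline, so speech synthesis always receives a complete parameter set.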
9. An interaction device for providing an interaction between a user and an embodied conversational agent (ECA), the interaction device comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, cause the processor to:
receive conversation data of a user interacting with the ECA, wherein the ECA is presented on an interface of the interaction device;
determine an emotional state of the user based on one or more behavioral parameters associated with the conversation data of the user;
identify a response state for the ECA corresponding to the emotional state of the user, wherein the response state is identified from a plurality of response states based on a pre-defined probability for each response state; and
transition behavior of the ECA based on the response state.
10. The interaction device of claim 9, wherein the one or more behavioral parameters comprise facial expression of the user, conversation sentiment, audio sentiment, and historical user behavior data.
11. The interaction device of claim 9, wherein the processor is configured to:
compute a user behavior score based on a weightage assigned to each behavioral parameter of the one or more behavioral parameters; and
determine the emotional state of the user by comparing the user behavior score with a pre-defined user behavior score.
12. The interaction device of claim 9, wherein the processor is configured to:
determine the plurality of response states corresponding to the emotional state based on a pre-defined response matrix, wherein each response state of the plurality of response states is assigned the pre-defined probability; and
identify the response state from the plurality of response states by performing a Monte Carlo sampling on pre-defined probabilities of the plurality of response states.
13. The interaction device of claim 9, wherein the processor is configured to transition the behavior of the ECA by transitioning visual appearance of the ECA.
14. The interaction device of claim 13, wherein the processor is configured to transition the visual appearance of the ECA by modifying at least one of facial expression of the ECA and one or more gestures of the ECA.
15. The interaction device of claim 14, wherein the processor is configured to:
determine an appearance transition sequence for the response state by mapping the response state with a pre-defined appearance transition matrix, wherein the appearance transition sequence comprises one or more appearance transitions; and
apply the appearance transition sequence on the ECA to transition the behavior of the ECA.
16. The interaction device of claim 9, wherein transitioning the behavior of the ECA comprises modulating voice of the ECA by modifying at least one of pitch, volume of speech, speed of the speech, tone of the speech, and pause between words or statements in the speech.
Dated this 21st day of December, 2015
Swetha S.N.
Of K&S Partners
Agent for the Applicant
Description:
TECHNICAL FIELD
This disclosure relates generally to human-machine interaction, and more particularly to a system and method for providing interaction between a user and an Embodied Conversational Agent (ECA).
| # | Name | Date |
|---|---|---|
| 1 | Form 9 [21-12-2015(online)].pdf | 2015-12-21 |
| 2 | Form 5 [21-12-2015(online)].pdf | 2015-12-21 |
| 3 | Form 3 [21-12-2015(online)].pdf | 2015-12-21 |
| 4 | Form 18 [21-12-2015(online)].pdf | 2015-12-21 |
| 5 | Drawing [21-12-2015(online)].pdf | 2015-12-21 |
| 6 | Description(Complete) [21-12-2015(online)].pdf | 2015-12-21 |
| 7 | REQUEST FOR CERTIFIED COPY [22-12-2015(online)].pdf | 2015-12-22 |
| 8 | abstract 6809-CHE-2015.jpg | 2016-01-05 |
| 9 | REQUEST FOR CERTIFIED COPY [02-03-2016(online)].pdf | 2016-03-02 |
| 10 | 6809-CHE-2015-Power of Attorney-270416.pdf | 2016-07-13 |
| 11 | 6809-CHE-2015-Form 1-270416.pdf | 2016-07-13 |
| 12 | 6809-CHE-2015-Correspondence-F1-PA-270416.pdf | 2016-07-13 |
| 13 | 6809-CHE-2015-FER.pdf | 2019-12-26 |
| 14 | 6809-CHE-2015-OTHERS [17-06-2020(online)].pdf | 2020-06-17 |
| 15 | 6809-CHE-2015-FER_SER_REPLY [17-06-2020(online)].pdf | 2020-06-17 |
| 16 | 6809-CHE-2015-DRAWING [17-06-2020(online)].pdf | 2020-06-17 |
| 17 | 6809-CHE-2015-COMPLETE SPECIFICATION [17-06-2020(online)].pdf | 2020-06-17 |
| 18 | 6809-CHE-2015-CLAIMS [17-06-2020(online)].pdf | 2020-06-17 |
| 19 | 6809-CHE-2015-ABSTRACT [17-06-2020(online)].pdf | 2020-06-17 |
| 20 | 6809-CHE-2015-PA [03-02-2022(online)].pdf | 2022-02-03 |
| 21 | 6809-CHE-2015-ASSIGNMENT DOCUMENTS [03-02-2022(online)].pdf | 2022-02-03 |
| 22 | 6809-CHE-2015-8(i)-Substitution-Change Of Applicant - Form 6 [03-02-2022(online)].pdf | 2022-02-03 |
| 23 | 6809-CHE-2015-US(14)-HearingNotice-(HearingDate-22-05-2023).pdf | 2023-04-19 |
| 24 | 6809-CHE-2015-Correspondence to notify the Controller [25-04-2023(online)].pdf | 2023-04-25 |
| 1 | 6809search_23-12-2019.pdf | |