
Method And System Of Generating Dialogue Responses For Users In Real-Time

Abstract: The present disclosure discloses a method and a response generation system for generating dialogue responses for users in real-time. The response generation system receives a pair of utterances from a dialogue system of a plurality of dialogue systems, where the pair of utterances comprises a conversation between a user and the dialogue system, and identifies one or more labels from the utterance of the pair of utterances that corresponds to the user, where the one or more labels exhibit a relationship. One or more intents corresponding to each of the identified one or more labels are determined, where the one or more intents are restricted according to a current context of the utterance of the user. A polarity of each intent is determined on a pre-defined scale comprising a left bound and a right bound, and one or more responses for the user are generated until the determined polarity for each of the identified one or more labels is within a pre-defined distance of the left bound or the right bound of the pre-defined scale. Fig. 1


Patent Information

Application #
201841011536
Filing Date
28 March 2018
Publication Number
40/2019
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
bangalore@knspartners.com
Parent Application
Patent Number
Legal Status
Grant Date
2023-10-13
Renewal Date

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. MEENAKSHI SUNDARAM MURUGESHAN
No 1264, Ground Floor, 15th Main, BTM 2nd Stage, Bangalore-560076, Karnataka, India.
2. BALAJI JAGAN
G01, Kristal Amber II, 8th Main, 14th Cross, N S Palya, BTM Stage 2, Bangalore 560076, Karnataka, India.

Specification

Claims:

We claim:
1. A method of generating dialogue responses for users in real-time, the method comprising:
receiving, by a response generation system (101), a pair of utterances from a dialogue system of a plurality of dialogue systems (103), wherein the pair of utterances comprises a conversation between a user and the dialogue system;
identifying, by the response generation system (101), one or more labels from the utterance of the pair of utterances that corresponds to the user, wherein the one or more labels exhibit a relationship;
determining, by the response generation system (101), one or more intents corresponding to each of the identified one or more labels, wherein the one or more intents are restricted according to a current context of the utterance of the user;
determining, by the response generation system (101), a polarity of each of the determined one or more intents on a pre-defined scale comprising a left bound and a right bound; and
generating, by the response generation system (101), one or more responses for the user until the determined polarity for each of the labels, from the identified one or more labels, bound by the relationship, is within a pre-defined distance of the left bound or the right bound of the pre-defined scale.
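The loop claimed above can be sketched in code as follows. This is an illustrative sketch only, not the filed implementation: the scale bounds, the pre-defined distance, the function names, and the clarification-question wording are all hypothetical choices, since the claim does not fix concrete values for them.

```python
# Illustrative sketch of the claimed response-generation loop (claim 1).
# All names and thresholds below are hypothetical; the claim prescribes
# neither a concrete polarity scale nor a specific distance value.

LEFT_BOUND, RIGHT_BOUND = -1.0, 1.0   # pre-defined polarity scale
PRE_DEFINED_DISTANCE = 0.2            # polarity must land this close to a bound

def polarity_resolved(polarity: float) -> bool:
    """A label's intent is resolved when its polarity lies within the
    pre-defined distance of either bound of the scale."""
    return (abs(polarity - LEFT_BOUND) <= PRE_DEFINED_DISTANCE
            or abs(polarity - RIGHT_BOUND) <= PRE_DEFINED_DISTANCE)

def generate_responses(label_polarities: dict) -> list:
    """Emit one clarification question per label whose polarity is still
    ambiguous, i.e. not yet near either bound of the scale."""
    return [f"Could you clarify your intent regarding '{label}'?"
            for label, polarity in label_polarities.items()
            if not polarity_resolved(polarity)]
```

Under this reading, the system keeps asking clarification questions only for the labels whose intent polarity has not yet converged toward the negative (left) or positive (right) bound.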

2. The method as claimed in claim 1, wherein the pair of utterances comprises one or more instructions and clarification questions from the dialogue system and corresponding responses provided by the user.

3. The method as claimed in claim 1, wherein the identification of the one or more labels is based on a decision tree comprising one or more nodes indicative of a label or an expected state of the label.

4. The method as claimed in claim 1, wherein the determination of the one or more intents is based on:
identifying one or more frames from the utterance of the user, wherein each of the one or more frames corresponds to the identified one or more labels;
identifying one or more pre-defined frame elements corresponding to each of the identified one or more frames, wherein each frame element is pre-annotated with a pre-defined set of frames specific to a domain; and
determining expected state information corresponding to each of the identified one or more frame elements, wherein the expected state information is indicative of the one or more intents of the user with respect to the identified one or more labels.

5. The method as claimed in claim 4, wherein the pre-annotation of the one or more frame elements is performed based on:
enriching each frame element of the one or more frame elements based on manual inputs received from a user; and
determining prominent frame elements characterizing the domain based on a spectral clustering and a re-ranking of each frame element.

6. The method as claimed in claim 1, wherein the one or more responses for the user comprise one or more clarification questions for the user.

7. The method as claimed in claim 1, wherein each of the one or more labels corresponds to a label from a plurality of predetermined labels stored in a database.

8. The method as claimed in claim 1, wherein the left bound is indicative of a negative polarity of an expected state of the identified one or more labels, and the right bound is indicative of a positive polarity of the expected state of the identified one or more labels.
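The frame-based intent determination of claim 4 and the polarity sign convention of claim 8 can be combined into a rough sketch. Again a hedged illustration only: the frame name, frame-element names, state strings, and polarity values below are hypothetical and are not taken from the specification.

```python
# Hypothetical frame index: each label maps to a frame, and each frame
# carries pre-annotated frame elements with expected-state information
# (claim 4). The sign convention follows claim 8: negative polarities
# sit toward the left bound, positive polarities toward the right bound.
FRAME_INDEX = {
    "payment": {
        "frame": "Commerce_pay",
        "frame_elements": {"Buyer": ("expects_confirmation", +0.8),
                           "Money": ("amount_disputed", -0.7)},
    },
}

def expected_states(label: str) -> dict:
    """Return (expected state, polarity) pairs for the frame elements of
    the frame corresponding to the given label; empty dict if the label
    has no entry in the index."""
    entry = FRAME_INDEX.get(label)
    return entry["frame_elements"] if entry else {}
```

In this sketch, a positive polarity for the "Buyer" element signals an expected state near the right (positive) bound, while the disputed "Money" element would trigger further clarification questions.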

9. A response generation system (101) for generating dialogue responses for users in real-time, comprising:
a processor (113); and
a memory (111) communicatively coupled to the processor (113), wherein the memory (111) stores processor instructions, which, on execution, cause the processor (113) to:
receive a pair of utterances from a dialogue system of a plurality of dialogue systems (103), wherein the pair of utterances comprises a conversation between a user and the dialogue system;
identify one or more labels from the utterance of the pair of utterances that corresponds to the user, wherein the one or more labels exhibit a relationship;
determine one or more intents corresponding to each of the identified one or more labels, wherein the one or more intents are restricted according to a current context of the utterance of the user;
determine a polarity of each of the determined one or more intents on a pre-defined scale comprising a left bound and a right bound; and
generate one or more responses for the user until the determined polarity for each of the labels, from the identified one or more labels, bound by the relationship, is within a pre-defined distance of the left bound or the right bound of the pre-defined scale.

10. The response generation system (101) as claimed in claim 9, wherein the pair of utterances comprises one or more instructions and clarification questions from the dialogue system and corresponding responses provided by the user.

11. The response generation system (101) as claimed in claim 9, wherein the identification of the one or more labels is based on a decision tree comprising one or more nodes indicative of a label or an expected state of the label.

12. The response generation system (101) as claimed in claim 9, wherein the determination of the one or more intents is based on:
identifying one or more frames from the utterance of the user, wherein each of the one or more frames corresponds to the identified one or more labels;
identifying one or more pre-defined frame elements corresponding to each of the identified one or more frames, wherein each frame element is pre-annotated with a pre-defined set of frames specific to a domain; and
determining expected state information corresponding to each of the identified one or more frame elements, wherein the expected state information is indicative of the one or more intents of the user with respect to the identified one or more labels.

13. The response generation system (101) as claimed in claim 12, wherein the pre-annotation of the one or more frame elements is performed based on:
enriching each frame element of the one or more frame elements based on manual inputs received from a user; and
determining prominent frame elements characterizing the domain based on a spectral clustering and a re-ranking of each frame element.

14. The response generation system (101) as claimed in claim 9, wherein the one or more responses for the user comprise one or more clarification questions for the user.

15. The response generation system (101) as claimed in claim 9, wherein each of the one or more labels corresponds to a label from a plurality of predetermined labels stored in a database.

16. The response generation system (101) as claimed in claim 9, wherein the left bound is indicative of a negative polarity of an expected state of the identified one or more labels, and the right bound is indicative of a positive polarity of the expected state of the identified one or more labels.

Dated this 28th day of March, 2018

R Ramya Rao
Of K&S Partners
Agent for the Applicant
IN/PA-1607
Description:

TECHNICAL FIELD
The present subject matter is related, in general, to the field of virtual assistance and, more particularly but not exclusively, to a method and system for generating dialogue responses for users in real-time.

Documents

Orders

Section Decision Controller Date
15 Grant Subhra Banerjee 2023-10-13

Application Documents

# Name Date
1 201841011536-STATEMENT OF UNDERTAKING (FORM 3) [28-03-2018(online)].pdf 2018-03-28
2 201841011536-REQUEST FOR EXAMINATION (FORM-18) [28-03-2018(online)].pdf 2018-03-28
3 201841011536-POWER OF AUTHORITY [28-03-2018(online)].pdf 2018-03-28
4 201841011536-FORM 18 [28-03-2018(online)].pdf 2018-03-28
5 201841011536-FORM 1 [28-03-2018(online)].pdf 2018-03-28
6 201841011536-DRAWINGS [28-03-2018(online)].pdf 2018-03-28
7 201841011536-DECLARATION OF INVENTORSHIP (FORM 5) [28-03-2018(online)].pdf 2018-03-28
8 201841011536-COMPLETE SPECIFICATION [28-03-2018(online)].pdf 2018-03-28
9 abstract 201841011536.jpg 2018-04-02
10 201841011536-REQUEST FOR CERTIFIED COPY [03-05-2018(online)].pdf 2018-05-03
11 201841011536-Proof of Right (MANDATORY) [30-07-2018(online)].pdf 2018-07-30
12 Correspondence by Agent_Form 1_01-08-2018.pdf 2018-08-01
13 201841011536-RELEVANT DOCUMENTS [19-07-2021(online)].pdf 2021-07-19
14 201841011536-PETITION UNDER RULE 137 [19-07-2021(online)].pdf 2021-07-19
15 201841011536-OTHERS [19-07-2021(online)].pdf 2021-07-19
16 201841011536-Information under section 8(2) [19-07-2021(online)].pdf 2021-07-19
17 201841011536-FORM 3 [19-07-2021(online)].pdf 2021-07-19
18 201841011536-FER_SER_REPLY [19-07-2021(online)].pdf 2021-07-19
19 201841011536-DRAWING [19-07-2021(online)].pdf 2021-07-19
20 201841011536-CORRESPONDENCE [19-07-2021(online)].pdf 2021-07-19
21 201841011536-COMPLETE SPECIFICATION [19-07-2021(online)].pdf 2021-07-19
22 201841011536-CLAIMS [19-07-2021(online)].pdf 2021-07-19
23 201841011536-FER.pdf 2021-10-17
24 201841011536-US(14)-HearingNotice-(HearingDate-28-07-2023).pdf 2023-07-14
25 201841011536-POA [25-07-2023(online)].pdf 2023-07-25
26 201841011536-FORM 13 [25-07-2023(online)].pdf 2023-07-25
27 201841011536-Correspondence to notify the Controller [25-07-2023(online)].pdf 2023-07-25
28 201841011536-AMENDED DOCUMENTS [25-07-2023(online)].pdf 2023-07-25
29 201841011536-Written submissions and relevant documents [11-08-2023(online)].pdf 2023-08-11
30 201841011536-PatentCertificate13-10-2023.pdf 2023-10-13
31 201841011536-IntimationOfGrant13-10-2023.pdf 2023-10-13

Search Strategy

1 SearchStrategy11536E_12-01-2021.pdf

ERegister / Renewals

3rd: 01 Jan 2024 (from 28/03/2020 to 28/03/2021)

4th: 01 Jan 2024 (from 28/03/2021 to 28/03/2022)

5th: 01 Jan 2024 (from 28/03/2022 to 28/03/2023)

6th: 01 Jan 2024 (from 28/03/2023 to 28/03/2024)

7th: 19 Mar 2024 (from 28/03/2024 to 28/03/2025)

8th: 28 Mar 2025 (from 28/03/2025 to 28/03/2026)