
System And Method For Dynamic Job Allocation Based On Acoustic Sentiments

Abstract: The present disclosure relates to methods and systems for allocating a call from a user to an agent. Embodiments of the disclosure may determine a set of sentiment indicators associated with the user from one or more acoustic parameters of the call. In addition, embodiments of the disclosure may select a candidate agent to handle the call based on the set of sentiment indicators and a sentiment handling capability associated with the candidate agent. Moreover, embodiments of the disclosure may allocate the call to the candidate agent.


Patent Information

Application #:
Filing Date: 06 June 2014
Publication Number: 44/2015
Publication Type: INA
Invention Field: COMMUNICATION
Status:
Email: ipo@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2021-10-21
Renewal Date:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. Nithya Ramkumar
No 62, 3rd Cross, 17th Main, Koramangala II blk Extn, Bangalore – 560034, Karnataka, India.
2. Soham Bhaumik
804, Alps Block, Heritage Estate, Doddaballapur Road, Yelahanka, Bangalore – 560064, Karnataka, India.
3. Amit Krishna
Apartment # 12145, Prestige Shantiniketan Apartment, ITPL Road, Mahadevpura, White Field, Bangalore – 560048, Karnataka, India.
4. Mahesh Chowdary
25A, 3rd Cross, Subramanyam Layout, Rammurthy Nagar, Bangalore – 560016, Karnataka, India.
5. Hemant Kumar
195, Akash Darshan Apartment, Mayur Vihar Phase – 1, Delhi – 110091, India.

Specification

CLAIMS

We claim:

1. A method, implemented by a computer, for allocating a call from a user to an agent, the method comprising:
determining, by the computer, a set of sentiment indicators associated with the user from one or more acoustic parameters of the call;
selecting, by the computer, a candidate agent to handle the call based on the set of sentiment indicators and a sentiment handling capability associated with the candidate agent; and
allocating the call to the candidate agent.

2. The method of claim 1, comprising:
retrieving historical sentiment data associated with the user; and
selecting the candidate agent based on the historical sentiment data and the sentiment handling capability associated with the candidate agent.

3. The method of claim 1, wherein determining the set of sentiment indicators comprises:
measuring an acoustic parameter of a voice of the user; and
determining a score associated with each sentiment indicator based on the measured acoustic parameter.

4. The method of claim 1, wherein the one or more acoustic parameters include at least one of:
a speaking intensity, a speaking rate, or a presence of one or more pitches.
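
The scoring step in claims 3 and 4 can be sketched as follows. This is a minimal illustration only: the thresholds, weights, and the particular sentiment indicators (anger, calm, frustration) are assumptions for the example, not values taken from the specification.

```python
# Hypothetical sketch of claims 3-4: map measured acoustic parameters
# (speaking intensity, speaking rate, pitch) to per-indicator scores.
# All thresholds and weights below are illustrative, not from the patent.

def sentiment_scores(intensity_db: float, rate_wpm: float, pitch_hz: float) -> dict:
    """Return a score in [0, 1] for each sentiment indicator."""
    def clamp(x: float) -> float:
        return max(0.0, min(1.0, x))

    # Louder, faster speech with raised pitch is scored as more agitated;
    # the complementary score stands in for calmness.
    anger = clamp(0.4 * (intensity_db - 60) / 20
                  + 0.3 * (rate_wpm - 140) / 60
                  + 0.3 * (pitch_hz - 180) / 120)
    calm = clamp(1.0 - anger)
    frustration = clamp(0.5 * anger + 0.5 * (rate_wpm - 160) / 80)
    return {"anger": anger, "calm": calm, "frustration": frustration}

# Example measurement: a loud, fast, high-pitched caller.
scores = sentiment_scores(intensity_db=74.0, rate_wpm=180.0, pitch_hz=250.0)
```

A production system would derive the raw measurements from the call audio itself (e.g. frame-level energy and pitch tracking) rather than take them as arguments.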

5. The method of claim 1, wherein selecting the candidate agent comprises:
determining a matching parameter indicating a difference between the set of sentiment indicators and sentiment handling capabilities of one or more available agents; and
selecting the candidate agent based on the matching parameter.

6. The method of claim 5, wherein:
the sentiment handling capability of each available agent includes a set of emotion handling ratings corresponding to the set of sentiment indicators;
the matching parameter includes a distance between a point representing the set of sentiment indicators and a point representing the set of emotion handling ratings associated with each available agent; and
the method comprises:
calculating the distance for each available agent; and
selecting the available agent having the shortest distance to be the candidate agent.
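
The matching step of claims 5 and 6 amounts to a nearest-neighbour selection: treat the caller's sentiment indicators and each agent's emotion-handling ratings as points in the same space, and pick the agent at the shortest distance. A minimal sketch, with agent names and ratings invented for illustration:

```python
import math

# Hypothetical sketch of claims 5-6: the matching parameter is the
# Euclidean distance between the caller's sentiment-indicator point and
# each available agent's emotion-handling-rating point; the agent at the
# shortest distance becomes the candidate.

def select_candidate(indicators: dict, agents: dict) -> str:
    order = sorted(indicators)                  # fix a common axis order
    user_point = [indicators[k] for k in order]

    def distance(ratings: dict) -> float:
        agent_point = [ratings[k] for k in order]
        return math.dist(user_point, agent_point)   # Euclidean distance

    # Shortest distance wins, per claim 6.
    return min(agents, key=lambda name: distance(agents[name]))

indicators = {"anger": 0.8, "calm": 0.1, "frustration": 0.6}
agents = {
    "agent_a": {"anger": 0.2, "calm": 0.9, "frustration": 0.3},
    "agent_b": {"anger": 0.9, "calm": 0.2, "frustration": 0.7},
}
best = select_candidate(indicators, agents)
```

Here `agent_b`, whose ratings sit closest to the caller's agitated profile, would be selected. The claim leaves the distance metric open beyond "a distance"; Euclidean distance is one natural reading.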

7. The method of claim 1, comprising:
analyzing a conversation between the user and the candidate agent; and
updating the sentiment handling capability associated with the candidate agent based on the conversation.

8. The method of claim 7, comprising:
monitoring the set of sentiment indicators associated with the user during the conversation;
determining whether the conversation proceeds into a positive or a negative direction based on the monitored set of sentiment indicators; and
automatically alerting the candidate agent when it is determined that the conversation proceeds into a negative direction.
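
The monitoring loop of claims 7 and 8 can be sketched as a trend check over sentiment scores sampled during the conversation. The specific rule below (compare the mean of recent negative-indicator samples against earlier ones) is an assumption for illustration; the claims do not fix how "direction" is computed.

```python
# Hypothetical sketch of claims 7-8: monitor the caller's sentiment
# indicators during the conversation, decide whether it trends positive
# or negative, and automatically alert the agent on a negative trend.

def conversation_direction(anger_samples: list[float]) -> str:
    """Return 'negative' when the recent half of the anger scores
    exceeds the earlier half on average, else 'positive'."""
    mid = len(anger_samples) // 2
    earlier = sum(anger_samples[:mid]) / mid
    recent = sum(anger_samples[mid:]) / (len(anger_samples) - mid)
    return "negative" if recent > earlier else "positive"

def maybe_alert(anger_samples: list[float]) -> bool:
    """Flag the candidate agent when the conversation turns negative."""
    return conversation_direction(anger_samples) == "negative"

escalating = [0.2, 0.3, 0.5, 0.7]   # anger rising over the call
alert = maybe_alert(escalating)
```

The same monitored scores could feed the capability update of claim 7: an agent who consistently turns negative trends around would have their emotion-handling ratings revised upward.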

9. A computer system for allocating a call from a user to an agent, the system comprising:
a processor operatively coupled to a memory device, wherein the processor is configured to execute instructions stored in the memory device to perform operations comprising:
determining a set of sentiment indicators associated with the user from one or more acoustic parameters of the call;
selecting a candidate agent to handle the call based on the set of sentiment indicators and a sentiment handling capability associated with the candidate agent; and
allocating the call to the candidate agent.

10. The system of claim 9, wherein the operations comprise:
retrieving historical sentiment data associated with the user; and
selecting the candidate agent based on the historical sentiment data and the sentiment handling capability associated with the candidate agent.

11. The system of claim 9, wherein determining the set of sentiment indicators comprises:
measuring an acoustic parameter from a voice of the user; and
determining a score associated with each sentiment indicator based on the measured acoustic parameter.

12. The system of claim 9, wherein the one or more acoustic parameters include at least one of:
a speaking intensity, a speaking rate, or a presence of one or more pitches.

13. The system of claim 9, wherein selecting the candidate agent comprises:
determining a matching parameter indicating a difference between the set of sentiment indicators and sentiment handling capabilities of one or more available agents; and
selecting the candidate agent based on the matching parameter.

14. The system of claim 13, wherein:
the sentiment handling capability of each available agent includes a set of emotion handling ratings corresponding to the set of sentiment indicators;
the matching parameter includes a distance between a point representing the set of sentiment indicators and a point representing the set of emotion handling ratings associated with each available agent; and
the operations comprise:
calculating the distance for each available agent; and
selecting the available agent having the shortest distance to be the candidate agent.

15. The system of claim 9, wherein the operations comprise:
analyzing a conversation between the user and the candidate agent; and
updating the sentiment handling capability associated with the candidate agent based on the conversation.

16. The system of claim 15, wherein the operations comprise:
monitoring the set of sentiment indicators associated with the user during the conversation;
determining whether the conversation proceeds into a positive or a negative direction based on the monitored set of sentiment indicators; and
automatically alerting the candidate agent when it is determined that the conversation proceeds into a negative direction.

17. A non-transitory, computer-readable medium storing instructions that, when executed by a processor device, cause the processor device to perform operations for allocating a call from a user to an agent, the operations comprising:
determining a set of sentiment indicators associated with the user from one or more acoustic parameters of the call;
selecting a candidate agent to handle the call based on the set of sentiment indicators and a sentiment handling capability associated with the candidate agent; and
allocating the call to the candidate agent.

Dated this 6th day of June, 2014

R Ramya Rao
Of K&S Partners
Agent for the Applicant
TECHNICAL FIELD
This disclosure relates generally to the customer service and support business and, more specifically, to a system and method for dynamically allocating a call from a customer to a customer service agent.

Documents

Application Documents

# Name Date
1 2799-CHE-2014 FORM-9 06-06-2014.pdf 2014-06-06
2 2799-CHE-2014 FORM-18 06-06-2014.pdf 2014-06-06
3 2799-CHE-2014-Request For Certified Copy-Online(09-06-2014).pdf 2014-06-09
4 IP27531-spec.pdf 2014-06-10
5 IP27531-fig.pdf 2014-06-10
6 FORM 5.pdf 2014-06-10
7 FORM 3.pdf 2014-06-10
8 2799CHE2014_CertifiedCopyRequest.pdf 2014-06-10
9 2799-CHE-2014 FORM-1 02-09-2014.pdf 2014-09-02
10 2799-CHE-2014 POWER OF ATTORNEY 02-09-2014.pdf 2014-09-02
11 2799-CHE-2014 CORRESPONDENCE OTHERS 02-09-2014.pdf 2014-09-02
12 2799-CHE-2014-FER.pdf 2019-03-12
13 2799-CHE-2014-FER_SER_REPLY [11-09-2019(online)].pdf 2019-09-11
14 2799-CHE-2014-FORM 3 [11-09-2019(online)].pdf 2019-09-11
15 2799-CHE-2014-PatentCertificate21-10-2021.pdf 2021-10-21
16 2799-CHE-2014-PROOF OF ALTERATION [19-01-2022(online)].pdf 2022-01-19
17 2799-CHE-2014-RELEVANT DOCUMENTS [30-09-2023(online)].pdf 2023-09-30

Search Strategy

1 searchstrategy_15-05-2018.pdf

ERegister / Renewals

3rd: 19 Jan 2022 (from 06/06/2016 to 06/06/2017)
4th: 19 Jan 2022 (from 06/06/2017 to 06/06/2018)
5th: 19 Jan 2022 (from 06/06/2018 to 06/06/2019)
6th: 19 Jan 2022 (from 06/06/2019 to 06/06/2020)
7th: 19 Jan 2022 (from 06/06/2020 to 06/06/2021)
8th: 19 Jan 2022 (from 06/06/2021 to 06/06/2022)
9th: 27 May 2022 (from 06/06/2022 to 06/06/2023)
10th: 01 Jun 2023 (from 06/06/2023 to 06/06/2024)
11th: 03 Jun 2024 (from 06/06/2024 to 06/06/2025)
12th: 06 Jun 2025 (from 06/06/2025 to 06/06/2026)