Abstract: An AI based method for routing an incoming call from an end user to an appropriate service agent is disclosed. Said method describes identifying (302) the end user based on the incoming call. Once the end user is identified, the method describes predicting (306) at least one probable reason for said incoming call and determining (308) a tolerance level of the identified end user for said at least one predicted probable reason. Said method further describes determining (310) an emotional state of the end user. Moving ahead, the method describes generating (312) a report regarding the incoming call, based on the at least one probable reason for said incoming call, the tolerance level of the identified end user, and the emotional state of the end user, and routing (314) the incoming call to the appropriate service agent along with the report.
F O R M 2
THE PATENTS ACT, 1970
(39 of 1970)
The Patents Rules, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
TITLE OF THE INVENTION
SYSTEM FOR ROUTING AN INCOMING CALL FROM AN END USER TO AN APPROPRIATE SERVICE AGENT
APPLICANT:
Zensar Technologies Limited, a company incorporated in India under the Companies Act, 1956
Having address:
Zensar Knowledge Park,
Plot # 4, MIDC, Kharadi, off Nagar road, Pune-411014,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority from Indian Provisional Patent Application No. 202021038486 filed on September 7, 2020, the entirety of which is incorporated herein by reference.
TECHNICAL FIELD
[0002] The present subject matter described herein, in general, discloses an artificial intelligence (AI) based technique for routing an incoming call from an end user to an appropriate service agent by determining the reason for the call in advance.
BACKGROUND
[0003] Customer satisfaction plays a pivotal role for an organization in establishing a relationship with its customers. Business Process Outsourcing (BPO) is a measure taken by many organizations towards establishing that relationship and attaining profitability objectives. Whenever a customer (end user) encounters some challenge with the operability of the products/services offered by the organization, the customer (end user) tends to dial a call center to seek, from a customer agent (service agent), a resolution to the encountered challenge.
[0004] It has been noticed that currently, customers (end users) either have to spend a long time on an Interactive Voice Response (IVR) based system to reach the intended department for clarifications, or have to explain their reason to a customer agent (service agent) who may then transfer their call to the relevant department that can address the customers' (end users') concerns appropriately. Further, in certain cases, even after reaching the relevant department, the customer agent (service agent) attending the call may not be aware of the context, and thus an intended response may not be provided to the customer (end user), leading to customer dissatisfaction.
[0005] In such instances, when the context or reason of the call is not known beforehand, there is a likelihood that the customer agent (service agent) may become aggressive, which further leads to customer (end user) dissatisfaction. Further, current technologies do not utilize per customer (end user) level emotional understanding, which could proactively indicate to the customer agent (service agent) the customer's (end user's) behavior. In view of the above, there exists a need for a system and method that overcome the challenges existing in the art. In particular, there exists a need in the art for a technology that automatically routes the customer's (end user's) call to a customer agent (service agent), based on a probable reason for the call and the customer's behavior, along with relevant information, in real-time, which may be helpful in responding to the customer's (end user's) query, thereby improving the customer's (end user's) satisfaction level.
SUMMARY
[0006] The present disclosure overcomes one or more shortcomings of the prior art and provides additional advantages discussed throughout the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
[0007] In one non-limiting embodiment of the present disclosure an Artificial Intelligence (AI) based method for routing an incoming call from an end user to an appropriate service agent is disclosed. The method describes identifying, by a processor, the end user based on the incoming call. The method further describes retrieving, by the processor, at least one of context data and historical data, for the identified end user, from a profile database. It is to be appreciated that the retrieved context data is at least indicative of transaction history of the end user, and the retrieved historical data is at least indicative of demographic details of the end user. Thereafter, the method describes predicting, by an identification modelling unit, at least one probable reason for said call based on the retrieved context data, the retrieved historical data and a first set of parameters using a pre-trained AI model. It is to be appreciated that said model is pre-trained to predict the probable reasons by vectorizing the first set of parameters using an encoding technique and classifying by a Support Vector Machine (SVM) classifier, the probable reason from the vectorized first set of parameters.
[0008] In addition to the above, the method further describes determining, by a tolerance level modelling unit, a tolerance level of the identified end user using the pre-trained AI model, for the at least one predicted probable reason. It is to be appreciated that the said model is pre-trained to determine the tolerance level by classifying a second set of parameters using a decision tree classifier. The method further describes
determining, by an emotional state modelling unit, an emotional state of the end user from a user profile of the identified end user, using the pre-trained AI model. In an aspect, said model is pre-trained to determine the emotional state by vectorizing a third set of parameters using the encoding technique and classifying by a Logistic Regression (LR) classifier, an emotional state from the vectorized third set of parameters. Further, the method describes generating, by the processor, a report based on the at least one of the predicted probable reason for the call, the determined tolerance level of the end user for said predicted probable reason and the determined emotional state of the end user. Lastly, the method describes routing, by the processor, said incoming call from the end user to the appropriate service agent based on said report.
[0009] In yet another non-limiting embodiment of the present disclosure, an Artificial Intelligence (AI) system for routing an incoming call from an end user to an appropriate service agent is disclosed. The system comprises a processor configured to identify the end user based on the incoming call and retrieve at least one of context data and historical data, for the identified end user, from a profile database. It is to be appreciated that the retrieved context data is at least indicative of transaction history of the end user, and the retrieved historical data is at least indicative of demographic details of the end user. The system further comprises an identification modelling unit coupled to said processor and configured to predict at least one probable reason for said call based on the retrieved context data, the retrieved historical data and a first set of parameters using a pre-trained AI model. Further, it is to be appreciated that said model is pre-trained to predict the probable reasons for the call by vectorizing the first set of parameters using an encoding technique and classifying by a Support Vector Machine (SVM) classifier, the probable reason from the vectorized first set of parameters. The system further comprises a tolerance level modelling unit coupled to said processor and configured to determine a tolerance level of the identified end user using the pre-trained AI model, for the at least one predicted probable reason. It is to be appreciated that said model is pre-trained to determine the tolerance level by classifying a second set of parameters using a decision tree classifier. The system further comprises an emotional state modelling unit coupled to said processor and configured to determine an emotional state of the end user from a user profile of the identified end user using the
pre-trained AI model. It may be noted that said model is pre-trained to determine the emotional state by vectorizing a third set of parameters using the encoding technique and classifying by a Logistic Regression (LR) classifier, an emotional state from the vectorized third set of parameters.
[0010] In addition to above, the processor is further configured to generate a report based on the at least one of the predicted probable reason for the call, the determined tolerance level of the end user for said predicted probable reason and the determined emotional state of the end user. Lastly, the processor is further configured to route said incoming call from the end user to the appropriate service agent based on the report.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The accompanying drawings, which are incorporated in and constitute a part
of this disclosure, illustrate exemplary embodiments and, together with the description,
serve to explain the disclosed embodiments. In the figures, the left-most digit(s) of a
reference number identifies the figure in which the reference number first appears. The
same numbers are used throughout the figures to reference like features and
components. Some embodiments of system and/or methods in accordance with
embodiments of the present subject matter are now described, by way of example only,
and with reference to the accompanying figures, in which:
[0012] Fig. 1 illustrates a network implementation of an AI enabled system for
routing an incoming call from an end user to an appropriate service agent, in accordance
with an embodiment of the present subject matter.
[0013] Fig. 2 illustrates a block diagram of an AI enabled system for routing an
incoming call from an end user to an appropriate service agent, in accordance with an
embodiment of the present subject matter.
[0014] Fig. 3 is a flow diagram illustrating an AI based method for routing an
incoming call from an end user to an appropriate service agent, in accordance with an
embodiment of the present subject matter.
[0015] Fig. 4 illustrates a process for routing an incoming call from an end user to
an appropriate service agent, by way of various modules, in accordance with an
embodiment of the present subject matter.
DETAILED DESCRIPTION
[0016] The terms “comprises”, “comprising”, “include(s)”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, system or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or system or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.
[0017] In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0018] The present disclosure discloses an Artificial Intelligence (AI) enabled system for routing an incoming call from a customer (also referred to as an end user) to an appropriate customer agent (also referred to as a service agent) upon analyzing the customer's behavior and a predicted reason for calling a call center. Precisely, the AI enabled system generates a report based on features such as a probable reason for the call by the end user, a tolerance level of the end user for said probable reason, and an emotional state of the end user. The AI enabled system determines each of said features based on different contextual and historical parameters and thereafter routes the incoming call, augmented with said report, to an appropriate service agent. This report is utilized by the service agent to get an initial understanding of the context of the end user and to respond to the user's query accordingly.
[0019] By the above, the AI enabled system proactively predicts the probable reason for the current call as soon as the end user's call is established with the service agent's system. In particular, the AI enabled system, based on the generated report, may determine a department having expertise in handling the end user's call. Further, based on the generated report, the AI enabled system may select, from said department, the service agent having expertise in handling calls from end users with similar profiles, and forwards said incoming call to the selected service agent along with a copy of said report. This is helpful in avoiding the hassle for the end user of spending much time on an IVR system.
[0020] Referring now to Fig. 1, a network implementation 100 of an AI enabled system 102 for routing an incoming call from an end user to an appropriate service agent is disclosed. In one aspect, the AI enabled system 102 may be deployed on any existing IVR system. Although the present disclosure is explained considering that the AI enabled system 102 is implemented on a server, it may be understood that the AI enabled system 102 may be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, or a cloud-based computing environment. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2…104-N, collectively referred to as end users or users 104 or stakeholders hereinafter, or through applications residing on the user devices 104. In one implementation, the AI enabled system 102 may comprise the cloud-based computing environment in which the users may operate individual computing systems configured to execute remotely located applications. Examples of the user devices 104-1, 104-2…104-N may include, but are not limited to, an IoT device, an IoT gateway, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the AI enabled system 102 through a network 106.
[0021] Further, in an implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Hypertext Transfer Protocol Secure (HTTPS), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP),
and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0022] Further, referring now to Fig. 2, the AI enabled system 102 is illustrated in accordance with an embodiment of the present disclosure. In one embodiment, the AI enabled system 102 may include a processor 202, an AI model 204, a memory 206, an input/output (I/O) unit 220, a profile database 218, a historical database 214, and a context database 216. The AI model 204 may include an identification modelling unit 208, a tolerance level modelling unit 210, and an emotional state modelling unit 212. The processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206. It may be noted that the processor 202 is in communication with each of the abovementioned units 204, 206, 208, 210, 214, 216, 218, 220 to perform the functions described in the paragraphs below.
[0023] The I/O unit 220 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O unit 220 may allow the AI enabled system 102 to interact with users through the user devices 104. Further, the I/O unit 220 may enable the AI enabled system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O unit 220 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O unit 220 may include one or more ports for connecting a number of devices to one another or to another server.
[0024] According to an embodiment, the AI enabled system 102 routes an incoming call from an end user to an appropriate service agent whenever a call center receives the call from the end user. Precisely, the processor 202 of the AI enabled system 102 is configured to identify the end user based on the incoming call. To identify the end user, the processor 202 may map contact details of the incoming call with contact details pre-stored in the historical database 214. Thereafter, the processor 202 may compare metadata associated with the incoming call with metadata associated with the mapped contact details from the historical database 214, to extract the end user's identification number. In a non-limiting example, the contact details of the incoming call may be the mobile number/calling number of the end user along with his/her name.
[0025] After the end user is identified, the processor 202 may retrieve at least one of context data and historical data, for the identified end user, from the profile database 218. In a non-limiting example, the context data may include, but is not limited to, a recent transaction of the end user, the transaction location of the end user, other parties involved in the transaction, the current status of the transaction, the current location of the end user, a transaction which may have triggered escalations or calls to the call center, the time of the call, the tolerance level of the specific end user towards a given type of reason, the kinds of incidents possible with the predicted reason for the incoming call, and like parameters. The historical data may include, but is not limited to, the worth of the end user, the type of the end user, gender, age, yearly or monthly subscriptions of the end user, how frequently the end user calls the call center, the time of the call, the current issue, the last issue status such as whether the issue was resolved or in progress, the emotional state of the end user, the time sensitivity of the end user, the actual location of the end user, and like parameters.
[0026] From the above, it may be noted that the retrieved context data is indicative of the transaction history of the end user, and the retrieved historical data is indicative of demographic details of the end user.
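The identification step above amounts to looking up the calling number against pre-stored contact details. A minimal illustrative sketch follows; the numbers, names and field layout are assumptions for illustration, not data from the disclosure:

```python
# Illustrative stand-in for the lookup performed by processor 202:
# map the calling number to pre-stored contact details and extract
# the end user's identification number.  All values are hypothetical.
HISTORICAL_DB = {
    "+917012345678": {"name": "User A", "user_id": "EU-1001"},
    "+919876543210": {"name": "User B", "user_id": "EU-1002"},
}

def identify_end_user(calling_number):
    """Return the end user's identification number, or None if unknown."""
    record = HISTORICAL_DB.get(calling_number)
    return record["user_id"] if record else None
```

In a deployed system this lookup would run against the historical database 214 and be followed by the metadata comparison described above.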
[0027] Thereafter, the identification modelling unit 208 predicts a probable reason for the incoming call based on the retrieved context data, the retrieved historical data and a first set of parameters, using the pre-trained AI model 204. The first set of parameters may include, but is not limited to, a recent service used by the end user, a recent transaction of the end user, the transaction location of the end user, other parties involved in the transaction, the current status of the transaction, the current location of the end user, and data from previous communications such as chat data from chatbots or text data from mail threads, along with audio and video call interactions, social networking posts published by a plurality of end users related to the service, and the organization's sales and complaint data associated with a plurality of the end users.
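The prediction step above may be sketched as follows. This is an illustrative stand-in, not the patented model: two of the first-set parameters are one-hot vectorized, and a hand-written linear decision function plays the role of the pre-trained SVM classifier. The categories, reason labels and weights are made up for illustration:

```python
# Illustrative stand-in for the reason-prediction step: vectorize two
# categorical first-set parameters and score candidate reasons with a
# linear decision function (standing in for the trained SVM classifier).
SERVICES = ["EMI", "loan", "card"]
STATUSES = ["success", "failure"]

def vectorize(service, status):
    """One-hot encode two categorical parameters into a single vector."""
    return [int(service == s) for s in SERVICES] + \
           [int(status == s) for s in STATUSES]

# Placeholder per-reason weight vectors such as a trained linear SVM
# (one-vs-rest) might learn over the 5-element vector above.
WEIGHTS = {
    "EMI query":              [2, 0, 0, 0, 1],
    "loan query":             [0, 2, 0, 0, 1],
    "card transaction issue": [0, 0, 2, 0, 3],
}

def predict_reason(service, status):
    """Return the candidate reason with the highest decision score."""
    x = vectorize(service, status)
    scores = {reason: sum(w * xi for w, xi in zip(ws, x))
              for reason, ws in WEIGHTS.items()}
    return max(scores, key=scores.get)
```

In practice the weights would come from training the SVM on historical call data rather than being written by hand.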
[0028] It must be understood that the AI model 204 is pre-trained to predict probable reasons for a call by vectorizing the first set of parameters using an encoding technique such as one-hot encoding and thereafter classifying a probable reason from the vectorized set of parameters by using a Support Vector Machine (SVM) classifier.
[0029] In one implementation, if several other end users have been calling for the same reason, the identification modelling unit 208 may analyse the profiles of said other end users to predict the at least one probable reason for said call. The identification modelling unit 208 utilizes the retrieved context data and the retrieved historical data to determine one or more other end users who had called in the past for a reason similar to the predicted probable reason. The identification modelling unit 208 may further analyze the user profile of each of said one or more other end users and said end user, and calculate a similarity score between them. The similarity score is calculated using a cosine similarity function. The cosine similarity is defined as:
similarity(A, B) = (A · B) / (||A|| × ||B||)
where A and B are the two end user profile vectors and similarity(A, B) is the similarity score. The identification modelling unit 208 may compare the calculated similarity score with a predefined threshold. On comparison, if it is determined that the similarity score between said end user's profile and the profile of any of said one or more other end users is greater than the predefined threshold, the identification modelling unit 208 predicts the at least one probable reason for the incoming call by analyzing the user profile of each such other end user. On the other hand, if it is determined that the similarity score between said end user's profile and the profile of each of said one or more other end users is less than the predefined threshold, the user profiles of said one or more other end users will not be considered to predict the at least one probable reason for said call.
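The cosine similarity comparison may be sketched as follows, assuming user profiles have already been reduced to numeric feature vectors; the threshold value of 0.8 is an assumption, as the disclosure does not fix it:

```python
import math

def cosine_similarity(a, b):
    """similarity(A, B) = (A . B) / (|A| * |B|)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

THRESHOLD = 0.8  # the predefined threshold; this value is an assumption

def is_similar(profile_a, profile_b):
    """True if two profile vectors exceed the predefined threshold."""
    return cosine_similarity(profile_a, profile_b) > THRESHOLD
```

Cosine similarity is scale-invariant, so two profiles with proportional feature values score as identical, which suits comparing users whose attributes differ in magnitude but not in pattern.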
[0030] Further, in one exemplary embodiment, the tolerance level modelling unit 210 as shown in Fig. 2 determines a tolerance level of the identified end user, using the pre-trained AI model 204, for the at least one predicted probable reason. Said model 204 is pre-trained to determine the tolerance level by classifying a second set of parameters using a decision tree classifier. In one non-limiting example, the second set of parameters includes the worth of the end user, the type of the end user, gender, age, yearly or monthly subscriptions of the end user, how frequently the end user calls the call center, the time of the call, the current issue, the last issue status such as whether the issue was resolved or in progress, and the end user's rating of previous issues.
[0031] In an illustrative embodiment, it is to be acknowledged that, to calculate the tolerance level using the pre-trained AI model 204, the tolerance level modelling unit 210 calculates an entropy for each of one or more parameters selected from the second set of parameters and splits the one or more parameters into subsets based on the calculated entropy. In one non-limiting example, the calculated entropy lies between 0 and 1. Further, the tolerance level modelling unit 210 may classify the subsets of the one or more parameters by using the decision tree classifier. Thereafter, based on the classification, the tolerance level modelling unit 210 determines in real-time a tolerance score to classify the end user as one of highly tolerant [if the tolerance score is between 0.8 and 1], moderately tolerant [if the tolerance score is between 0.4 and 0.7] and low tolerant [if the tolerance score is between 0 and 0.3]. It is to be appreciated that said ranges cover only an exemplary embodiment and the present disclosure is not limited to the same.
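The entropy calculation and the exemplary tolerance bands above may be sketched as follows. The actual trained decision tree is not given in the disclosure, so a hand-written rule stands in for it, and its inputs (calls in the last month, last issue status) are assumptions:

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy of a categorical parameter; for a binary
    parameter the result lies between 0 and 1."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

def tolerance_score(calls_last_month, last_issue_resolved):
    """Hand-written stand-in for the trained decision tree classifier."""
    if last_issue_resolved:
        return 0.9 if calls_last_month <= 1 else 0.6
    return 0.2

def tolerance_band(score):
    """Map a tolerance score to the exemplary bands of the disclosure."""
    if score >= 0.8:
        return "highly tolerant"
    if score >= 0.4:
        return "moderately tolerant"
    return "low tolerant"
```

A real decision tree would pick its split points by maximizing the entropy reduction (information gain) over the second set of parameters rather than using fixed rules.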
[0032] Once the at least one probable reason for the call and the tolerance level of the end user for said probable reason are determined, the emotional state modelling unit 212 determines an emotional state of the end user from a user profile of the identified end user. In a non-limiting example, the user profile may include information associated with the user such as name, age, gender, location, context of interest, knowledge or expertise, product or service purchased, known attitudes in relation to his/her interaction with the product or service, and breadth of usage. The emotional state modelling unit 212 determines the emotional state by using the pre-trained AI model 204. It may be understood that, to determine the emotional state, said AI model 204 is pre-trained by vectorizing a third set of parameters using the encoding technique, and classifying an emotional state from the vectorized third set of parameters, using a Logistic Regression (LR) classifier. In a non-limiting example, the third set of parameters may include pitch intensity, gender, age, the geographical location of the end user, and historical communication data of the end user along with the profile of the service agent that handled such communications.
[0033] In one implementation, to determine the emotional state of the end user using the pre-trained AI model 204, the emotional state modelling unit 212 may identify one or more parameters from the user profile, and vectorize said one or more identified parameters using the encoding technique. Further, the emotional state modelling unit 212 classifies the vectorized set of one or more identified parameters, by using the LR classifier, to determine an emotional score. Thereafter, the emotional state modelling unit 212 determines the emotional state to identify the end user as at least one of soft spoken or aggressive. In a non-limiting example, if the determined emotional score lies between 0 and 0.5, the end user may be considered soft spoken, and if the determined emotional score lies between 0.6 and 1, the end user may be considered aggressive. It is to be appreciated that said ranges cover only an exemplary embodiment and the present disclosure is not limited to the same.
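The logistic regression scoring above may be sketched as follows. The weights, bias and the two-feature input (e.g. a normalized pitch intensity and a recent-complaint flag) are assumptions for illustration; a trained LR classifier would learn these values from historical data:

```python
import math

def sigmoid(z):
    """Logistic function mapping a raw score into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Placeholder weights and bias such as a trained LR classifier might
# assign to a vectorized two-feature input; values are assumptions.
WEIGHTS = [2.5, 1.5]
BIAS = -1.0

def emotional_score(features):
    """LR decision: the probability-like emotional score in (0, 1)."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return sigmoid(z)

def emotional_state(features):
    # 0 to 0.5 -> soft spoken; above 0.5 -> aggressive (exemplary bands).
    return "aggressive" if emotional_score(features) > 0.5 else "soft spoken"
```

Because the sigmoid output is bounded in (0, 1), the score maps directly onto the exemplary 0-to-1 bands described above.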
[0034] Coming back to Fig. 2, once the at least one probable reason for the incoming call is predicted and the tolerance level and the emotional state of the end user are determined, the processor 202 generates a report indicating the name or identification number of the end user, the at least one predicted probable reason for the call, the determined tolerance level of the end user for said predicted probable reason, and the determined emotional state of the end user. In one exemplary embodiment, the processor 202 generates the report based on data analysis performed on the predicted probable reason for the call, the determined tolerance level of the end user for said predicted probable reason and the determined emotional state of the end user. In a non-limiting example, the data analysis may be performed using available quantitative and qualitative data analysis techniques. The available quantitative data analysis techniques, which may be suitably used herein, but are not limited thereto, include regression analysis, Monte Carlo simulation, factor analysis, cohort analysis and cluster analysis. Similarly, an available qualitative data analysis technique, which may be suitably used herein, but is not limited thereto, is sentiment analysis.
[0035] Further, the processor 202 routes the incoming call from the end user to the appropriate service agent based on the report. Said report can be referred to by the service agent before answering the incoming call. In one exemplary embodiment, to route the call, the processor 202 determines a department having expertise in handling the end user's call. To determine the department, the processor 202 matches the at least one probable reason for the call indicated in the report with a list of departments available in the call center having expertise in handling matters with the same or similar reasons. Based on the matching, the processor 202 determines a best-matched department from the list of departments. Once the department is determined, the processor 202 may select the service agent having expertise in handling queries from end users with similar profiles, and forward the incoming call to the appropriate service agent of the department along with a copy of said report.
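The routing step described above reduces to matching the report's probable reason against the department list and forwarding the call with a copy of the report. A minimal sketch follows; the department names, the fallback department and the report fields are assumptions for illustration:

```python
# Illustrative sketch of the routing step: the probable reason in the
# report is matched against a list of departments.  Names are made up.
DEPARTMENTS = {
    "card transaction issue": "Card Services",
    "loan query": "Loans",
    "EMI query": "EMI Desk",
}

def route_call(report):
    """Pick the best-matched department for the report's probable
    reason and forward the call together with a copy of the report."""
    department = DEPARTMENTS.get(report["probable_reason"],
                                 "General Support")
    return {"department": department, "report": report}
```

A production system would additionally rank agents within the chosen department by their history with similar end-user profiles, as the paragraph above describes.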
[0036] According to an embodiment of the present disclosure, the above discussed units 208, 210, 212 may be dedicated hardware units capable of executing one or more instructions stored in the memory 206 for performing various operations of the AI enabled system 102. In another embodiment, the units 208, 210, 212 may be software modules stored in the memory 206 which may be executed by the processor 202 for performing the operations of the AI enabled system 102.
[0037] The embodiments described above may be easily understood by way of the following example.
Step 1: When an end user calls at a call center, the AI enabled system 102 identifies
the end user based on the incoming call to extract the end user’s identification number.
Ex: The end user is identified as user A having mobile number
+9170XXXXXXXX [end user’s identification number]
Step 2: For the above identified user A, the AI enabled system 102 retrieves at least the transaction history of the end user (context data) and at least the demographic details of the end user (historical data). Based on at least one of the retrieved context data, the retrieved historical data and a first set of parameters (as indicated in the example below), using the pre-trained AI model 204, the AI enabled system 102 may predict at least one probable reason for the call.
Ex: Suppose an organization offers three different services, such as EMI, loans, and card services, to the identified user A. As previously discussed, the AI enabled system 102 utilizes one-hot encoding to convert the available categorical data related to said services into a numerical form. Specifically, one-hot encoding creates additional features based on the number of unique values in the categorical feature/data: every unique value in the category is added as a feature to create a binary vector.
i.e. Emi would be 001;
Loan would be 010;
Card would be 100.
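The encoding above can be reproduced with a short helper; the ordering of the services is chosen so that the bit strings match the example (EMI as 001, loan as 010, card as 100):

```python
# One bit per unique service value; order chosen to match the example
# above (EMI -> 001, loan -> 010, card -> 100).
SERVICES = ["card", "loan", "EMI"]

def one_hot(service):
    """Return the one-hot binary vector for a service as a bit string."""
    return "".join("1" if service == s else "0" for s in SERVICES)
```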
For example, if the user A uses a service such as 'card' of the organization, the parameters identified by the AI enabled system 102 in this case would be:
Recent service used by the end user
Ex: Card - 100
Recent transaction of the end user
Ex: debit/credit
transaction location of the end user
Ex: Delhi/Hyderabad
Parties involved in transaction
Ex: DirecPay/Paypal
Current status of the transaction
Ex: Success/failure
Current location of the end user (if available)
Ex: Andhra Pradesh
Twitter data [for Last N days data related to service]
The social platform [ex: Twitter] would be monitored for the organization to retrieve data containing these services for the last N days [configurable].
Each tweet would be passed through a sentiment analyser to understand the overall current public opinion of the service. To calculate sentiment, the AI enabled system 102 preferably utilizes VADER sentiment analysis.
Ex: tweet: I had multiple failed transactions using SBI credit card. #SBI please
answer!
Sentiment= {'neu': 0.189, 'neg': 0.515, 'pos': 0.297, 'compound': 0.2244}
So, the sentiment of that statement is 0.515 (negative). All the sentiments for a service would be averaged to get a consolidated opinion. For example, sentiment of a service = sum of sentiments of each tweet related to the service / total tweets related to that service. This opinion helps predict whether the end user who has called is also facing an ongoing issue with the services. The output of the Twitter analysis would be, for example, [-0.2, 0.8, -0.4] for the three services (EMI, loans, card services). The AI enabled system 102 will therefore provide the details of the most probable reason and also the details of two more possible reasons, in the order of their confidence.
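The consolidation step above can be sketched as follows. This is a minimal illustration of the averaging formula only; the per-tweet scores below are made-up placeholders, not real VADER outputs, and the names are assumptions for the example.

```python
# Sketch of the consolidation step: average the per-tweet sentiment
# scores for each service to get one consolidated opinion per service.
# The scores here are illustrative, not outputs of a real analyser.

def service_sentiment(tweet_scores):
    """Sentiment of a service = sum of per-tweet scores / number of tweets."""
    return sum(tweet_scores) / len(tweet_scores)

scores_by_service = {
    "emi":  [-0.3, -0.1],
    "loan": [0.9, 0.7],
    "card": [-0.5, -0.3],
}
opinion = {svc: service_sentiment(s) for svc, s in scores_by_service.items()}
print(opinion)  # consolidated opinion per service, e.g. {'emi': -0.2, ...}
```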
Step 3: Further, for the identified user A, the AI enabled system 102 determines a
tolerance level for the predicted probable reason. For example, the AI enabled system
102 identifies a second set of parameters from the historical data:
Ex: Worth of the end user,
Numeric value of the business value of that end user [1 lakh, 10 lakh],
Type of the end user [partnership/trustee etc.],
Gender [male/female],
Age group [youth/middle age],
Subscriptions of the end user [yearly/monthly],
How frequently he/she calls the call center [last 1 month],
Time of the call,
Last issue status [whether the issue was resolved/in progress],
End user’s rating of previous issues,
Emotion level of the user [soft/aggressive]
Considering the above parameters, the AI enabled system 102 determines that user A is low tolerant [tolerance score 0 to 0.3] to the reason that multiple transactions failed using the SBI credit card.
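The tolerance-score banding used above can be sketched as follows. The entropy helper reflects the entropy-based splitting the specification describes for training the decision-tree tolerance classifier; the band boundaries come from the specification, while the function names and example counts are assumptions for illustration.

```python
import math

# Sketch of the tolerance determination: an entropy helper of the kind
# used when splitting parameters for a decision tree, and the banding of
# the resulting tolerance score into the ranges given in the example.

def entropy(class_counts):
    """Shannon entropy (in bits) of a class-count distribution."""
    total = sum(class_counts)
    probs = [c / total for c in class_counts if c]
    return -sum(p * math.log2(p) for p in probs)

def tolerance_band(score):
    """Map a tolerance score to the bands given in the specification."""
    if score >= 0.8:
        return "highly tolerant"
    if score >= 0.4:
        return "moderately tolerant"
    return "low tolerant"

print(entropy([5, 5]))      # maximally mixed split -> 1.0
print(tolerance_band(0.2))  # -> low tolerant, as for user A above
```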
Step 4: The AI enabled system 102 may determine an emotional state of the end user A based on the behavioural nature of the end user and categorize user A accordingly, using the parameters listed below on the historical data to understand whether the person is soft spoken or aggressive.
Ex: Gender [male],
Age [55],
Geographical location [Andhra Pradesh],
Pitch intensity [high]
Based on the above parameters, the AI enabled system 102 determines that the end
user A is aggressive in nature.
Step 5: The AI enabled system 102 generates a report in real-time based on the
predicted probable reason for the call, the determined tolerance level of the end
user for said predicted probable reason and the determined emotional state of the end
user. Further, the AI enabled system 102 may route said incoming call from the end user to
the appropriate service agent based on the report. Said report can be referred to by the
service agent before answering the incoming call.
Ex: Report [User A || reason for call: multiple failed transactions using SBI credit
card, loan processing failed, etc, || Tolerance level: low tolerance [0 to 0.3] || emotional
state: aggressive]
Step 6: Based on said report, the AI enabled system 102 determines a department having
such expertise, say “Credit Card Fraud”, and selects the service agent, say service agent
X. After the right service agent is identified, the incoming call is forwarded or routed to
said service agent of said department based on the report, along with a copy of said
report.
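Steps 5 and 6 above can be sketched as follows. This is a hypothetical illustration only: the department names, the keyword matching, and all function names are assumptions, not the claimed matching logic.

```python
# Hypothetical sketch of steps 5 and 6: assemble the report, then pick a
# department whose expertise keyword appears in the predicted reason.

def build_report(user, reason, tolerance, emotional_state):
    return {
        "user": user,
        "reason": reason,
        "tolerance": tolerance,
        "emotional_state": emotional_state,
    }

def route_call(report, departments):
    """Return the first department whose keyword appears in the reason."""
    reason = report["reason"].lower()
    for keyword, department in departments.items():
        if keyword in reason:
            return department
    return "General Support"  # assumed fallback department

departments = {"credit card": "Credit Card Fraud", "loan": "Loans Desk"}
report = build_report("User A",
                      "multiple failed transactions using SBI credit card",
                      "low tolerance [0 to 0.3]", "aggressive")
print(route_call(report, departments))  # -> Credit Card Fraud
```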
[0038] Fig. 3 discloses an Artificial Intelligence (AI) based method 300 for
routing an incoming call from an end user to an appropriate service agent. The order in
which the method 300 is described is not intended to be construed as a limitation, and
any number of the described method blocks can be combined in any order to implement
the method 300 or alternate methods. Additionally, individual blocks may be deleted
from the method 300 without departing from the spirit and scope of the subject matter
described herein. Furthermore, the method can be implemented in any suitable
hardware, software, firmware, or combination thereof. However, for ease of
explanation, in the embodiments described below, the method 300 may be considered
to be implemented by the AI enabled system 102.
[0039] At step 302, the method 300 recites identifying the end user based on the incoming call. In one implementation, to identify the end user, the method describes mapping contact details of the incoming call with contact details pre-stored in the historical database 214. After mapping, the method describes comparing metadata associated with the incoming call with metadata associated with the mapped contact details from the historical database 214, to extract the end user’s identification number.
[0040] At step 304, the method recites retrieving at least one of context data and historical data for the identified end user from a profile database 218. In an exemplary embodiment, the profile database 218 may be configured to store the profiles of a plurality of end users who are end users of that organization and have called at least once earlier. In one non-limiting example, the retrieved context data is indicative of the transaction history of the end user, and the retrieved historical data is at least indicative of demographic details of the end user.
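The identification at step 302 can be sketched as a lookup of the caller's number against pre-stored contact records. This is a minimal sketch under assumed data; the record contents are hypothetical, and the masked number is the placeholder from the example above.

```python
# Minimal sketch of step 302: map the caller's number to a pre-stored
# contact record to extract the end user's identification number.
# The record below is hypothetical; "+9170XXXXXXXX" is the masked
# number used in the example walkthrough.

historical_contacts = {
    "+9170XXXXXXXX": {"user_id": "A", "name": "User A"},
}

def identify_end_user(caller_number):
    """Return the stored user ID for a known caller, else None."""
    record = historical_contacts.get(caller_number)
    return record["user_id"] if record else None

print(identify_end_user("+9170XXXXXXXX"))  # -> A
```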
[0041] Further, at step 306, the method describes predicting a probable reason for said call based on at least one of the retrieved context data, the retrieved historical data and a first set of parameters, using the pre-trained AI model 204. In one aspect, though not exclusively disclosed, the method 300 describes that the AI model 204 is pre-trained to predict the probable reasons by vectorizing the first set of parameters using an encoding technique such as one-hot encoding and thereafter classifying, using a Support Vector Machine (SVM) classifier, a probable reason from the vectorized set of parameters. Though not exclusively disclosed, in another implementation, the method 300 further discloses that if several other end users have been calling for the same reason in the past, the user profiles of said end users are analysed. In this case, to predict at least one probable reason for said call, the method describes determining one or more other end users who had called in the past for a reason similar to the predicted probable reason. The method further describes analysing the profiles of each of said one or more other end users and said end user and calculating a similarity score between them. The method further describes comparing the calculated similarity score with a predefined threshold. Thus, based on the comparison, if it is determined that the similarity score between said end user’s profile and the profile of each of said one or more other end users is greater than the predefined threshold, the method describes predicting the at least one probable reason for the incoming call. On the other hand, if it is determined that the similarity score between said end user’s profile and the profile of each of said one or more other end users is less than the predefined threshold, all such user profiles of said one or more other end users will not be considered to predict the at least one probable reason for said call.
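The profile-similarity check described above can be sketched as follows, assuming cosine similarity over encoded profile vectors as the similarity score; the specification does not fix the similarity measure, so that choice, the threshold value and the names are assumptions.

```python
import math

# Sketch of the similarity check in step 306: compare the caller's
# encoded profile vector with profiles of users who called for a similar
# reason, and keep only those above a predefined threshold. Cosine
# similarity is an assumed measure; the vectors are illustrative.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def similar_profiles(caller, others, threshold=0.8):
    """Keep only profiles whose similarity to the caller exceeds the threshold."""
    return [p for p in others if cosine_similarity(caller, p) > threshold]

caller_profile = [1, 0, 1, 1]                    # hypothetical encoded profile
other_profiles = [[1, 0, 1, 1], [0, 1, 0, 0]]    # past callers' profiles
print(similar_profiles(caller_profile, other_profiles))  # -> [[1, 0, 1, 1]]
```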
[0042] Moving ahead, at step 308 the method discloses determining a tolerance level of the identified end user using the pre-trained AI model 204. In an embodiment, said model is pre-trained to determine the tolerance level by classifying a second set of parameters using a decision tree classifier. In one exemplary implementation, though not exclusively disclosed in the method steps, the method 300 further comprises determining the tolerance level using the pre-trained AI model 204. For this, the method describes calculating an entropy for each of one or more parameters selected from the second set of parameters and splitting the one or more parameters into subsets of the one or more parameters based on the calculated entropy. Further, the method describes classifying the subsets of the one or more parameters by using the decision tree classifier. Thereafter, based on the classification, the method describes determining in real-time a tolerance score to identify the end user as at least one of highly tolerant [if the tolerance score is between 0.8 and 1], moderately tolerant [if the tolerance score is between 0.4 and 0.7] and low tolerant [if the tolerance score is between 0 and 0.3] to the at least one predicted probable reason. It is to be appreciated that said ranges cover only an exemplary embodiment and the present disclosure is not limited to the same.
[0043] At step 310, the method describes determining an emotional state of the end user from a user profile of the identified end user using the pre-trained AI model. In an exemplary embodiment, said AI model 204 is pre-trained to determine the emotional state from a user profile of the identified end user. To achieve the same, said model is pre-trained to determine the emotional state by vectorizing a third set of parameters using the encoding technique and classifying, by a Logistic Regression (LR) classifier, an emotional state from the vectorized third set of parameters.
Though not exclusively disclosed in the method steps, the method 300, in one implementation, to determine the emotional state of the end user using the pre-trained AI model 204, describes identifying one or more parameters from the third set of parameters and vectorizing said one or more identified parameters using the encoding technique. Further, the method describes classifying, using the LR classifier, the vectorized third set of parameters to determine an emotional score, and determining the emotional state to identify the end user as at least one of soft spoken or aggressive based on the emotional score.
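The LR classification at step 310 can be sketched as a logistic function over the encoded third set of parameters, thresholded into the two states. The weights, threshold and feature encoding below are hypothetical, not values learned or disclosed anywhere in the specification.

```python
import math

# Sketch of step 310: a logistic function over an encoded parameter
# vector yields an emotional score, which is thresholded into
# "soft spoken" vs "aggressive". Weights and threshold are hypothetical.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def emotional_state(features, weights, bias=0.0, threshold=0.5):
    """Return (state, score) for an encoded feature vector."""
    score = sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)
    return ("aggressive" if score >= threshold else "soft spoken"), score

# Assumed encoding: [gender=male, age band=senior, pitch intensity=high]
state, score = emotional_state([1, 1, 1], weights=[0.2, 0.3, 1.5])
print(state)  # -> aggressive
```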
[0044] At step 312, the method describes generating a report indicating the name or identification number of the end user, the predicted probable reason for the call, the determined tolerance level of the end user for said predicted probable reason and the determined emotional state of the end user. In an exemplary embodiment, the method describes generating the report based on data analysis performed on the predicted probable reason for the call, the determined tolerance level of the end user for said predicted probable reason and the determined emotional state of the end user.
[0045] Lastly, at step 314, the method describes routing said incoming call from the end user to the appropriate service agent based on said report. In one implementation, for routing the call, the method describes determining, in advance, a department having expertise in handling the end user’s call. In an exemplary embodiment, the method describes determining the department by matching the at least one probable reason for the call indicated in the report with a list of departments available in the call center having expertise in handling matters with the same or similar reasons. The method further describes determining a best matched department from the list of departments, based on the matching. Once the department is determined, the method describes selecting the service agent having expertise in handling the end user’s queries with similar profiles. The incoming call is then forwarded to an appropriate service agent of the department based on the report, along with a copy of said report.
[0046] In this manner, a department of preferred expertise is selected, and a right service agent may be assigned from said department to facilitate a meaningful conversation between the end user and the service agent and thereby maintain end users’ engagement with an organization.
[0047] The system and method disclosed in the present disclosure can be further understood with help of fig. 4 of the present disclosure. Fig. 4 illustrates a process 400
for routing an incoming call from an end user to an appropriate service agent, in accordance with an embodiment of the present subject matter.
[0048] Fig. 4 shows at block 402, a call being initiated by the end user to get a resolution to a problem encountered by the end user. In a non-limiting example, the problem may be some challenge with the operability of the product/services offered by an organization. At block 404, the end user that has initiated the call is identified. In one implementation, the end user is identified by mapping contact details of the incoming call with contact details pre-stored in the user data store. Metadata associated with the incoming call is compared with metadata associated with the mapped contact details from the user data store, to extract the end user’s identification (ID) number using a user feature extractor.
[0049] At block 406, the AI enabled system 102 discloses using the end user’s ID to retrieve context data and historical data from databases such as user specific context database, organization’s sale and complaint database, pertaining to the identified end user in order to predict a probable reason for the call. In one aspect, the probable reason for the call may be predicted based on the pre-trained AI model 204 trained using a first set of parameters. In another exemplary embodiment, the AI enabled system 102 may further predict the reason based on social networking posts published by the end user on his/her social networking platforms including Twitter™, Facebook™, LinkedIn™ etc.
[0050] At block 408, in one implementation, the AI enabled system 102 may be configured to predict the probable reason for the call based on the context data and the historical data pertaining to the identified end user and the first set of parameters. It may be noted that if several other end users have been calling for a similar reason as predicted by the reason identifier module, the AI enabled system 102 captures that by using collaborative filtering. In another implementation, the context data and the historical data pertaining to the identified end user and the first set of parameters may be fed to a reason identifier module which may then predict the probable reason for the call by the end user.
[0051] At block 410, in addition to the prediction of the reason for the call, the AI enabled system 102 further determines the tolerance level of the end user. In one
implementation, the AI enabled system 102 determines the tolerance level upon training the AI model 204 on a second set of parameters using a tolerance identifier. In a non-limiting example, the tolerance identifier is a topical tolerance identifier. In an aspect, a topical tolerance determining module (not shown) may be configured to determine the tolerance level of the end user.
[0052] At block 412, the AI enabled system 102 may further determine an emotional state of the end user to understand whether the end user is soft spoken or aggressive. In one implementation, the AI enabled system 102 determines the emotional state upon training the AI model 204 on a third set of parameters using an emotional state classifier. In other implementation, an emotional state determining module (not shown) may determine the emotional state of the end user.
[0053] At block 414, based on a data analysis performed by the AI enabled system 102 on the first set of parameters, the second set of parameters, and the third set of parameters, the AI enabled system 102 may proactively create a report regarding the incoming call which may then be referred by the service agent before answering the call. Accordingly, the AI enabled system 102 determines whether the call needs to be routed to a service agent of a department having an expertise in handling end user’s queries or reasons mentioned in the report.
[0054] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.
[0055] Exemplary embodiments as discussed above may provide certain advantages. These advantages are mentioned here below.
[0056] The embodiments of the present disclosure provide an AI based system and an AI based method for routing an incoming call from an end user to an appropriate service agent.
[0057] The embodiments of the present disclosure provide an AI based system and an AI based method for predicting the reason of the call well before the end user gives out the reason himself/herself explicitly and using that reason to automatically route the call to the appropriate department.
[0058] The embodiments of the present disclosure provide an AI based system and an AI based method for real-time report generation for the end user based on probable reason identification, tolerance level and emotional level understanding of the end user. Said report is augmented with the incoming call to the appropriate service agent for getting into the context without asking for too many details from the end user. This ensures handling emotionally exceptional users so as to give them a good experience.
[0059] The embodiments of the present disclosure provide an AI based system and an AI based method to improve end user satisfaction, reduce waiting time, and reduce overall resolution time.
[0060] Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope of the disclosed embodiments.
[0061] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[0062] Suitable processors/controllers include, by way of example, a general-purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
[0063] Reference Numerals:
100 Network implementation
102 AI enabled System
104-1,104-2 …104-N User devices
104 User
202 Processor
204 AI model
206 Memory
208 Identification modelling unit
210 Tolerance level modelling unit
212 Emotional state modelling unit
214 Historical database
216 Context database
218 Profile database
220 input/output (I/O) unit
300 Flow diagram of Method
302- 314 Method steps
400 Process
402- 414 Process steps
We claim:
1. An Artificial Intelligence (AI) based method for routing an incoming call from an end user to an appropriate service agent, the method comprising:
identifying (302), by a processor (202), the end user based on the incoming call;
retrieving (304), by the processor (202), at least one of context data and historical data, for the identified end user, from a profile database (218), wherein the retrieved context data is at least indicative of transaction history of the end user, and the retrieved historical data is indicative of demographic details of the end user;
predicting (306), by an identification modelling unit (208), at least one probable reason for said call based on at least one of the retrieved context data, the retrieved historical data and a first set of parameters using a pre-trained AI model (204), wherein said model is pre-trained to predict the probable reasons by:
vectorizing the first set of parameters using an encoding technique; and
classifying, by a Support Vector Machine (SVM) classifier, a probable reason from the vectorized first set of parameters;
determining (308), by a tolerance level modelling unit (210), a tolerance level of the identified end user using the pre-trained AI model (204), for the at least one predicted probable reason, wherein said model is pre-trained to determine the tolerance level by classifying a second set of parameters using a decision tree classifier;
determining (310), by an emotional state modelling unit (212), an emotional state of the end user from a user profile of the identified end user, using the pre-trained AI model (204), wherein said model is pre-trained to determine the emotional state by:
vectorizing a third set of parameters using the encoding technique; and
classifying, by a Logistic Regression (LR) classifier, an emotional state from the vectorized third set of parameters;
generating (312), by the processor (202), a report based on the at least one of the predicted probable reason for the call, the determined tolerance level of the end user for said predicted probable reason and the determined emotional state of the end user; and
routing (314), by the processor (202), said incoming call from the end user to the appropriate service agent based on said report.
2. The method as claimed in claim 1, wherein the context data comprises at least one of recent transaction of the end user, transaction location of the end user, other parties involved in transaction, current status of the transaction, current location of the end user, the transactions which may trigger escalations or calls to the call center, time of the call, tolerance level of specific end user towards a given type of reason, kind of incidents possible with the predicted reason of the incoming call and like parameters, and wherein the historical data comprises at least one of worth of the end user, type of the end user, gender, age, yearly or monthly subscriptions of the end user, how frequently he/she calls the call center, time of the call, current issue, last issue status such as whether issue was resolved or in progress, emotional state of the end user, time sensitivity of the end user, actual location of the end user and like parameters.
3. The method as claimed in claim 1, wherein the first set of parameters include at least one of a recent service used by the end user, recent transaction of the end user, transaction location of the end user, other parties involved in transaction, current status of the transaction, current location of the end user, and data from previous communications such as chat data from chatbots or text data from mail threads along with audio, video call interactions, social networking posts published by the end users related to the service, organization’s sale and complaint data associated to a plurality of the end users, and wherein the second set of parameters include at least one of worth of the end user, type of the end user, gender, age, yearly or monthly subscriptions of the end user, how frequently the end user calls the call center, time of the call, current issue, last issue status such as whether issue was resolved or in progress, and end user rating to previous issues, and wherein the third set of parameters include at least one of pitch intensity, gender, age, geographical location of the end user, and historical communication data of the end user along with a service agent’s profile that handled such communications.
4. The method as claimed in claim 1, wherein predicting (306) the at least one probable reason for said call, using said pre-trained AI model (204), further comprising:
identifying, by the identification modelling unit (208), one or more parameters from the retrieved context data and the retrieved historical data;
vectorizing, by the identification modelling unit (208), said one or more identified parameters from the retrieved context data and the retrieved historical data of the identified end user, and the first set of parameters using the encoding technique; and
classifying, by the identification modelling unit (208), using the SVM classifier, to predict the at least one probable reason for the said call from the vectorized set of parameters.
5. The method as claimed in claim 1, wherein predicting (306) at least one probable reason
for said call, using said pre-trained AI model (204), further comprising:
determining, by the identification modelling unit (208), based on the retrieved context data and the retrieved historical data, one or more other end users who have called in the past for a reason similar to the predicted probable reason;
analyzing, by the identification modelling unit (208), user profiles of each of said one or more other end users and said end user and calculating a similarity score between them;
comparing, by the identification modelling unit (208), the calculated similarity score with a predefined threshold; and
predicting, by the identification modelling unit (208), the at least one probable reason for said call based on the comparison, if it is determined that the similarity score between said end user profile and the one or more other end user profiles is greater than the predefined threshold.
6. The method as claimed in claim 1, wherein determining (308) the tolerance level of the
identified end user, using said pre-trained AI model (204) comprises:
calculating, by the tolerance level modelling unit (210), entropy for each one or more parameters selected from the second set of parameters, and splitting the one or more parameters into subsets based on calculated entropy;
classifying, by the tolerance level modelling unit (210), using the decision tree classifier, the subsets of the one or more parameters based on splitting; and
determining, by the tolerance level modelling unit (210), a tolerance score based on classification to identify the end user as at least one of highly tolerant, moderately tolerant and low tolerant.
7. The method as claimed in claim 1, wherein determining (310) the emotional state of
the end user from the user profile, using said pre-trained AI model (204) further
comprising:
identifying, by the emotional state modelling unit (212), one or more parameters from the user profile;
vectorizing, by the emotional state modelling unit (212), said one or more identified parameters using the encoding technique; and
classifying, by the emotional state modelling unit (212), using the LR classifier, the vectorized set of one or more identified parameters to determine an emotional score; and
determining, by the emotional state modelling unit (212), the emotional state to identify the end user as at least one of soft spoken or aggressive, based on the emotional score.
8. The method as claimed in claim 1, wherein routing (314) said incoming call from the
end user to the appropriate service agent based on the report, comprising:
determining, by the at least one processor (202), a department having an expertise in handling the end user’s call;
selecting, by the at least one processor (202), the service agent having an expertise in handling the end user’s call with similar profiles; and
forwarding, by the at least one processor (202), said incoming call to an appropriate service agent of the department based on the report along with a copy of said report.
9. An Artificial Intelligence (AI) based system (102) for routing an incoming call from an
end user to an appropriate service agent, the system (102) comprises:
a processor (202) configured to:
identify the end user based on the incoming call;
retrieve at least one of context data and historical data, for the identified end user, from a profile database (218), wherein the retrieved context data is at least indicative of transaction history of the end user, and the retrieved historical data is at least indicative of demographic details of the end user;
an identification modelling unit (208) coupled to said processor (202) and configured to predict at least one probable reason for said call based on at least one of the retrieved context data, the retrieved historical data and a first set of parameters using a pre-trained AI model (204), wherein said model is pre-trained to predict the probable reasons for the call by:
vectorizing the first set of parameters using an encoding technique; and
classifying, by a Support Vector Machine (SVM) classifier, a probable reason from the vectorized first set of parameters;
a tolerance level modelling unit (210) coupled to said processor (202) and configured to determine a tolerance level of the identified end user using the pre-trained AI model (204), for the at least one predicted probable reason, wherein said model is pre-trained to determine the tolerance level by classifying a second set of parameters using a decision tree classifier; and
an emotional state modelling unit (212) coupled to said processor (202) and configured to determine an emotional state of the end user from a user profile of the identified end user, using the pre-trained AI model (204), wherein said model is pre-trained to determine the emotional state by:
vectorizing a third set of parameters using the encoding technique; and
classifying, by a Logistic Regression (LR) classifier, an emotional state from the vectorized third set of parameters;
wherein the processor (202) is further configured to:
generate a report based on the at least one of the predicted probable reason for the call, the determined tolerance level of the end user for said predicted probable reason and the determined emotional state of the end user; and
route said incoming call from the end user to the appropriate service agent based on the report.
10. The system as claimed in claim 9, wherein to predict the at least one probable reason for said call, using said pre-trained AI model (204), the identification modelling unit (208) is configured to:
identify one or more parameters from the retrieved context data and the retrieved historical data;
vectorize said one or more identified parameters from the retrieved context data and the retrieved historical data of the identified end user, and the first set of parameters using the encoding technique; and
classify using the SVM classifier, to predict the at least one probable reason for the said call from the vectorized set of parameters.
11. The system as claimed in claim 9, wherein to predict at least one probable reason for
said call, using said pre-trained AI model (204), the identification modelling unit (208)
is further configured to:
determine, based on the retrieved context data and the retrieved historical data, one or more other end users who have called in the past for a reason similar to the predicted probable reason;
analyze user profiles of each of said one or more other end users and said end user and calculate a similarity score between them;
compare the calculated similarity score with a predefined threshold; and
predict the at least one probable reason for said call based on the comparison, if it is determined that the similarity score between said end user profile and the one or more other end user profiles, is greater than the predetermined threshold.
12. The system as claimed in claim 9, wherein to determine the tolerance level of the
identified end user, using said pre-trained AI model, (204) the tolerance level modelling
unit (210) is configured to:
calculate entropy for each of the one or more parameters selected from the second set of parameters, and split the one or more parameters into subsets based on the calculated entropy;
classify, using the decision tree classifier, the subsets of the one or more parameters based on the splitting; and
determine a tolerance score based on the classification to identify the end user as at least one of highly tolerant, moderately tolerant and low tolerant.
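The entropy-driven splitting recited in claim 12 can be illustrated with a single information-gain computation, which is the first step a decision-tree classifier performs when choosing where to split. The parameter values and tolerance labels here are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

# Hypothetical records: (parameter value, observed tolerance label).
records = [
    ("few_past_complaints", "high"),
    ("few_past_complaints", "high"),
    ("many_past_complaints", "low"),
    ("many_past_complaints", "moderate"),
]

labels = [tolerance for _, tolerance in records]
parent_entropy = entropy(labels)

# Split records into subsets by the parameter value and weigh each
# subset's entropy by its share of the records.
subsets = {}
for value, label in records:
    subsets.setdefault(value, []).append(label)
weighted = sum(len(s) / len(records) * entropy(s) for s in subsets.values())

# Information gain: entropy removed by splitting on this parameter.
information_gain = parent_entropy - weighted
print(round(information_gain, 3))  # → 1.0
```

A full decision tree repeats this computation for every candidate parameter, splits on the one with the highest gain, and recurses into each subset; the leaf reached by the caller's parameters then yields the tolerance score.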
13. The system as claimed in claim 9, wherein to determine the emotional state of the end user from the user profile, the emotional state modelling unit (212) is configured to:
identify one or more parameters from the user profile;
vectorize said one or more identified parameters using the encoding technique;
classify, using the LR classifier, the vectorized set of one or more identified parameters to determine an emotional score; and
determine the emotional state to identify the end user as at least one of soft spoken or aggressive, based on the emotional score.
14. The system as claimed in claim 8, wherein, while routing said incoming call from the end user to the appropriate service agent based on the report, the processor (202) is configured to:
determine a department having expertise in handling the end user's call;
select the service agent having expertise in handling calls from end users with similar profiles; and
forward said incoming call to the appropriate service agent of the department, based on the report, along with a copy of said report.
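The emotional scoring of claim 13 and the report handed to the agent in claim 14 can be sketched together. A trained LR (logistic regression) model would supply the weights; the hand-set weights, feature names, and 0.5 decision threshold below are illustrative assumptions only.

```python
import math

def sigmoid(z):
    """Logistic function mapping a real-valued score into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical LR weights over encoded user-profile features.
WEIGHTS = {"recent_escalations": 1.2, "raised_voice_flag": 2.0}
BIAS = -1.5

def emotional_score(features):
    """LR-style score in (0, 1); higher means more aggressive."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return sigmoid(z)

def build_report(reason, tolerance, features):
    """Assemble the report routed to the service agent with the call."""
    state = "aggressive" if emotional_score(features) > 0.5 else "soft_spoken"
    return {"probable_reason": reason,
            "tolerance": tolerance,
            "emotional_state": state}

report = build_report("service_down", "low",
                      {"recent_escalations": 2, "raised_voice_flag": 1})
print(report["emotional_state"])  # → aggressive
```

The resulting dictionary mirrors the three report fields the claims enumerate (probable reason, tolerance level, emotional state); the routing step would then pick an agent and attach this report to the forwarded call.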
| # | Name | Date |
|---|---|---|
| 1 | 202021038486-Correspondence to notify the Controller [07-03-2025(online)].pdf | 2025-03-07 |
| 2 | 202021038486-Response to office action [08-06-2023(online)].pdf | 2023-06-08 |
| 3 | 202021038486-STATEMENT OF UNDERTAKING (FORM 3) [07-09-2020(online)].pdf | 2020-09-07 |
| 4 | 202021038486-Written submissions and relevant documents [26-03-2025(online)].pdf | 2025-03-26 |
| 5 | 202021038486-PROVISIONAL SPECIFICATION [07-09-2020(online)].pdf | 2020-09-07 |
| 6 | 202021038486-FORM-26 [07-03-2025(online)].pdf | 2025-03-07 |
| 7 | 202021038486-CLAIMS [10-03-2023(online)].pdf | 2023-03-10 |
| 8 | 202021038486-COMPLETE SPECIFICATION [10-03-2023(online)].pdf | 2023-03-10 |
| 9 | 202021038486-POWER OF AUTHORITY [07-09-2020(online)].pdf | 2020-09-07 |
| 10 | 202021038486-US(14)-HearingNotice-(HearingDate-11-03-2025).pdf | 2025-02-11 |
| 11 | 202021038486-DRAWING [10-03-2023(online)].pdf | 2023-03-10 |
| 12 | 202021038486-FORM 1 [07-09-2020(online)].pdf | 2020-09-07 |
| 13 | 202021038486-FER_SER_REPLY [10-03-2023(online)].pdf | 2023-03-10 |
| 14 | 202021038486-DRAWINGS [07-09-2020(online)].pdf | 2020-09-07 |
| 15 | 202021038486-OTHERS [10-03-2023(online)].pdf | 2023-03-10 |
| 16 | 202021038486-DECLARATION OF INVENTORSHIP (FORM 5) [07-09-2020(online)].pdf | 2020-09-07 |
| 17 | 202021038486-Proof of Right [27-01-2021(online)].pdf | 2021-01-27 |
| 18 | 202021038486-FER.pdf | 2022-09-13 |
| 19 | 202021038486-DRAWING [21-07-2021(online)].pdf | 2021-07-21 |
| 20 | 202021038486-FORM 18 [13-05-2022(online)].pdf | 2022-05-13 |
| 21 | 202021038486-CORRESPONDENCE-OTHERS [21-07-2021(online)].pdf | 2021-07-21 |
| 22 | Abstract1.jpg | 2022-02-09 |
| 23 | 202021038486-COMPLETE SPECIFICATION [21-07-2021(online)].pdf | 2021-07-21 |
| 24 | SearchHistoryE_13-09-2022.pdf | |