ABSTRACT
A SYSTEM AND METHOD FOR EVALUATING A CLIENT ENGAGEMENT
The present disclosure relates to the field of evaluating client engagement. The system (100) for evaluating a client engagement comprises a repository (102), an assessment module (104), a first summation unit (106), a second summation unit (108) and an identifier (110). A set of questions is displayed from the prospective questions corresponding to a user ID of a first user associated with a first communication device (112) to receive a first set of responses. A second set of inputs, corresponding to the attributes stored in the repository (102), is received from the first communication device (112), and a third set of inputs corresponding to the metrics is received from a second communication device (114). The first and third sets of inputs are aggregated to generate a first aggregated value, which is then aggregated with the summation of the second set of inputs to compute a score value that identifies the engagement level. The system (100) offers different parameters for the evaluation of the engagement.
FIELD
The present invention relates to the field of evaluating client engagement.
BACKGROUND
The background information herein below relates to the present disclosure but is not necessarily prior art.
When a company provides services to a client, the relationship between the company and the client is known as an engagement. A company can have as many engagements as it has clients. Once an engagement is initiated, it may get nurtured, become stagnant, or grow sporadically; many engagements fail to grow at all. The reasons why some engagements work and others fail miserably are often unknown to the company.
Currently, there is no benchmark to evaluate where the engagement is heading. Further, the team which manages engagements does not have any method or any automated scientific/technological basis for understanding the reasons behind the growth or failure of engagements.
Hence, the current scenario lacks any yardstick to measure the relationship of a company with its client. Moreover, there is a need to evaluate the major areas of concern in a client portfolio, so that the company, or the team directly dealing with the client, can highlight or prioritize those areas and then work on them to achieve steady growth with the client. Further, the company must be able to identify the clients with whom it has better and more profitable growth prospects.
There is, therefore, felt a need for a system and method to evaluate a client engagement so as to alleviate the drawbacks of the prior art.
OBJECTS
Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are as follows:
It is an object of the present disclosure to ameliorate one or more problems of the prior art or to at least provide a useful alternative.
An object of the present disclosure is to provide a system to evaluate client engagement.
Another object of the present disclosure is to provide a system that offers different parameters for the evaluation of the engagement.
Yet another object of the present disclosure is to provide a system that provides different questions to different users and analyses the received responses to select the subsequent question, so as not to repeat the same question.
Other objects and advantages of the present disclosure will be more apparent from the following description, which is not intended to limit the scope of the present disclosure.
SUMMARY
The present disclosure envisages a system to evaluate a client engagement. The system comprises a repository, an assessment module, a first summation unit, a second summation unit and an identifier.
The repository is configured to store pre-determined conversion rules, a first look up table having a list of prospective questions and at least one user ID corresponding to each question, a second look up table having a list of metrics and a scale corresponding to each metric, a third look up table having a list of engagement levels and a range of values corresponding to each engagement level, a fourth look up table having a list of responses and a value corresponding to each response, and a fifth look up table having a list of parameters and a plurality of attributes corresponding to each parameter.
In an embodiment, an NLP module is configured to generate the prospective questions based on a pre-stored set of periodically updated keywords and a set of pre-determined natural language processing rules that are pre-stored in the repository.
In an embodiment, the first and the third set of inputs pertain to a set of parameters selected from the group corresponding to contractual factors, strategical factors, tactical factors and operational factors, and the second set of inputs pertain to the parameters, based on the fifth look up table, selected from the group corresponding to factors based on business and customer, and factors corresponding to people.
The assessment module is configured to cooperate with the repository to:
• display a set of questions from the prospective questions corresponding to a user ID of a first user associated with a first communication device, and is further configured to receive a first set of responses for the set of questions, wherein a question from the set of questions is selected based on analysis of historical responses provided by the user for displayed questions;
• display the attributes on the first communication device to receive a second set of inputs corresponding to the attributes based on the fifth look up table; and
• display the metrics on a second communication device associated with a second user and receive a third set of inputs corresponding to the metrics based on the second look up table.
The assessment module includes a first crawler and extractor, an analyser, a second crawler and extractor and a third crawler and extractor.
The first crawler and extractor is configured to crawl through the first look up table to extract the question corresponding to the user ID. The analyser is configured to cooperate with the first communication device to receive the response, and is further configured to cooperate with the first crawler and extractor to analyse the at least one historical response corresponding to the displayed questions and select the subsequent question based on the first look up table and the analysis. The second crawler and extractor is configured to crawl through the second look up table to extract the metrics. The third crawler and extractor is configured to crawl through the fifth look up table to extract the attributes corresponding to the parameters.
The first summation unit is configured to cooperate with the assessment module to compute a summation of the second set of inputs.
The second summation unit is configured to cooperate with the assessment module, the repository and the first summation unit to convert the third set of inputs into a set of scores based on the scale corresponding to each metric using the pre-determined conversion rules, extract the value for each response in the first set of responses based on the fourth look up table, aggregate the extracted values and the set of scores to generate a first aggregated value, and aggregate the first aggregated value with the summation of the second set of inputs to compute a score value.
The second aggregator of the second summation unit is configured to cooperate with the first aggregator and the first summation unit to aggregate the first aggregated value with the summation of the second set of inputs to compute the score value.
In an embodiment, the pre-determined conversion rules are based on machine learning and big data analysis.
In an embodiment, the second summation unit, while performing the conversion, removes at least one outlier so that discrepancies in the received inputs do not distort the analysis.
The identifier is configured to cooperate with the second summation unit and the repository to identify the engagement level of the client based on the computed score value and the third look up table.
The assessment module, the first summation unit, the second summation unit, and the identifier are implemented using one or more processor(s).
The present disclosure envisages a method for evaluating client engagement.
The method includes the following steps:
• storing, by a repository, pre-determined conversion rules, a first look up table having a list of prospective questions and at least one user ID corresponding to each question, a second look up table having a list of metrics and a scale corresponding to each metric, a third look up table having a list of engagement levels and a range of values corresponding to each engagement level, a fourth look up table having a list of responses and a value corresponding to each response, and a fifth look up table having a list of parameters and a plurality of attributes corresponding to each parameter;
• displaying, by an assessment module, a set of questions from the prospective questions corresponding to a user ID of a first user associated with a first communication device;
• receiving, by the assessment module, a first set of responses for the set of questions, wherein a question from the set of questions is selected based on analysis of historical responses provided by the user for displayed questions;
• displaying, by the assessment module, the attributes on the first communication device to receive a second set of inputs corresponding to the attributes;
• displaying, by the assessment module, the metrics on a second communication device associated with a second user and receiving a third set of inputs corresponding to the metrics;
• computing, by a first summation unit, summation of the second set of inputs;
• converting, by a second summation unit, the third set of inputs into a set of scores based on the scale corresponding to each of the metric using the pre-determined conversion rules;
• extracting, by the second summation unit, the value for each of the response in first set of responses based on the fourth look up table;
• aggregating, by the second summation unit, the extracted values and the set of scores to generate the first aggregated value;
• aggregating, by the second summation unit, the first aggregated value with the summation of second set of inputs to compute a score value; and
• identifying, by an identifier, the engagement level of the client based on the computed score value and the third look up table.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWING
A system and method to evaluate client engagement of the present disclosure will now be described with the help of the accompanying drawing, in which:
Figure 1 illustrates a block diagram of a system to evaluate client engagement; and
Figures 2a, 2b and 2c illustrate a flow diagram of a method for evaluating client engagement.
LIST OF REFERENCE NUMERALS
100 system
102 repository
104 assessment module
106 first summation unit
108 second summation unit
110 identifier
112 first communication device
114 second communication device
116 first crawler and extractor
118 analyser
120 second crawler and extractor
122 third crawler and extractor
124 fourth crawler and extractor
126 conversion unit
128 fifth crawler and extractor
130 first aggregator
132 second aggregator
134 communication unit
136 NLP module
DETAILED DESCRIPTION
Embodiments, of the present disclosure, will now be described with reference to the accompanying drawing.
Embodiments are provided so as to thoroughly and fully convey the scope of the present disclosure to the person skilled in the art. Numerous details, are set forth, relating to specific components, and methods, to provide a complete understanding of embodiments of the present disclosure. It will be apparent to the person skilled in the art that the details provided in the embodiments should not be construed to limit the scope of the present disclosure. In some embodiments, well-known processes, well-known apparatus structures, and well-known techniques are not described in detail.
The terminology used in the present disclosure is only for the purpose of explaining a particular embodiment, and such terminology shall not be considered to limit the scope of the present disclosure. As used in the present disclosure, the forms "a", "an", and "the" may be intended to include the plural forms as well, unless the context clearly suggests otherwise. The terms "comprises", "comprising", "including", and "having" are open ended transitional phrases and therefore specify the presence of stated features, elements, modules, units and/or components, but do not forbid the presence or addition of one or more other features, elements, components, and/or groups thereof. The particular order of steps disclosed in the method of the present disclosure is not to be construed as necessarily requiring their performance as described or illustrated. It is also to be understood that additional or alternative steps may be employed.
A system and method to evaluate client engagement of the present disclosure, is described with reference to Figure 1 through Figure 2c.
The present disclosure envisages a system to evaluate client engagement, i.e., to evaluate the relationship of a company with a client, which is referred to as an "engagement". Further, the system of the present disclosure generates three results, related to business, client and people. In relation to the client, an engagement level associated with each client is generated as an outcome that depicts the level of growth of the engagement. The results are displayed on a display unit for managers and senior level managers to assess.
Referring to Figure 1, the system to evaluate client engagement (hereinafter referred as “system”) (100) comprises a repository (102), an assessment module (104), a first summation unit (106), a second summation unit (108) and an identifier (110).
The repository (102) is configured to store pre-determined conversion rules, a first look up table having a list of prospective questions and at least one user ID corresponding to each question, a second look up table having a list of metrics and a scale corresponding to each metric, a third look up table having a list of engagement levels and a range of values corresponding to each engagement level, a fourth look up table having a list of responses and a value corresponding to each response, and a fifth look up table having a list of parameters and a plurality of attributes corresponding to each parameter.
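By way of a non-limiting illustration, the five look up tables may be realised as simple keyed data structures. The Python sketch below is offered only to make the repository contents concrete; the field names, sample questions and sample user IDs are assumptions for explanation and are not prescribed by this disclosure (the engagement levels, response values, metrics and attributes shown are taken from the tables and examples that follow).

```python
# Illustrative sketch of the repository (102) contents; names and sample
# entries are assumptions used only to explain the five look up tables.
repository = {
    # First look up table: prospective questions and the user IDs to whom
    # each question may be shown (questions and IDs are assumed examples).
    "first_lut": [
        {"question": "Are the contractual milestones being met?", "user_ids": ["U101"]},
        {"question": "Is the client satisfied with delivery quality?", "user_ids": ["U101", "U102"]},
    ],
    # Second look up table: metrics and the scale corresponding to each metric.
    "second_lut": {"CSAT": (1, 5), "SLA%": (0, 100), "Span Control": (1, 6)},
    # Third look up table: engagement levels and the score range for each level.
    "third_lut": [("Nascent", 0, 90), ("Nurture", 91, 210), ("Blossom", 211, 360),
                  ("Radiate", 361, 510), ("Mature", 511, 540)],
    # Fourth look up table: possible responses and the value for each response.
    "fourth_lut": {"strongly agree": 1, "agree": 2, "neutral": 5,
                   "disagree": 8, "strongly disagree": 10},
    # Fifth look up table: parameters and the attributes under each parameter.
    "fifth_lut": {"Applicability and Usefulness": ["Focus and Applicability",
                                                   "Authenticity", "Classification"]},
}
```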
In an embodiment, an NLP module (136) is configured to generate the prospective questions based on a pre-stored set of periodically updated keywords and a set of pre-determined natural language processing rules that are pre-stored in the repository (102).
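The disclosure does not prescribe particular natural language processing rules. The minimal sketch below merely illustrates how periodically updated keywords could be combined with simple rule templates to produce prospective questions; the keyword list and templates are assumptions, and the actual rules pre-stored in the repository (102) may be far richer.

```python
# Assumed sketch of keyword-driven question generation by the NLP module (136).
KEYWORDS = ["contract renewal", "delivery quality", "stakeholder communication"]  # assumed
TEMPLATES = [
    "How satisfied is the client with {kw}?",
    "Has {kw} improved over the last quarter?",
]  # assumed rule templates

def generate_prospective_questions(keywords=KEYWORDS, templates=TEMPLATES):
    """Expand every keyword into every template to build prospective questions."""
    return [t.format(kw=kw) for kw in keywords for t in templates]
```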
In another embodiment, the questions can be selected based on the role of the user, the region they belong to, and the industry they work in.
In an embodiment, the first and the third set of inputs pertain to a set of parameters selected from the group corresponding to contractual factors, strategical factors, tactical factors and operational factors, and the second set of inputs pertain to the parameters, based on the fifth look up table, selected from the group corresponding to factors based on business and customer, and factors corresponding to people.
Table 1 illustrates the fifth look up table.
| Parameters | Attributes |
|---|---|
| Applicability and Usefulness | Focus and Applicability |
| | Authenticity |
| | Classification |
| | Assessment Trends |
| | Goals setting |
| | Benchmarking |
| | Assurance |

(Table 1)
The assessment module (104) is configured to cooperate with the repository (102) to:
• display a set of questions from the prospective questions corresponding to a user ID of a first user associated with a first communication device (112), and is further configured to receive a first set of responses for the set of questions, wherein a question from the set of questions is selected based on analysis of historical responses provided by the user for displayed questions;
• display the attributes on the first communication device (112) to receive a second set of inputs corresponding to the attributes based on the fifth look up table; and
• display the metrics on a second communication device (114) associated with a second user and receive a third set of inputs corresponding to the metrics based on the second look up table.
In an embodiment, the first user and second user are managers.
The assessment module (104) includes a first crawler and extractor (116), an analyser (118), a second crawler and extractor (120) and a third crawler and extractor (122).
The first crawler and extractor (116) is configured to crawl through the first look up table to extract the question corresponding to the user ID. The analyser (118) is configured to cooperate with the first communication device (112) to receive the response, and is further configured to cooperate with the first crawler and extractor (116) to analyse the at least one historical response corresponding to the displayed questions and select the subsequent question based on the first look up table and the analysis. The second crawler and extractor (120) is configured to crawl through the second look up table to extract the metrics. The third crawler and extractor (122) is configured to crawl through the fifth look up table to extract the attributes corresponding to the parameters.
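As one possible reading of the analyser (118), the sketch below selects the subsequent question for a user by skipping questions for which a historical response already exists, so that questions are not repeated. The data shapes are assumptions for illustration only; the disclosure only requires that the subsequent question be selected from the first look up table based on analysis of the historical responses.

```python
# Illustrative sketch of the analyser (118): pick the next question for a user,
# skipping any question the user has already answered. Data shapes are assumed.
def select_subsequent_question(first_lut, user_id, historical_responses):
    """first_lut: list of {"question": str, "user_ids": [str]} entries.
    historical_responses: dict mapping already-asked question text -> response."""
    for entry in first_lut:
        if user_id in entry["user_ids"] and entry["question"] not in historical_responses:
            return entry["question"]
    return None  # no unanswered question remains for this user
```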
In an embodiment, a question can be re-framed and asked again to check whether the user has understood the question correctly and to gauge the degree of correctness of the user's response.
In an embodiment, the assessment module (104) performs the assessment based on natural language processing, which identifies contextual patterns for fine tuning and selecting the questions.
In an embodiment, the system (100) includes a communication unit (134) configured to cooperate with the first crawler and extractor (116), the second crawler and extractor (120) and the third crawler and extractor (122) to:
• transmit the extracted questions and the subsequent questions, and the extracted attributes to the first communication device (112);
• transmit the extracted metrics to the second communication device (114); and
• receive the first set of responses, the second set of inputs and the third set of inputs.
The first summation unit (106) is configured to cooperate with the assessment module (104) to compute a summation of the second set of inputs.
The second summation unit (108) is configured to cooperate with the assessment module (104), the repository (102) and the first summation unit (106) to:
• convert the third set of inputs into a set of scores based on the scale corresponding to each of the metric using the pre-determined conversion rules;
• extract the value for each of the response in first set of responses based on the fourth look up table;
• aggregate the extracted values and the set of scores to generate the first aggregated value; and
• aggregate the first aggregated value with the summation of second set of inputs to compute a score value.
The second summation unit (108) includes a fourth crawler and extractor (124), a conversion unit (126), a fifth crawler and extractor (128), a first aggregator (130) and a second aggregator (132).
The fourth crawler and extractor (124) is configured to crawl through the second look up table to extract the scale corresponding to the metric. The conversion unit (126) is configured to cooperate with the fourth crawler and extractor (124) to convert the third set of inputs into the set of scores based on the scale and the pre-determined conversion rules. The fifth crawler and extractor (128) is configured to crawl through the fourth look up table to extract the value for each of the response in first set of responses. The first aggregator (130) is configured to cooperate with the conversion unit (126) and the fifth crawler and extractor (128) to aggregate the extracted values and the set of scores to generate the first aggregated value.
Table 2 illustrates the conversion of the third set of inputs into the set of scores.
| Metrics | Third set of inputs | Scale | Converted score |
|---|---|---|---|
| CSAT | 4.4 | 1-5 | 88 |
| SLA% | 90% | 100% | 90 |
| Actual Margin | 40% | 70% | 57 |
| Span Control | 3 | 1-6 | 50 |
| Attrition% | 10% | 0-20% | 90 |

(Table 2)
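One plausible conversion rule, which reproduces the CSAT, SLA%, Actual Margin and Span Control rows of Table 2, scales each input against the upper bound of its metric's scale. The sketch below is an assumption and not the prescribed rule: the pre-determined conversion rules stored in the repository (102) govern the actual conversion, and inverted metrics such as Attrition%, where a lower value is better, would need a different rule.

```python
# Assumed sketch of the conversion unit (126): scale a raw metric input to a
# score out of 100 using the upper bound of its scale.
def convert_to_score(raw_value, scale_max):
    return round(100.0 * raw_value / scale_max)

print(convert_to_score(4.4, 5))    # CSAT          -> 88
print(convert_to_score(90, 100))   # SLA%          -> 90
print(convert_to_score(40, 70))    # Actual Margin -> 57
print(convert_to_score(3, 6))      # Span Control  -> 50
```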
Table 3 illustrates the generation of the first aggregated value.

| Parameters | Attribute | Extracted values | Set of scores | Average |
|---|---|---|---|---|
| Method | Robustness | 38 | 75 | 56.5 |
| | Holistic | 10 | 50 | 30 |
| | | | average value | 43.3 |
| Execution | Deployment | 35 | 80 | 57.5 |
| | Systematic | 55 | 75 | 65 |
| | | | average value | 61.3 |
| Appraise and Improve | Measures and Metrics | 38 | 60 | 49 |
| | Understand and Explore | 45 | 47 | 46 |
| | Enhancement and Innovation | 45 | 75 | 60 |
| | | | average value | 51.7 |
| | | | first aggregated value | 52.1 |

(Table 3)
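The figures in Table 3 can be reproduced by averaging each attribute's extracted value with its score, averaging those results within each parameter, and then averaging the parameter averages to obtain the first aggregated value. The sketch below assumes this simple equal-weight, mean-of-means reading; the first aggregator (130) could equally apply weighted aggregation rules.

```python
# Sketch of the first aggregator (130) reproducing Table 3: per-attribute average
# of (extracted value, score), then per-parameter average, then the overall
# first aggregated value. Equal-weight averaging is an assumption.
table3 = {
    "Method": [(38, 75), (10, 50)],
    "Execution": [(35, 80), (55, 75)],
    "Appraise and Improve": [(38, 60), (45, 47), (45, 75)],
}

def first_aggregated_value(data):
    parameter_averages = []
    for attributes in data.values():
        attribute_avgs = [(extracted + score) / 2 for extracted, score in attributes]
        parameter_averages.append(sum(attribute_avgs) / len(attribute_avgs))
    return round(sum(parameter_averages) / len(parameter_averages), 1)

print(first_aggregated_value(table3))  # -> 52.1, as in Table 3
```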
The second aggregator (132) is configured to cooperate with the first aggregator (130) and the first summation unit (106) to aggregate the first aggregated value with the summation of the second set of inputs to compute the score value.
In an embodiment, the pre-determined conversion rules are based on machine learning and big data analysis.
In an embodiment, the second summation unit (108), while performing the conversion, removes at least one outlier so that discrepancies in the received inputs do not distort the analysis.
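The disclosure does not state which outlier rule is applied. As one common possibility, the sketch below drops scores outside the inter-quartile-range fences before aggregation; this is an assumption offered only to make the outlier-removal step concrete.

```python
# Assumed outlier-removal sketch (IQR fences); the actual rule used by the
# second summation unit (108) is left open by the disclosure.
import statistics

def remove_outliers(scores):
    q1, _, q3 = statistics.quantiles(scores, n=4)  # lower and upper quartiles
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [s for s in scores if low <= s <= high]
```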
The identifier (110) is configured to cooperate with the second summation unit (108) and the repository (102) to identify the engagement level of the client based on the computed score value and the third look up table.
Table 4 illustrates the third look up table used to compute the engagement level.

| Engagement Level | Range of scores |
|---|---|
| Nascent | >= 0 & <= 90 |
| Nurture | >= 91 & <= 210 |
| Blossom | >= 211 & <= 360 |
| Radiate | >= 361 & <= 510 |
| Mature | >= 511 & <= 540 |

(Table 4)
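Given the third look up table of Table 4, the identifier (110) can resolve the engagement level with a simple range check, as in the sketch below; the list-of-tuples layout is an assumption.

```python
# Sketch of the identifier (110): map the computed score value to an engagement
# level using the ranges of Table 4.
THIRD_LUT = [("Nascent", 0, 90), ("Nurture", 91, 210), ("Blossom", 211, 360),
             ("Radiate", 361, 510), ("Mature", 511, 540)]

def identify_engagement_level(score_value, third_lut=THIRD_LUT):
    for level, low, high in third_lut:
        if low <= score_value <= high:
            return level
    return None  # score outside all configured ranges

print(identify_engagement_level(375))  # -> "Radiate"
```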
The assessment module (104), the first summation unit (106), the second summation unit (108), and the identifier (110) are implemented using one or more processor(s).
The processors may be general-purpose processors, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), and/or the like. The processors may be configured to retrieve data from and/or write data to a memory/repository. The memory/repository can be for example, a random access memory (RAM), a memory buffer, a hard drive, a database, an erasable programmable read only memory (EPROM), an electrically erasable programmable read only memory (EEPROM), a read only memory (ROM), a flash memory, a hard disk, a floppy disk, cloud storage, and/or so forth.
In an operative embodiment, a set of questions from the prospective questions is displayed on a first communication device (112) associated with a first user. The set of questions is displayed based on the user ID. Further, the prospective questions are generated based on a pre-stored set of periodically updated keywords and a set of pre-determined natural language processing rules that are pre-stored in the repository (102). The subsequent question from the set of questions is selected based on analysis of the first set of responses received for the displayed questions, which ensures that questions are not repeated and that each subsequent question is chosen on the basis of analysis of the previous responses. Further, a second set of inputs corresponding to the attributes of at least one stored parameter is received from the first user through the first communication device (112). A third set of inputs corresponding to the metrics displayed on a second communication device (114) associated with a second user is also received. The third set of inputs can be in different formats, such as whole numbers, percentages, and decimals, so the third set of inputs is converted to a common format based on the scale provided in the repository (102) and a set of scores is generated. The summation of the second set of inputs is computed. The values for the first set of responses are extracted from the repository (102). The responses could be, for example, strongly agree, agree, neutral, disagree and strongly disagree, with corresponding values of, for example, 1, 2, 5, 8 and 10. The extracted values are aggregated with the set of scores to generate a first aggregated value. The first aggregated value is aggregated with the summation of the second set of inputs to compute a score value. The engagement level of the client is identified based on the computed score value.
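Tying the operative embodiment together, the sketch below shows how the first set of responses could be valued through the fourth look up table and combined with the converted metric scores and the summation of the second set of inputs to give the score value passed to the identifier (110). The response values (1, 2, 5, 8, 10) follow the example above; the sample inputs and the plain-sum aggregation are assumptions made only to show the data flow, since the disclosure leaves the aggregation function to the pre-determined rules (Table 3, for instance, uses averaging).

```python
# Illustrative end-to-end computation of the score value; sample inputs and the
# plain-sum aggregation are assumptions used only to show the data flow.
FOURTH_LUT = {"strongly agree": 1, "agree": 2, "neutral": 5,
              "disagree": 8, "strongly disagree": 10}

def score_value(first_set_responses, metric_scores, second_set_inputs):
    # Extract the value for each response in the first set of responses.
    extracted = [FOURTH_LUT[r] for r in first_set_responses]
    # Aggregate the extracted values with the set of scores -> first aggregated value.
    first_aggregated = sum(extracted) + sum(metric_scores)
    # Aggregate with the summation of the second set of inputs -> score value.
    return first_aggregated + sum(second_set_inputs)

print(score_value(["agree", "neutral"], [88, 90, 57], [40, 55]))  # -> 337
```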
Figures 2a, 2b and 2c illustrate a flow diagram of a method to evaluate client engagement. The method includes the following steps:
• Step 202: storing, by a repository (102), pre-determined conversion rules, a first look up table having a list of prospective questions and at least one user ID corresponding to each question, a second look up table having a list of metrics and a scale corresponding to each metric, a third look up table having a list of engagement levels and a range of values corresponding to each engagement level, a fourth look up table having a list of responses and a value corresponding to each response, and a fifth look up table having a list of parameters and a plurality of attributes corresponding to each parameter;
• Step 204: displaying, by an assessment module (104), a set of questions from the prospective questions corresponding to a user ID of a first user associated with a first communication device (112);
• Step 206: receiving, by the assessment module (104), a first set of responses for the set of questions, wherein a question from the set of questions is selected based on analysis of historical responses provided by the user for displayed questions;
• Step 208: displaying, by the assessment module (104), the attributes on the first communication device (112) to receive a second set of inputs corresponding to the attributes;
• Step 210: displaying, by the assessment module (104), the metrics on a second communication device (114) associated with a second user and receiving a third set of inputs corresponding to the metrics;
• Step 212: computing, by a first summation unit (106), summation of the second set of inputs;
• Step 214: converting, by a second summation unit (108), the third set of inputs into a set of scores based on the scale corresponding to each of the metric using the pre-determined conversion rules;
• Step 216: extracting, by the second summation unit (108), the value for each of the response in first set of responses based on the fourth look up table;
• Step 218: aggregating, by the second summation unit (108), the extracted values and the set of scores to generate the first aggregated value;
• Step 220: aggregating, by the second summation unit (108), the first aggregated value with the summation of second set of inputs to compute a score value; and
• Step 222: identifying, by an identifier (110), the engagement level of the client based on the computed score value and the third look up table.
The foregoing description of the embodiments has been provided for purposes of illustration and is not intended to limit the scope of the present disclosure. Individual components of a particular embodiment are generally not limited to that particular embodiment but are interchangeable. Such variations are not to be regarded as a departure from the present disclosure, and all such modifications are considered to be within the scope of the present disclosure.
TECHNICAL ADVANCEMENTS
The present disclosure described herein above has several technical advantages including, but not limited to, the realization of, a system and method to evaluate client engagement, that:
• offers different parameters for the evaluation of the engagement; and
• provides different questions to different users, and analyse the received responses to ask the subsequent question as to not repeat the same question.
The foregoing disclosure has been described with reference to the accompanying embodiments which do not limit the scope and ambit of the disclosure. The description provided is purely by way of example and illustration.
The embodiments herein and the various features and advantageous details thereof are explained with reference to the non-limiting embodiments in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
The foregoing description of the specific embodiments so fully reveals the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept, and, therefore, such adaptations and modifications should be and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
Throughout this specification the word “comprise”, or variations such as “comprises” or “comprising”, will be understood to imply the inclusion of a stated element, step, or group of elements, steps, but not the exclusion of any other element, or step, or group of elements, or steps.
The use of the expression “at least” or “at least one” suggests the use of one or more elements or ingredients or quantities, as the use may be in the embodiment of the disclosure to achieve one or more of the desired objects or results.
While considerable emphasis has been placed herein on the components and component parts of the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the disclosure. These and other changes in the preferred embodiment as well as other embodiments of the disclosure will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the disclosure and not as a limitation.
CLAIMS
WE CLAIM:
1. A system (100) for evaluating client engagement, said system (100) comprises:
a repository (102) configured to store pre-determined conversion rules, a first look up table having a list of prospective questions and at least one user ID corresponding to said question, a second look up table having a list of metrics and a scale corresponding to each of said metric, a third look up table having a list of engagement levels and a range of values corresponding to each engagement level, a fourth look up table having a list of responses and a value corresponding to each of said response and a fifth look up table having a list of parameters and a plurality of attributes corresponding to each of said parameter;
an assessment module (104) configured to cooperate with said repository (102) to:
• display a set of questions from said prospective questions corresponding to a user ID of a first user associated with a first communication device (112), and further configured to receive a first set of responses for said set of questions, wherein a question from said set of questions is selected based on analysis of historical responses provided by said user for displayed questions;
• display said attributes on said first communication device (112) to receive a second set of inputs corresponding to said attributes based on said fifth look up table; and
• display said metrics on a second communication device (114) associated with a second user and receive a third set of inputs corresponding to said metrics based on said second look up table,
a first summation unit (106) configured to cooperate with said assessment module (104) to compute summation of said second set of inputs;
a second summation unit (108) configured to cooperate with said assessment module (104), said repository (102) and said first summation unit (106) to:
• convert said third set of inputs into a set of scores based on said scale corresponding to each of said metric using said pre-determined conversion rules;
• extract said value for each of said response in first set of responses based on said fourth look up table;
• aggregate said extracted values and said set of scores to generate said first aggregated value; and
• aggregate said first aggregated value with said summation of second set of inputs to compute a score value,
an identifier (110) configured to cooperate with said second summation unit (108) and said repository (102) to identify the engagement level of said client based on said computed score value and said third look up table,
wherein said assessment module (104), said first summation unit (106), said second summation unit (108), and said identifier (110) are implemented using one or more processor(s).
2. The system (100) as claimed in claim 1, wherein said first and said third set of inputs pertain to a set of parameters selected from the group corresponding to contractual factors, strategical factors, tactical factors and operational factors, and said second set of inputs pertain to said parameters, based on said fifth look up table, selected from the group corresponding to factors based on business and customer, and factors corresponding to people.
3. The system (100) as claimed in claim 1, wherein said assessment module (104) includes:
• a first crawler and extractor (116) configured to crawl through said first look up table to extract said question corresponding to said user ID;
• an analyser (118) configured to cooperate with said first communication device (112) to receive said response, and further configured to cooperate with said first crawler and extractor (116) to analyse said at least one historical response corresponding to said displayed questions and select said subsequent question based on said first look up table and said analysis; and
• a second crawler and extractor (120) configured to crawl through said second look up table to extract said metrics; and
• a third crawler and extractor (122) configured to crawl through said fifth look up table to extract said attributes corresponding to said parameters.
4. The system (100) as claimed in claim 3, which includes a communication unit (134) configured to cooperate with said first crawler and extractor (116), said second crawler and extractor (120) and said third crawler and extractor (122) to:
• transmit said extracted questions and said subsequent questions, and said extracted attributes to said first communication device (112);
• transmit said extracted metrics to said second communication device (114); and
• receive said first set of responses, said second set of inputs and said third set of inputs.
5. The system (100) as claimed in claim 1, wherein said second summation unit (108) includes:
• a fourth crawler and extractor (124) configured to crawl through said second look up table to extract said scale corresponding to said metric;
• a conversion unit (126) configured to cooperate with said fourth crawler and extractor (124) to convert said third set of inputs into said set of scores based on said scale and said pre-determined conversion rules;
• a fifth crawler and extractor (128) configured to crawl through said fourth look up table to extract said value for each of said response in first set of responses;
• a first aggregator (130) configured to cooperate with said conversion unit (126) and said fifth crawler and extractor (128) to aggregate said extracted values and said set of scores to generate said first aggregated value; and
• a second aggregator (132) configured to cooperate with said first aggregator (130) and said first summation unit (106) to aggregate said first aggregated value with said summation of said second set of inputs to compute said score value.
6. The system (100) as claimed in claim 1, wherein said assessment module (104) performs said assessment based on natural language processing which identifies contextual patterns for fine tuning and selecting said questions.
7. The system (100) as claimed in claim 1, wherein said second summation unit (108) performs conversion to remove at least one outlier to produce an accurate analysis in case of discrepancies.
8. The system (100) as claimed in claim 1, wherein said pre-determined conversion rules are based on machine learning and big data analysis.
9. The system (100) as claimed in claim 1, which includes a NLP module (136) wherein said NLP module (136) is configured to generate said prospective questions based on a pre-stored set of periodically updated keywords and a set of pre-determined natural language processing rules that are pre-stored in said repository (102).
10. A method to evaluate client engagement, said method comprises the steps of:
• storing (202), by a repository (102), pre-determined conversion rules, a first look up table having a list of prospective questions and at least one user ID corresponding to said question, a second look up table having a list of metrics and a scale corresponding to each of said metric, a third look up table having a list of engagement levels and a range of values corresponding to each engagement level, a fourth look up table having a list of responses and a value corresponding to each of said response and a fifth look up table having a list of parameters and a plurality of attributes corresponding to each of said parameter;
• displaying (204), by an assessment module (104), a set of questions from said prospective questions corresponding to a user ID of a first user associated with a first communication device (112);
• receiving (206), by said assessment module (104), a first set of responses for said set of questions, wherein a question from said set of questions is selected based on analysis of historical responses provided by said user for displayed questions;
• displaying (208), by said assessment module (104), said attributes on said first communication device (112) to receive a second set of inputs corresponding to said attributes;
• displaying (210), by said assessment module (104), said metrics on a second communication device (114) associated with a second user and receiving a third set of inputs corresponding to said metrics;
• computing (212), by a first summation unit (106), summation of said second set of inputs;
• converting (214), by a second summation unit (108), said third set of inputs into a set of scores based on said scale corresponding to each of said metric using said pre-determined conversion rules;
• extracting (216), by said second summation unit (108), said value for each of said response in first set of responses based on said fourth look up table;
• aggregating (218), by said second summation unit (108), said extracted values and said set of scores to generate said first aggregated value;
• aggregating (220), by said second summation unit (108), said first aggregated value with said summation of second set of inputs to compute a score value; and
• identifying (222), by an identifier (110), the engagement level of said client based on said computed score value and said third look up table.
Dated this 22nd day of October, 2019
_______________________________
MOHAN RAJKUMAR DEWAN, IN/PA – 25
of R.K.DEWAN & CO.
Authorized Agent of Applicant
TO,
THE CONTROLLER OF PATENTS
THE PATENT OFFICE, AT MUMBAI