Abstract: Systems and methods for automatically evaluating performance in an organization are disclosed. In one implementation, an assessment system captures performance data and the corresponding profiles of candidates to be evaluated. The performance data is based on test scores, psychometric variables, and professional behavior indexes. After cleaning and preprocessing the performance data, the system mines for rules, patterns, and knowledge in the data to generate analytics. The analytics are used for predicting or projecting the performance metrics of the candidates. In addition, the system builds a behavioral model from video data including gestures, expressions, and emotions. The behavioral model provides additional validation to the projected performance metrics. These projected performance metrics can be used as decision support parameters by an evaluator.
TECHNICAL FIELD
The present subject matter, in general, relates to a performance assessment system for an organization and, in particular, relates to performance testing and analysis over a period.
BACKGROUND
Performance evaluation based on specific parameters is required in almost every organization, such as an office, school, college, or training center. For example, competency evaluation of a candidate at the time of employment is imperative for any organization. However, typical recruitment processes for hiring human resources are usually susceptible to human errors and biases that could lead to the hiring of an unskilled or incompetent workforce. Hence, organizations typically place new recruits on probation for a few months. During the probation period, the new recruits are trained and their performance is monitored and evaluated. Based on the performance evaluation, the employment of the new recruits may be confirmed or terminated.
In addition, organizations need to evaluate the performance of their employees on a regular basis. Such performance evaluation process is usually linked to appraisals and/or promotions of an employee. Most organizations depend only on the analysis done by their human resource (HR) personnel for recruiting people and monitoring their performance over a period of time.
Generally, the HR personnel of an organization keep a record of the performance of the new recruits and employees. The HR personnel maintain and update the record by collecting data from different sources, such as various training centers, over a period. The collected data helps in monitoring and evaluation of the performance. The process of performance monitoring and evaluation for confirmation of employment, or for determining and enabling the career path of employees, tends to get cumbersome for the HR personnel, especially as an organization grows in size. This is especially true in organizations that have a high employee turnover rate. Further, since the process of performance monitoring and evaluation depends mainly on assessment by individuals, it is slow and subject to human errors and biases.
Therefore, there is a need for a performance assessment system that takes a comprehensive approach, with increased objectivity and minimal assessment by individuals, for reliable and quick performance assessment of new recruits and employees.
SUMMARY
This summary is provided to introduce systems and methods for evaluation of the performance of a person, which is further described below in the Detailed Description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter.
In one embodiment, an assessment system evaluates the performance of a person, such as a candidate, for various purposes including hiring, confirmation, salary appraisals, etc., with the help of one or more evaluation tools and related parameters. The performance-related data obtained for the candidate according to these parameters is processed to generate performance indices. These performance indices, along with profile data of the candidate, are used to generate analytics, based upon which the candidate may be classified into one or more categories.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other features, aspects, and advantages of the subject matter will become better understood with regard to the following description, appended claims, and accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference number in different figures indicates similar or identical items.
Fig. 1 illustrates an exemplary assessment environment for performance evaluation of various candidates.
Fig. 2 illustrates an exemplary assessment system for performance evaluation of various candidates.
Fig. 3 illustrates the working of an exemplary examination module included in the exemplary assessment system.
Fig. 4 illustrates interactions between various modules of the exemplary assessment system.
Fig. 5 illustrates an exemplary method for carrying out real time assessment.
Fig. 6 illustrates an exemplary method for checking program codes in laboratory assessment.
Fig. 7 illustrates an exemplary assessment matrix.
Fig. 8 illustrates an exemplary interface for entering an evaluation pattern for a course.
Fig. 9 illustrates an exemplary method for evaluating performance of a candidate using an assessment system.
Fig. 10 illustrates exemplary performance plots.
DETAILED DESCRIPTION
Methods and systems for evaluating the performance of candidates such as new recruits or existing employees in any organization, using an assessment system, are described. Organizations such as offices, schools, colleges and the like routinely assess performance of various candidates such as new recruits, employees, students, trainees and so forth based on various pre-defined parameters. Such assessment may be used for different purposes, such as confirming employment of a new recruit, selecting a future leader from a group of candidates, monitoring the performance of the members in a group and so on. Evaluating and manually processing data of a large number of candidates can be a difficult task for the management and is also susceptible to factors like human errors and biases. Thus, a robust assessment system for performance evaluation that can minimize the above-mentioned errors and biases is required.
The assessment system disclosed here can capture and analyze a large amount of performance-related data of candidates. The performance-related data may generally include various test scores, test weights, psychometric variables, and other professional behavioral parameters. After processing the performance-related data in conjunction with other data such as profile data, the system mines for rules and knowledge in the data. Mining for rules and knowledge involves extracting relevant information out of the preprocessed data. The knowledge and rules are then used for predicting or projecting the performance metric of a person or a candidate. This projected performance metric can be used as a decision support parameter by an evaluator.
The following disclosure describes systems and methods for performance evaluation using an assessment system. While aspects of the described systems and methods can be implemented in any number of different computing systems, environments, and/or configurations, embodiments for the assessment system are described in the context of the following exemplary system(s) and method(s).
The following description is in reference to an assessment system implemented in a business organization for prospective employees undergoing a training session at the organization. However, the assessment system can be extended to any organization where performance of a number of people needs to be evaluated.
Fig. 1 illustrates an exemplary assessment environment 100 for performance evaluation of candidates, according to an embodiment of the present subject matter. Candidates 102-1 to 102-n may either be applicants for a job or existing employees/students in an organization. The candidates 102-1 to 102-n are collectively referred to as candidates 102 hereinafter. Performance related data of the candidates 102 can be collected regularly from a number of assessment centers 103 of the organization over a period. Assessment centers 103-1 to 103-n may be classrooms, laboratories, or any other training centers, and are collectively referred to as assessment centers 103 hereinafter. The assessment centers 103 may conduct a number of training sessions and evaluate the candidates 102 attending these training sessions with the help of evaluation tools 104.
The evaluation tools 104 can evaluate candidates 102 on various domains such as technology, professionalism, communication skills, soft skills, and so on. The evaluation tools 104 include a number of technical, psychometric, communication, and behavioral analysis tests, for example, real time assessment tests, assignments, weekly tests, evaluation laboratories, case studies, and so on. The evaluation tools 104 may be accessed by candidates 102 taking tests at the various assessment centers 103. The performance scores of various candidates 102 associated with the evaluation tools 104, and the corresponding course credits, are collectively referred to as evaluation data 105. Course credits, i.e., the weightages assigned to the various evaluation tools 104, can be based on a pre-defined course template. For each independent purpose, a different course template can be designed and assigned to each of the evaluation tools 104. The evaluation data 105 is provided to an assessment system 106 via a network 107. The assessment system 106 can be a server, implemented over the network 107, or a stand-alone device. The network 107 may be the Internet, a private network such as a LAN or WAN, or any other network.
The assessment system 106 captures the evaluation data 105 along with other data and processes the data. Based on the processed data, the assessment system 106 classifies each of the candidates 102 in various categories, such as good, potentially good, bad, and so on. Based on the classification, an evaluator 108 can take a decision regarding selection, promotion, etc., of a candidate 102.
Fig. 2 illustrates an exemplary assessment system 106. The assessment system 106 can include one or more processor(s) 202 coupled to a memory 204. The processor(s) 202 may include, for example, microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate data based on operational instructions. The processor(s) 202 can be configured to fetch and execute computer-program instructions stored in the memory 204. The memory 204 may include, for example, volatile memory (e.g., RAM) and non-volatile memory (e.g., ROM, Flash).
The memory 204 may also include program(s) 206 and data 208. The processor(s) 202 fetches and executes computer program instructions from the program(s) 206. The program(s) 206, for example, may include instructions to assess performance of the candidates 102 based on pre-defined parameters.
The program(s) 206 may further include an examination module 210, a mining module 212, a classification module 214, and other modules 216. Data 208, on the other hand, may include evaluation data 105 and other data 220.
The examination module 210 processes the evaluation data 105 and generates performance indices (described in Fig. 3) for each of the candidates 102. The mining module 212 mines out relevant information from the performance indices along with other data 220, such as educational background, and generates a knowledge base. The classification module 214 uses the knowledge base to classify the candidates 102. In addition, the classification module 214 can also prepare a behavioral model based on the gestures, expressions, and emotions of the candidates 102. The behavioral model can then be used to validate classifications made by the classification module 214. The other modules 216 may include an operating system, application programs, and so on.
Fig. 3 illustrates the working of the examination module 210 included in the assessment system 106. The examination module 210 assesses the performance of the candidates 102 based on the evaluation done by the evaluation tools 104. The evaluation tools 104 may include technical skill tools 302, professional skill tools 304, communication and soft skill tools 306, etc., that help in assessing the performance of the candidates 102.
The technical skill tools 302 can assess the technical skills of the candidates 102 using various methodologies such as real time assessments, assignments, weekly tests, laboratory (lab) assessments, case studies, and so on. A real time assessment of the candidates 102 can be carried out during a training session, such as a lecture, to track the listening, attention, and recall skills of the candidates 102. The real time assessment process is elaborated later in the description of Fig. 5.
Assignments, such as weekly assignments, help in assessing the continuity in learning of the candidates 102 even away from the training environment. The assignments can promote learning, as the candidates 102 will seek relevant information in books, on the Internet, etc., to excel in a particular subject or topic. Group assignments can pave the way for team building and sharing of information, which may not be possible in any other mode of assessment. Weekly tests based on the assignments can also be conducted in the form of a quiz, programming exercises in a computer lab, aptitude tests, and so on. The weekly tests ensure that the candidates 102 make steady progress in their learning and can boost their confidence to prepare them for higher levels of abstraction. This tests the retention power of the candidates 102, as opposed to the recall skills tested during real time assessment.
These and other tests can help the candidates 102 figure out how effectively or how well they have understood and absorbed crucial concepts and their applications. The tests may also help the evaluator 108 to evaluate and track the progress of the candidates 102 and help those who need extra attention and guidance.
Lab assessment is a way of testing the practical application skills of the candidates 102 in a business-like scenario. For example, lab assessment can test the programming skills of the candidates 102. Various programming tests can be formulated and run on computers assigned to the candidates 102. A programming lab assessment is described in detail later in the description of Fig. 6. In another example, lab assessment can include a sales management exercise. Case studies used in the assessment process can be real-life applications of medium complexity that simulate conditions of time and work pressure to test real-world levels of productivity of the candidates 102.
Professional skill tools 304 may include tests for the understanding of professionalism-related aspects including professional etiquette, corporate culture, customer expectations, work-life balance, and so on. Similarly, communication and soft skill tools 306 may include tests for efficiency in presentations, team interaction, leadership, and other soft skills.
The evaluation data 105 is fed to the examination module 210. The examination module 210 processes the evaluation data 105 to generate performance indices 308, such as technical indices, communication indices, and professional indices, for the candidates 102. Based on the performance indices 308, an assessment matrix can be built for each of the candidates 102. The assessment matrix is described in detail later in the description of Fig. 7.
Fig. 4 illustrates interaction of various modules of the assessment system 106. The working of the assessment system 106 includes interaction of the examination module 210, the mining module 212, and the classification module 214. As described earlier, the evaluation data 105 is processed by the examination module 210 to generate performance indices 308 such as technical indices, communication indices, and professional indices for the candidates 102. The examination module 210 also generates an assessment matrix based on aforementioned performance indices 308.
In one implementation, technical indices are based on parameters such as logical ability, problem solving skills, programming skills, and core technology skills of the candidates 102. Thus, the technical indices may be generated by the examination module 210 from the evaluation data 105 of the technical skill tools 302. Professional indices are based on ethics, discipline, attendance, and customer orientation of the candidates 102. Thus, the professional indices may be generated by the examination module 210 from the evaluation data 105 of the professional skill tools 304. Similarly, communication indices are based on listening skills, writing skills, team skills, and etiquettes of the candidates 102. Therefore, the communication indices may be generated by the examination module 210 from the evaluation data 105 of the communication skill tools 306. These performance indices 308 are fed to the mining module 212.
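By way of illustration, a minimal Python sketch of generating such indices follows, assuming each index is a weighted average of the corresponding tool scores. The score dictionaries, parameter names, and equal default weights are hypothetical; the patent does not specify the aggregation formula.

```python
# A minimal sketch, assuming hypothetical 0-100 tool scores and equal default
# weights; the actual aggregation used by the examination module is not specified.

def weighted_index(scores, weights=None):
    """Aggregate tool scores into a single index as a weighted average."""
    if weights is None:
        weights = {k: 1.0 for k in scores}
    total_weight = sum(weights[k] for k in scores)
    return sum(scores[k] * weights[k] for k in scores) / total_weight

# Hypothetical evaluation data for one candidate, keyed by the parameters above.
technical_scores = {"logical": 78, "problem_solving": 82, "programming": 90, "core_technology": 70}
communication_scores = {"listening": 85, "writing": 75, "team": 80, "etiquette": 88}
professional_scores = {"ethics": 92, "discipline": 81, "attendance": 95, "customer_orientation": 77}

performance_indices = {
    "technical": weighted_index(technical_scores),
    "communication": weighted_index(communication_scores),
    "professional": weighted_index(professional_scores),
}
print(performance_indices)
```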
In addition to the performance indices 308 obtained from the examination module 210, the mining module 212 also receives profile data 402, which is incorporated in other data 220 stored in data 208, of the candidates 102. The profile data 402 can be any document providing background information of the candidates 102 such as curriculum vitae, bio-data or any other profile related data. The mining module 212 mines the received data and generates a set of association rules, knowledge, and behavioral models. The mining module 212 can find correlations or patterns in the data contained in the performance indices 308 and the profile data 402.
Further, the mining module 212 generates analytics 404 across different pre-defined dimensions such as technical, communication, professional, etc. Generally, the analytics 404 depict trends in the performance of the candidates 102 and predict how a candidate may perform in the future. The future performance of the candidates 102 is predicted, for example, by selecting an optimal or realistic decision from a variety of available options based on existing data. Business managers, for example, usually make decisions based on experience or on other qualitative aspects of their decision-making ability; the analytics 404 supplement such judgment with data-driven projections. In one implementation, the mining module 212 generates analytics 404 in the form of rules, patterns, and regression based knowledge.
The rules can include classification rules and association rules. The classification rules can be generated through decision trees. The association rules can be obtained through conventional and fuzzy association rule mining processes. These processes generate rules associated with confidence and support values. The rules with very high confidence and support values are filtered for performance projection. For example, a rule can be: "if the performance of a candidate in an evaluation lab is excellent, then his or her programming ability is rated as high". The mining module 212 may automatically validate the rules using feedback from the evaluator 108.
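A minimal sketch of the filtering step follows, assuming a simple rule record with support and confidence fields. The example rules and the threshold values are illustrative assumptions, not part of the disclosure.

```python
# A minimal sketch of filtering mined rules by confidence and support thresholds.
# Rule structure, example rules, and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Rule:
    antecedent: str
    consequent: str
    support: float      # fraction of candidates the rule covers
    confidence: float   # fraction of covered candidates for whom it holds

mined_rules = [
    Rule("eval_lab == 'excellent'", "programming_ability = 'high'", 0.32, 0.94),
    Rule("attendance < 0.6", "professional_index = 'low'", 0.05, 0.71),
]

MIN_SUPPORT, MIN_CONFIDENCE = 0.10, 0.90

# Keep only the high-confidence, high-support rules for performance projection.
projection_rules = [r for r in mined_rules
                    if r.support >= MIN_SUPPORT and r.confidence >= MIN_CONFIDENCE]
for r in projection_rules:
    print(f"if {r.antecedent} then {r.consequent}")
```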
The patterns can include knowledge portraying the association or grouping of the candidates 102 based on certain attributes or characteristics. In an implementation, a pattern can be a cluster comprising a set of good performers from a particular institution or place. In another implementation, a pattern can associate candidates of a particular gender and background with excellent performance. In yet another implementation, a pattern can be a cluster of candidates belonging to a certain part of a country with a steep learning curve.
The patterns result in models through which the candidates 102 can be classified.
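One kind of pattern described above can be sketched as grouping candidates by an attribute and flagging high-performing groups. The field names, sample records, and the 75-point threshold below are illustrative assumptions.

```python
# A minimal sketch of surfacing a grouping pattern: cluster candidates by
# institution and flag groups whose mean CPM is high. All values hypothetical.

from collections import defaultdict

candidates = [
    {"id": "102-1", "institution": "Inst-A", "cpm": 88},
    {"id": "102-2", "institution": "Inst-A", "cpm": 91},
    {"id": "102-3", "institution": "Inst-B", "cpm": 64},
]

groups = defaultdict(list)
for c in candidates:
    groups[c["institution"]].append(c["cpm"])

for institution, cpms in groups.items():
    mean_cpm = sum(cpms) / len(cpms)
    if mean_cpm >= 75:   # assumed threshold for a "good performers" cluster
        print(f"pattern: good performers cluster at {institution} (mean CPM {mean_cpm:.1f})")
```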
The regression based knowledge can be extracted from statistical properties or distributions of various attributes or parameters characterizing the candidates 102. The regression based knowledge can also be extracted from time series data obtained from the daily performance metric and cumulative performance indices of the candidates 102. Regression models can be generated through parametric and non-parametric techniques known in the art. Regression models may generate a trend corresponding to various attributes or parameters, which may help to forecast the performance of the candidates 102. The mining module 212 may also use statistical and neural network models for generating regression based knowledge.
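A minimal sketch of regression on such a time series follows, using an ordinary least-squares line for illustration; the disclosure allows any parametric or non-parametric technique, and the sample DPM values are hypothetical.

```python
# A minimal sketch of regression-based knowledge: fit a linear trend to a
# candidate's daily performance metric (DPM) series and project it forward.

def linear_trend(values):
    """Least-squares fit y = a*x + b over days x = 0..n-1; returns (a, b)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

daily_pm = [62, 65, 63, 70, 72, 75, 74]          # hypothetical daily scores
slope, intercept = linear_trend(daily_pm)
forecast_day = len(daily_pm) - 1 + 7             # seven days after the last observation
print(f"trend slope {slope:.2f}/day; projected DPM {slope * forecast_day + intercept:.1f}")
```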
The analytics 404, generated in the form of rules, patterns and regression based knowledge, may include parameters such as profile vs. performance, psychometric test analysis vs. performance, homogeneous groups and the knowledge base of groups, communication skills vs. performance, professional index vs. performance association rule and many more. These analytics 404 can provide a knowledge base for assessing performance of the candidates 102 based on an assessment of the above-mentioned parameters. The analytics 404 are then fed into the classification module 214.
The classification module 214 uses the analytics 404 to classify the candidates 102 into categories such as good, potentially good, bad, etc. In one implementation, the classification module 214 also receives video data 406, which is incorporated in other data 220 stored in data 208, of each of the candidates 102. The video data 406 includes video captures of the candidates 102 taken over a period of time. The classification module 214 builds a behavioral model from the video data 406 based on gestures, expressions, and emotions of the candidates 102. The behavioral model provides additional validation to the classification produced by the classification module 214. This classification can be used to project a decision index 408 for each of the candidates 102, which can be used as a decision support parameter by the evaluator 108. The decision index 408 can be a measure of potential of the candidates 102 for a particular type of job.
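By way of illustration, a minimal sketch of this classification step follows. The category thresholds, the 0-1 scaling of the decision index 408, and the agreement check against the behavioral model are all assumptions; the disclosure does not specify these details.

```python
# A minimal sketch: map a projected score to a category and decision index,
# and validate it against a behavioral-model score. Thresholds are assumed.

def classify(projected_score, behavioral_score):
    """Classify a candidate and validate against the behavioral model."""
    if projected_score >= 80:
        category = "good"
    elif projected_score >= 60:
        category = "potentially good"
    else:
        category = "bad"
    decision_index = projected_score / 100.0
    # Behavioral model as a secondary check: flag large disagreement for review.
    validated = abs(projected_score - behavioral_score) <= 15
    return category, decision_index, validated

print(classify(projected_score=83, behavioral_score=79))
# -> ('good', 0.83, True)
```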
Fig. 5 illustrates a method for carrying out real-time assessment of candidates 102 by the technical skill tools 302. Real-time assessment can be carried out to track the listening and attention skills of the candidates 102. In one implementation, real-time assessment can be carried out with the help of a real-time assessment system that may include a response device, a receiver, one or more computing devices, and the evaluation tools 104.
At block 502, the real-time evaluation system receives an option as a response to a real time evaluation question such as a clicker question. For example, during a lecture, a trainer can ask questions to the candidates 102 either verbally or on a presentation device, and the candidates 102 can use the response device to answer the questions. The response device can have a plurality of buttons. The trainer can pose multiple-choice questions and the candidates 102 can press one or more of the buttons present on the response device corresponding to the answer option(s). Each response device can be associated with a unique ID. The unique ID can be associated with each of the candidates 102 to whom the response device is allotted.
The response device can send data associated with the selected button to the receiver. The response device can use any wired or wireless transmission medium such as conductor cable, optical fiber cable, infrared waves, Bluetooth and others to transmit the data to the receiver. In one implementation, the response device can include an infrared transmitter device that can send the data to an infrared enabled receiver using infrared transmission.
At block 504, the response is evaluated. In one implementation, the receiver can pass the response received from one or more response device(s) to the computing device. The computing device can assimilate the data received from the receiver along with the unique ID of each of the response device(s) that have sent the data.
At block 506, an output list is generated. In one implementation, the computing device can tabulate the received data and the corresponding unique ID(s), indicating the response of each of the candidates 102. This tabulated result can be the output list. The output list may further include statistical information such as the number of correct responses, topics on which correct responses were received, etc. This output list can be fed to the examination module 210. The examination module 210 can utilize this output list to generate assessment matrices for each of the candidates 102.
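A minimal sketch of the tabulation at block 506 follows, assuming responses arrive as (unique ID, selected option) pairs and that the answer key for the question is known; the data format is an illustrative assumption.

```python
# A minimal sketch of the output list at block 506: tabulate responses
# received from response devices, keyed by each device's unique ID.

responses = [                      # (unique_id, selected_option) from the receiver
    ("dev-01", "B"), ("dev-02", "C"), ("dev-03", "B"),
]
answer_key = "B"                   # correct option for this clicker question

output_list = [
    {"unique_id": uid, "response": opt, "correct": opt == answer_key}
    for uid, opt in responses
]
correct_count = sum(row["correct"] for row in output_list)
print(output_list)
print(f"{correct_count} of {len(output_list)} responses correct")
```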
In this manner, real time assessment can be used to maintain a record of performance/ learning ability of each of the candidates 102 throughout a lecture or a training session. A record can be maintained for the performance of the candidates 102 during all such sessions to obtain a thorough and clear report of their understanding of key technologies, concepts, etc.
Fig. 6 illustrates an exemplary lab assessment method to evaluate program codes using the technical skill tools 302. In programming lab assessment, candidates 102 can progress from playful interactions with software to more business-like assignments. The programming output of the candidates 102 can be constantly evaluated using automated code checkers. In one implementation, an automatic code checker is an auto-judging device that may be built on concepts from artificial intelligence.
At block 602, a program source code along with a programming language is fed in the automatic code checker. The program source code can be a programming code written by any of the candidates 102. The programming language can be any standard programming language such as C, C++, Java, and Pascal and so on.
At block 604, a user ID and a problem ID are fed to a lab assessment tool. In an implementation, each of the candidates 102 is associated with a unique identification number called the user ID. Similarly, each programming problem given to the candidates 102 can be assigned a unique identification number called the problem ID.
At block 606, the program source code is evaluated by the automatic code checker. In one implementation, the automatic code checker can process the program source code written by a candidate 102-1 and produce an evaluated result of the written code. The result can include a list of errors in the code and may suggest alternatives to improve the quality of the code. Thus, assignments carried out in a lab are given immediate, precise, objective feedback, which enables learners to prepare better for real-life case studies.
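A minimal sketch of one judging step follows: running a compiled submission against test cases and comparing its output with the expected output. The disclosure describes an AI-based checker; plain output comparison is shown here only as the simplest possible judge, and the executable path and test cases are hypothetical.

```python
# A minimal sketch of an automatic code checker's judging step: run a
# candidate's compiled program on test inputs and diff against expected output.

import subprocess

def judge(executable, test_cases, timeout=5):
    """Return a per-test verdict list for a compiled submission."""
    verdicts = []
    for stdin_data, expected in test_cases:
        try:
            run = subprocess.run([executable], input=stdin_data, timeout=timeout,
                                 capture_output=True, text=True)
            ok = run.returncode == 0 and run.stdout.strip() == expected.strip()
            verdicts.append("pass" if ok else "fail")
        except subprocess.TimeoutExpired:
            verdicts.append("timeout")
    return verdicts

# Hypothetical usage: ./solution was compiled from the candidate's source code.
# print(judge("./solution", [("2 3\n", "5\n"), ("10 -4\n", "6\n")]))
```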
Fig. 7 illustrates an exemplary assessment matrix 700 generated for each of the candidates 102. In an implementation, the examination module 210 generates the assessment matrix 700 for each of the candidates 102, say candidate 102-n, from the performance indices 308 formulated based on results received from the evaluation tools 104. The assessment matrix 700 indicates the performance of the candidate 102-n across various evaluation parameters. The assessment matrix 700 can be made available to the candidate 102-n and to the evaluator 108 as an index of the capability of the candidate 102-n.
The assessment matrix 700 provides an indication of the strengths and weaknesses of the candidate 102-n. The evaluator 108 can also take help from the assessment matrix 700 to assess the growth of the candidate 102-n. The assessment matrix 700 maps a series of parameters such as technical skills 302, communication and soft skills 306, attitude, confidence, team skills, and so on.
The assessment matrix 700 can be in the form of a table with any number of rows and columns. The assessment matrix 700 tabulates the scores obtained by the candidate 102-n when assessed by the evaluation tools 104. The performance of the candidate 102-n can be recorded on a periodic basis, such as daily, weekly, monthly or for any other period, in the assessment matrix 700. In one embodiment, the assessment matrix is built on a daily basis. The columns in the assessment matrix 700 that indicate the number of clicker questions attended and the number of clicker questions answered correctly are a measure of real-time assessment. Similarly, there can be various columns for weekly performance scores, lab assessment scores, case study scores, etc., in the assessment matrix 700. The assessment matrix 700 can also include columns such as daily performance metric (daily PM) and cumulative performance metric (CPM) indicating the daily performance and cumulative performance related scores of the candidate 102-n over a period.
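One daily row of such a matrix can be sketched as follows, with column names taken from the description above; the specific values are hypothetical.

```python
# A minimal sketch of one daily row of the assessment matrix (700) for
# candidate 102-n. Column names follow the description; values are hypothetical.

assessment_matrix = []   # one record per assessment day

assessment_matrix.append({
    "date": "2008-02-11",
    "clicker_attended": 12,     # clicker questions attended (real-time assessment)
    "clicker_correct": 9,       # clicker questions answered correctly
    "weekly_test": 78,
    "lab_assessment": 85,
    "case_study": 70,
    "daily_pm": 79.4,           # daily performance metric
    "cpm": 76.8,                # cumulative performance metric to date
})
```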
In one implementation, based on the assessment matrix 700, daily performance plots can be drawn for the candidate 102-n based on the daily PM and CPM. Performance plots can also be made for a group of candidates studying a common course. These performance plots can help in analyzing the performance of an individual or a group over a period of time. Such performance plots are elaborated in the description of Fig. 10.
The daily PM and CPM can be computed from the scores as a simple average, weighted average, moving average, or by any other data processing method, based on parameters provided by the people in charge of the tests (course in-charge) or the evaluator 108.
Fig. 8 illustrates an exemplary interface for a course in-charge to enter an evaluation pattern for a course. The number and the type of columns shown in the figure are intended to explain the concept at a basic level and do not limit actual implementation of such an interface.
Generally, each course is associated with a particular credit value. The evaluation pattern is defined by a number of tests of different types and the weightages associated with each of these tests. The course in-charge or the evaluator 108 can also be provided with another interface to enter a formula or equation for computing the daily performance metric (DPM) of the candidates 102 for a particular course. In one embodiment, the overall DPM is calculated as a weighted average across all the courses, and the CPM is calculated as the average of all DPMs over D days.
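A minimal sketch of that embodiment's computation follows: the overall DPM as a credit-weighted average across courses, and the CPM as the average of all DPMs over D days. Course names and credit values are hypothetical.

```python
# A minimal sketch of the DPM/CPM computation described above.

def overall_dpm(course_scores, course_credits):
    """Weighted average of a day's course scores, weighted by course credits."""
    total_credits = sum(course_credits[c] for c in course_scores)
    return sum(course_scores[c] * course_credits[c] for c in course_scores) / total_credits

def cpm(dpms):
    """Cumulative performance metric: average of all DPMs over D days."""
    return sum(dpms) / len(dpms)

credits = {"java": 4, "soft_skills": 2, "case_study": 3}   # hypothetical credits
day1 = overall_dpm({"java": 80, "soft_skills": 70, "case_study": 75}, credits)
day2 = overall_dpm({"java": 85, "soft_skills": 72, "case_study": 78}, credits)
print(f"DPM day 1: {day1:.1f}, CPM after 2 days: {cpm([day1, day2]):.1f}")
```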
Methods performed by systems described above for evaluating performance of a number of candidates 102 can be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types.
Fig. 9, illustrating a method 900 for evaluating performance of candidates 102, is described with reference to the system 100. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternate method.
At block 902, various evaluation tests are conducted for candidates 102 at various assessment centers 103. In one implementation, a candidate, say candidate 102-n, takes various tests provided by the evaluation tools 104. The candidate 102-n may be a person undergoing a training program in an organization or may be an employee of the organization undergoing a process of performance appraisal. The evaluation tools 104 evaluate the candidates 102 on various dimensions such as domain knowledge, professionalism, communication, and soft skills. The evaluation tools 104 may include a number of technical, psychometric, communication, and behavioral analysis test tools, for example, real time assessment tools, tools for assignments, weekly tests, lab assessments, case studies, and so on.
At block 904, scores obtained by the candidates 102 in the evaluation tests, along with course credits, are fed to the assessment system 106 as evaluation data 105. The evaluation data 105 is processed to generate performance indices 308 for the candidates. In one implementation, the examination module 210 processes the evaluation data 105 of the candidate 102-n, provided by the evaluation tools 104, to generate performance indices 308 for the candidate 102-n. The performance indices 308 generated by the examination module 210 include a technical index, a communication index, and a professional index of the candidates 102. The technical index may be based on logical, problem solving, programming, and core technology skills of the candidate 102-n. The communication index may be based on listening skills, writing skills, team skills, and etiquettes of the candidates 102. Likewise, the professional index may be based on factors such as ethics, discipline, attendance, and customer orientation of the candidates 102. Based on the performance indices 308, an assessment matrix, such as the matrix 700, can be prepared for each of the candidates 102.
At block 906, performance indices 308 and profile data 402 of the candidates 102 are processed to generate analytics 404 for the candidates 102. In one implementation, the mining module 212 processes the performance indices 308 and the profile data 402 of the candidates 102 to generate the analytics 404. The mining module 212 mines the input data from preprocessed information about the candidates 102 and generates a set of association rules, knowledge, and behavioral models. The mining module 212 filters the received data and obtains relevant details from it. The mining module 212 then generates the various analytics 404 based on the relevant details.
At block 908, the analytics 404 are processed along with the video data 406 to classify the candidates 102. In one implementation, the video data 406 includes video captures of the candidates 102 taken over the period of assessment. The classification module 214 builds a behavioral model from the video data 406 based on gestures, expressions, and emotions of the candidates 102. The classification module 214 also analyzes the rules and knowledge provided by the analytics 404 and classifies the candidates 102 as good, potentially good, or bad performers. The behavioral model provides additional validation to the classification.
Such classification, along with assessment details, can be provided in any manner, such as through a display, an e-mail notification, etc. The evaluator 108 can then take an informed decision regarding selection, promotion, etc., of the candidates 102. The classification and assessment details can also be provided to the candidates 102.
Fig. 10 illustrates exemplary performance assessment plots 1002 and 1008. Plot 1002 represents a performance index plot for a candidate, such as candidate 102-n. The plot 1002 shows a comparison between an alert line 1004 and performance 1006 of the candidate 102-n. The alert line 1004 represents the minimum acceptable performance, while the performance 1006 represents the actual performance of the candidate 102-n on various days, over a period of time over which the candidate 102-n was assessed. If the performance 1006 drops below the alert line 1004, the candidate 102-n may be assessed as being unsuitable for selection or promotion.
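A minimal sketch of plot 1002 follows, drawing a candidate's daily performance against a constant alert line using matplotlib; the daily values and the alert level are hypothetical.

```python
# A minimal sketch of the performance index plot 1002: daily performance 1006
# against the alert line 1004 (minimum acceptable performance). Values assumed.

import matplotlib.pyplot as plt

days = list(range(1, 11))
performance = [72, 68, 75, 63, 58, 66, 70, 74, 77, 80]   # performance 1006
alert_level = 60                                          # alert line 1004

plt.plot(days, performance, marker="o", label="performance 1006")
plt.axhline(alert_level, color="red", linestyle="--", label="alert line 1004")
plt.xlabel("assessment day")
plt.ylabel("performance index")
plt.legend()
plt.show()
```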
Plot 1008 represents a comparative plot of the performance of various candidates 102. Such a plot can be prepared for each day of the assessment process over the period of assessment, or for a cumulative performance comparison. These plots can help the evaluator 108 to monitor and choose from among the candidates 102.
Although embodiments of an assessment system have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations of the assessment system for evaluating performance.
I/We claim:
1. A system (106) for performance assessment, said system (106) comprising:
at least one processor (202);
a memory (204) coupled to said processor (202), said memory (204) including one or more modules comprising processor executable instructions;
an examination module (210) configured to process evaluation data (105) corresponding to performance in one or more tests and to generate performance indices (308), wherein said evaluation data (105) is provided by evaluation tools (104) over a network (107);
a mining module (212) configured to process said performance indices (308) and profile data (402) to generate analytics (404) in the form of one or more of rules, patterns and regression based knowledge; and
a classification module (214) configured to process said analytics (404) to generate a decision index (408) for performance assessment.
2. The system as claimed in claim 1, wherein said examination module (210) is further configured to process said evaluation data (105) based on course credits.
3. The system as claimed in claim 1, wherein said evaluation tools (104) are selected from a group comprising technical skill tools, professional skill tools, and communication skill tools.
4. The system as claimed in claim 1, wherein said tests include real time assessments, assignments, weekly tests, lab assessments, and case studies.
5. The system as claimed in claim 1, wherein said performance indices (308) are selected from a group consisting of technical indices, communication indices, and professional indices.
6. The system as claimed in claim 1, wherein said examination module (210) generates an assessment matrix (700) based on said performance indices (308).
7. The system as claimed in claim 1, wherein said classification module (214) is further configured to process video data (406) to generate a behavioral model.
8. The system as claimed in claim 7, wherein said classification module (214) validates said performance assessment using said behavioral model.
9. A method (900) for performance assessment, said method (900) comprising:
receiving evaluation data (105) at an examination module (210), wherein said evaluation data (105) corresponds to performance in one or more tests;
processing said evaluation data (105) by said examination module (210) to generate performance indices (308);
mining through said performance indices (308) and profile data (402) by a mining module (212) to generate analytics (404) in the form of one or more of rules, patterns and regression based knowledge; and
classifying the performance based on said analytics (404) by a classification module (214).
10. The method as claimed in claim 9, wherein said evaluation data is processed based on course credits.
11. The method as claimed in claim 9, wherein said performance indices (308) are selected from a group consisting of technical indices, communication indices, and professional indices.
12. The method as claimed in claim 9, wherein said processing comprises generating an assessment matrix (700) based on said performance indices (308).
13. The method as claimed in claim 9, wherein said mining comprises extracting elements from said performance indices (308) and said profile data (402) to generate said analytics (404).
14. The method as claimed in claim 9, wherein said classifying comprises processing video data (406) to analyze gestures, expressions, and emotions to generate a behavioral model and validate said classification.