An Automated System For Recruitment

Abstract: Systems and methods for automated recruitment of employees are disclosed. In one implementation, an automated recruitment system includes a test engine that generates a plurality of tests and evaluates applicants' responses for consistency based on parameters and metrics specific to diverse domains. The test engine provides an online application form to one or more applicants. The applicants undergo various tests, such as an aptitude test, an ability test, and a team skills test. The various tests are dynamically generated. The responses are evaluated to generate mental maps of the applicants. Applicants are shortlisted based on an analysis of the mental maps.

Patent Information

Application #:
Filing Date: 11 February 2008
Publication Number: 37/2009
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email:
Parent Application:

Applicants

TATA CONSULTANCY SERVICES
CHENNAI ONE-SEZ UNIT (ETL INFRASTRUCTURE SERVICES LTD) 200 FT. THORAIPAKKAM-PALLAVARAM RING ROAD, CHENNAI 600096

Inventors

1. SRINIVASAN RAMAN
CHENNAI ONE-SEZ UNIT (ETL INFRASTRUCTURE SERVICES LTD) 200 FT. THORAIPAKKAM-PALLAVARAM RING ROAD, CHENNAI 600096
2. SUBRAMANIAM PREMA
CHENNAI ONE-SEZ UNIT (ETL INFRASTRUCTURE SERVICES LTD) 200 FT. THORAIPAKKAM-PALLAVARAM RING ROAD, CHENNAI 600096
3. SRIDHAR PRIYADHARSHINI
CHENNAI ONE-SEZ UNIT (ETL INFRASTRUCTURE SERVICES LTD) 200 FT. THORAIPAKKAM-PALLAVARAM RING ROAD, CHENNAI 600096
4. SREEKUMAR
CHENNAI ONE-SEZ UNIT (ETL INFRASTRUCTURE SERVICES LTD) 200 FT. THORAIPAKKAM-PALLAVARAM RING ROAD, CHENNAI 600096

Specification

TECHNICAL FIELD

The present subject matter relates to a system for recruitment for an organization based on analysis of the skill sets of applicants. In particular, the present subject matter relates to a system for assessment of competency based on analysis of the skill sets of the applicants.

BACKGROUND

Generally, organizations recruit human resources by employing conventional methods of recruitment, which proceed through multiple stages. A first stage of a typical recruitment or selection process in an organization involves advertising for a vacant position through media such as newspapers and journals. The organization may also opt to place an online advertisement over the Internet. In response to the advertisements, the organization may receive resumes from a large number of applicants. Thereafter, the resumes are reviewed by representatives of the organization, predominantly by members of the human resource (HR) management team. After reviewing the resumes of the applicants, the organization invites eligible applicants for the next stage of the selection process. The next stage may include a written test, an interview, or both. Based on the performance of an applicant in the test and the interview, the organization decides whether the applicant is suitable for the position.

Organizations that recruit in large numbers usually employ a recruitment agency to assist the HR team in the recruitment process. The recruitment agency helps the HR team in conducting the selection process, including searching for applicants and organizing tests. The tests can be general written tests having questions for analyzing various skills of the applicants, such as analytical, reasoning, communication, and domain-specific knowledge. However, the questions asked in these selection tests are standard questions, i.e., they are not dynamically chosen. Hence, there is a significant probability that the questions are available to the applicant beforehand or that the applicant may answer them correctly by guesswork. Thus, it becomes difficult to check the competency level of the applicant. Moreover, manual processing of a large number of applications and test questions often results in inefficiencies and errors. It has been generally noticed that a significant proportion of the employee population hired using such recruitment methods is less competent to perform the required job role.

In addition, in such a method of recruitment, representatives of the organization have to spend a considerable amount of time in posting and reposting advertisements, replying to the applicants who respond to the advertisements, arranging for the tests and interviews, and other related work. Thus, efforts have been made to lessen human intervention in the recruitment process by carrying out screening and selection of applications over a computer network using techniques such as emails, text messages, and so on. Patent application No. US 2004/0064329 describes an employment application system based on a computer network, which facilitates online application by the applicants. However, the recruitment process still requires a large degree of human intervention for assessment of competencies and is subject to judgmental and human errors and bias. Thus, a new, comprehensive approach involving increased objectivity and minimal subjective and manual assessment by individuals is required for quickly and reliably identifying suitable, competent applicants.
SUMMARY

The subject matter described herein is directed to a system for recruitment based on competency assessment. According to an embodiment of the present subject matter, the recruitment system includes a test engine that generates multiple tests and evaluates responses to check for consistency of answer patterns based on parameters and metrics related to various knowledge and skill domains. The test engine provides online application forms to applicants desirous of applying for a job in an organization. The applicants are given various tests, for example, an aptitude test, an ability test, and a team skills test. The responses are evaluated using various mapping, comparative, and analytical techniques, based on which the job competencies of an applicant can be determined. The successful applicants, i.e., the applicants having the desired job competencies, are then short-listed for the required job. These and other features, aspects, and advantages of the present subject matter will be better understood with reference to the following description and appended claims. This Summary is provided to introduce a selection of concepts in a simplified form. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other features, aspects, and advantages of the subject matter will be better understood with regard to the following description, appended claims, and accompanying drawings, where:

Fig. 1 illustrates an exemplary network for implementing a system for assessment and recruitment of candidates.
Fig. 2 is a block diagram illustrating an exemplary server.
Fig. 3 illustrates an exemplary block diagram showing the interaction between different modules in the exemplary server.
Fig. 4 illustrates an exemplary process for selection and recruitment of candidates.
Fig. 5 illustrates an exemplary process for predictive test generation and evaluation.
Fig. 6 illustrates an exemplary process for ability test generation and evaluation.

DETAILED DESCRIPTION

The subject matter described herein is directed to a system and method for recruitment of candidates based on competency assessment using knowledge sets, such as functional and domain knowledge, and multiple skill sets, such as functional, domain, logical, analytical, numerical, and team skills. In one implementation, the system generates and evaluates one or more tests for evaluating aptitude skills of applicants based on several parameters and metrics corresponding to various knowledge and skill domains. A knowledge domain may be a specific professional field such as law, engineering, medicine, and so on. The questions pertaining to a particular knowledge domain may be specific to that domain (for example, in a job related to computers, the questions may be directed towards programming, computer networking, computer architecture, etc.) or may be generic, based on the requirements for the job. On the other hand, skill domains may include skills which are either inherent in an individual or are honed with experience over a period of time, for example, communication skills, analytical skills, decision-making skills, emotional consistency, etc. The disclosed system for recruitment is based on a test engine that includes a dynamic test generator and a dynamic evaluation module.
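Read as software architecture, the test engine described above is built from two cooperating parts: a generator that produces a fresh test for every instance, and an evaluator that judges the pattern of responses rather than a raw score. The following Python sketch only illustrates that split; the class names, the tiny question bank, and the pass criterion are assumptions made here for illustration and are not taken from the specification.

import random

class DynamicTestGenerator:
    """Illustrative generator: draws questions for a stage from a small bank."""
    BANK = {
        "predictive": ["Q-P1", "Q-P2", "Q-P3", "Q-P4"],
        "ability": ["Q-A1", "Q-A2", "Q-A3"],
        "team_skills": ["Q-T1", "Q-T2"],
    }

    def generate(self, stage, size=2):
        # A fresh, randomly drawn question set for every test instance.
        return random.sample(self.BANK[stage], size)

class DynamicEvaluationModule:
    """Illustrative evaluator: stands in for the pattern-consistency analysis."""
    def passes(self, responses, cutoff=0.5):
        answered = sum(1 for r in responses.values() if r is not None)
        return answered / len(responses) >= cutoff

class TestEngine:
    """The two cooperating modules the specification names."""
    def __init__(self):
        self.test_generator = DynamicTestGenerator()
        self.evaluation_module = DynamicEvaluationModule()

engine = TestEngine()
print(engine.test_generator.generate("predictive"))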
The test generator dynamically produces a series of test questions related to various domains, and the evaluation module evaluates responses received from the applicants. For example, a set of predictive test questions can be generated in real time as an applicant takes a particular test and responds to the questions included in the test. The responses from various applicants are received and are evaluated to determine the consistency in the pattern of choices made by each applicant. The system then generates a mental model map for each applicant and compares the mental model map with a threshold or acceptable mental model map for the required job profile. The threshold or acceptable mental model map can be a previously generated model map based on responses received from a sample of desirable or competent and undesirable or incompetent applicants. Based on the comparison, the system determines whether the applicant has the desired level of job competency and shortlists one or more applicants.

All the shortlisted applicants can be invited for another round of tests for evaluating additional skills and competencies of the applicants, for example, logical, numerical, quantitative, etc. The applicants successful in this round can further be evaluated for other abilities and skills, such as team performance skills, communication skills, leadership skills, and the like. In one implementation, the evaluation of team performance skills can be done through an online group activity, for example, completing a jigsaw puzzle. In order to conduct the above-mentioned tests, the system provides an interface to all the applicants at their remote computers. The remote computers are connected to a network, such as the Internet, using browser software. The remote computers can further be connected to one or more input objects, for example, a keyboard, a mouse, a microphone, etc. The actions of the input objects can be monitored and recorded to receive and evaluate the responses.

Further, the applicants clearing the tests with a satisfactory level of performance may be called for an interview with an expert panel. The expert panel may then give their inputs on each applicant by filling in an evaluation template provided by the system. Based on the data filled in by the expert panel in an evaluation form, a final list of applicants is prepared by the system. The disclosed system thus includes a diverse toolbox of flexible strategies designed to predict professional compatibility and job competencies, as well as to test various skills, such as logical, analytical, and numerical skills, of the applicants aspiring to work for an organization, with a relatively low error rate. While aspects of the described systems and methods for assessment and recruitment can be implemented in any number of different computing systems, environments, and/or configurations, embodiments of system analysis and management are described in the context of the following system architecture(s).

An Exemplary Network Environment

Fig. 1 shows an exemplary network environment 100 illustrating interactions for an application process between a computing system and one or more applicants using one or more remote computers connected to the computing system over a network. The network environment 100 includes a computing system 102, a web server 104, a network 106, and one or more applicants 108, for example, 108-1, 108-2, 108-3, ..., 108-n.
The computing system 102 can be implemented as a server, and is henceforth referred to as server 102. The web server 104 acts as a communication link between the applicants 108 and the server 102. For this, the web server 104 communicates with the server 102 via the network 106. The web server 104 also communicates with the applicants 108 through the network 106. In one implementation, the network environment 100 may include, for example, a company network having a number of office PCs, various server computers, and other computing-based devices spread across different places or countries. Alternatively, in another implementation, the network environment 100 may include a smaller network with a limited number of PCs.

Further, the network 106 may be a wireless or a wired network, or a combination thereof. The network 106 may also be a collection of individual networks, interconnected with each other and functioning as a single large network (e.g., the Internet or an intranet). Examples of such individual networks include, but are not limited to, Local Area Networks (LANs), Wide Area Networks (WANs), and Metropolitan Area Networks (MANs). Further, the network 106 connecting the web server 104 and the applicants 108 may implement a transmission protocol such as transmission control protocol over Internet protocol (TCP/IP). The server 102 can communicate with the applicants 108 via the web server 104 using one or more protocols such as Remote Desktop Protocol (RDP).

The web server 104 provides a user interface for the application process, which may include, for example, a URL, an HTML page, etc., and which may be hosted on the server 102. The web server 104 can transmit one or more questionnaires from the server 102 to the applicants 108, and can receive responses from the applicants 108 to be sent eventually to the server 102. The web server 104 thus acts as a communication link between the applicants 108 and the server 102. It will be understood that the web server 104 can be a part of, or directly linked to, the server 102, or may be placed in a different location. For example, the web server 104 may be located at job fairs or any other similar place to allow the applicants 108 to apply from any location.

Further, the server 102 implements the system for recruitment. In one implementation, the server 102 may include a test engine 110 for selecting candidates with competencies suitable for an organization. The test engine 110 may be used in any organization seeking to fill vacant positions, operating in any sector including, but not limited to, technology, education, legal services, and healthcare. The test engine 110 automates the recruitment process by online assessment of the applicants 108. The test engine 110 can be a screening and selection tool that may include an online environment for comparing job competencies, matching requirements for new employment positions, and determining the applicability of the applicants 108 for such new positions. The test engine 110 may carry out the complete recruitment process in one or more stages. Towards this end, the test engine 110 may include modules such as a test generator module 112 and an evaluation module 114 for the assessment of the applicants 108. In one implementation, the test generator module 112 first produces an application form to be filled in by the applicants 108. The application form can be based on a digital form template stored in a database of the system.
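A digital form template of the kind mentioned here could be represented as a plain data structure. The sketch below is hypothetical: the field names, options, and threshold values are assumptions, and the threshold comparison anticipates the screening of processed inputs described in the following paragraphs rather than reproducing the disclosed logic.

# Hypothetical digital form template: personal details are non-process inputs,
# while qualification and experience are processed inputs that are later
# compared against pre-determined thresholds (all values below are invented).
FORM_TEMPLATE = {
    "name":             {"type": "text",   "processed": False},
    "address":          {"type": "text",   "processed": False},
    "qualification":    {"type": "choice", "processed": True,
                         "options": ["diploma", "bachelors", "masters"]},
    "experience_years": {"type": "number", "processed": True},
    "position_applied": {"type": "choice", "processed": True,
                         "options": ["software engineer", "legal associate"]},
}

THRESHOLDS = {"qualification": {"bachelors", "masters"}, "experience_years": 2}

def screen_application(form):
    """Compare the processed inputs of a filled form against the thresholds."""
    return (form["qualification"] in THRESHOLDS["qualification"]
            and form["experience_years"] >= THRESHOLDS["experience_years"])

print(screen_application({"name": "A. Applicant", "address": "Chennai",
                          "qualification": "masters", "experience_years": 3,
                          "position_applied": "software engineer"}))  # True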
The digital application form helps the applicants 108 in applying online by filling in their details; for example, the applicants 108 can be asked to fill in initial data including personal and professional details in the form. For this purpose, the form can have editable fields that may accept alphanumerical data. The fields that receive personal details can be non-process type inputs. However, many other inputs, such as qualification and experience, can be processed. The test engine 110 can have pre-determined thresholds for the processed inputs. The processed inputs given by the applicants 108 can be compared with the pre-determined thresholds set by the test engine 110 to analyze each application form. After comparison, the analyzed application forms may be sent back to the respective applicants 108. When sent back, each application form can be integrated with a predictive test for evaluating skills, such as the aptitude, of each of the applicants 108.

The test generator module 112 generates question sets for the online predictive test based on the processed inputs of the applicants 108. Thus, different applicants may receive different test questions based on their academic qualifications, experience, position applied for, etc. For example, for a software engineer, the predictive test can be tailored for evaluating the programming aptitude of the applicant. Similarly, an applicant 108-1 with a medical background will receive a predictive test tailored accordingly. The test generator module 112 further provides one or more templates, which may be pre-defined, to accommodate different questions, suitable options, and the order in which the options are displayed. The templates may be varied dynamically to ensure that the probability of predicting the answers becomes nearly zero, and that the applicants 108 cannot score without actually understanding the problem and working out a solution.

The answer choices made by the applicants 108 are evaluated by the evaluation module 114. The evaluation module 114 can analyze the pattern of answer choices given by the applicants 108 to generate a mental map for each of the applicants 108. The mental map is based on the mental model theory of thinking and reasoning, a concept known in the psychological sciences. Mental models are generally used to determine an individual's perception of the real world, the surrounding environment, and behavior in different situations. These mental models thus help in predicting the responses of an individual in various situations, where the responses in each situation follow a particular pattern, known as a meta pattern. Further evaluation can be based on the consistency of the meta patterns of the mental maps of the applicants 108, instead of being based on the direct score of the questions answered by the applicants 108. Thus, the server 102 does not depend on traditional objective scoring methods of evaluation based on the number of correct or incorrect responses. Instead, the server 102 uses complex analytical methods to generate mental model maps and uses meta patterns to assess the competency of an applicant. Based on the mental maps, the evaluation module 114 categorizes the applicants 108 into predefined categories. The predefined categories may include categories that reflect the aptitude of each of the applicants 108, such as good, probably good, and unsuitable.
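The specification does not spell out how a mental map is computed or compared. Purely as an illustration of pattern-consistency scoring feeding the three categories named above, the following sketch treats the mental map as the relative frequency of solution approaches inferred from an applicant's answers; the approach labels and numeric cut-offs are invented for the example.

from collections import Counter

def mental_map(choices):
    """Summarize an applicant's answer pattern as approach frequencies.

    `choices` is a list of approach labels inferred per question,
    e.g. ["left-assign", "left-assign", "right-assign", ...].
    """
    counts = Counter(choices)
    total = sum(counts.values())
    return {approach: n / total for approach, n in counts.items()}

def consistency(mmap):
    """Consistency = share of answers following the dominant approach."""
    return max(mmap.values()) if mmap else 0.0

def categorize(choices, good=0.8, probably_good=0.6):
    """Map pattern consistency to the predefined aptitude categories."""
    c = consistency(mental_map(choices))
    if c >= good:
        return "good"
    if c >= probably_good:
        return "probably good"
    return "unsuitable"

# Example: a highly consistent pattern falls in the "good" category.
print(categorize(["left-assign"] * 8 + ["right-assign"] * 2))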
If the applicants 108 fall into an acceptable category, the applicants 108 may then be notified to continue in the recruitment process and are given an ability test and a team skills test. Otherwise, the applicants 108 may be notified of their rejection. The evaluation module 114 can further prepare a bell curve analysis of the performance of the applicants 108. Bell curve analysis is well known in the field of statistical analysis and is particularly used for calculating normal distributions. Using bell curve analysis, many measurements, ranging from psychological to physical phenomena, can be analyzed and approximated to varying degrees. Thus, using the bell curve analysis, the normal distribution and its different parameters, such as the mode, median, and average of the results of the candidates, can be obtained. In addition, other statistical analyses, such as calculating percentile results of the applicants 108, can be conducted. The above process, being a technical process based on scientific principles, yields better results than manual recruitment methods.

In one implementation, the test engine 110 can further generate a second stage test or an interview appointment stating the date, time, and venue for further evaluation of the applicants shortlisted from the applicants 108. The shortlisted applicants can be sent an email with a schedule, as mentioned earlier, for the forthcoming tests, such as an ability test and a team skills test. In another implementation, the applicants 108 can take the ability test and the team skills test immediately after successfully completing the predictive test, and the interview can be conducted at a later date. In yet another implementation, the applicants 108 may take a test that is a combination of one or more of the predictive test, the ability test, and the team skills test.

An Exemplary Server Computer

Fig. 2 illustrates the exemplary server 102. The server 102 may be implemented as one of a number of nodes in a website, an organizational intranet, a local area network, or as a separate computer not included as one of the nodes. In one implementation, the server 102 includes one or more processor(s) 202, network interfaces 204, and a memory 206. The processor(s) 202 may be implemented as one or more microprocessors, microcomputers, dual core processors, and so forth. Among other capabilities, the processor(s) 202 can be configured to fetch and execute computer-readable instructions stored in the memory 206. The network interfaces 204 enable the server 102 to communicate with other computing-based devices, such as the web server 104 and the applicants 108, over the network 106. The network interfaces 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks (e.g., LAN, cable, etc.) and wireless networks (e.g., WLAN, cellular, satellite, etc.). For this purpose, the network interfaces 204 may include one or more ports for connecting a number of computing devices to each other or to another server computer. The memory 206 may include any computer-readable medium known in the art, for example, volatile random access memory (e.g., RAM) and non-volatile read-only memory (e.g., ROM, flash memory, etc.). As illustrated in Fig. 2, the memory 206 may include program modules 208 and program data 210. The program modules 208 generally include routines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types.
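Returning to the bell curve analysis mentioned earlier in this passage: the normal-distribution parameters (mean, median, mode, spread) and the percentile results it refers to can be computed with Python's standard statistics module. The scores below are hypothetical and serve only to illustrate the calculation, not any scoring scale used by the disclosed system.

import statistics

def bell_curve_summary(scores):
    """Return normal-distribution style parameters for a set of scores."""
    return {
        "mean":   statistics.mean(scores),
        "median": statistics.median(scores),
        "mode":   statistics.mode(scores),
        "stdev":  statistics.stdev(scores),
    }

def percentile_rank(scores, value):
    """Percentage of applicants scoring at or below `value`."""
    return 100.0 * sum(s <= value for s in scores) / len(scores)

# Hypothetical evaluation scores for a batch of applicants.
scores = [52, 61, 61, 70, 74, 78, 81, 85, 90, 95]
print(bell_curve_summary(scores))
print(percentile_rank(scores, 81))  # e.g. 70.0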
In one implementation, the program modules 208 include the test engine 110, having the test generator module 112, the evaluation module 114, a database module 212, and other modules 214. The other modules 214 may include programs that supplement applications on a computing-based device, such as an operating system. As described above, the server 102 implements the system for recruitment and includes the test engine 110. The test engine 110 shortlists applicants 108 over the computer network 106 by generating and then evaluating a number of tests using the test generator module 112 and the evaluation module 114. The test generator module 112 generates various tests for the assessment of the applicants 108. The test generator module 112 can interact with the database module 212 to generate the tests. The database module 212 can be a central repository that stores narratives, images, questions, answers, and options for answers, as well as symbols and notations. The database module 212 can also store answers for a given instance of a test. In an implementation, the test generator module 112 fetches and stores test-related information for each instance of a test from test generation data 216 stored in the program data 210. The evaluation module 114 fetches and stores information from evaluation data 218. The evaluation data 218 stores information relating to the pattern of choices made by the applicants 108 for a given instance of a test. The evaluation module 114 categorizes the applicants 108 into one or more pre-defined categories based on the evaluation data 218. The testing and evaluation of the applicants 108 is further described in detail below with reference to Fig. 3.

Fig. 3 is an exemplary block diagram showing the interaction between the various modules in the exemplary server 102. As indicated previously, the test engine 110 includes the test generator module 112 and the evaluation module 114. The test generator module 112 can generate a plurality of tests for the assessment of numerical, logical, analytical, or any other skills of the applicants 108. For example, the test generator module 112 may include a predictive test generator 302, an ability test generator 304, and a team skills test generator 306. Accordingly, the predictive test generator 302 can generate predictive test(s), the ability test generator 304 can generate ability test(s), and the team skills test generator 306 can generate team skills test(s). The evaluation module 114 can evaluate responses to the one or more tests attempted by the applicants 108 using different methods. For example, the predictive test can be evaluated by a predictive test evaluator 308, the ability test can be evaluated by an ability test evaluator 310, and the team skills test can be evaluated by a team skills test evaluator 312.

To generate and evaluate the tests, the test engine 110 communicates with the database module 212. The database module 212 may include a context for dynamic problems 314, dynamic answer options 316, dynamic symbols 318, and dynamic values 320. In one implementation, the dynamic values 320 for a test can be represented by a set of alphanumeric values, numeric values, notations, and symbols. Different problems or questions are retrieved dynamically for every instance of a test. For example, the test engine 110 generates each problem in a test dynamically from a set of dynamic problems 314. The set of dynamic problems 314 can be associated with the database module 212 for a particular instance of the test.
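As a rough illustration of drawing a problem instance from such stored sets, the sketch below substitutes freshly chosen symbols and values into a question template and varies the answer-option order for each instance. The template, value range, and option scheme are assumptions made for illustration and do not reproduce the disclosed scheme.

import operator
import random

# Hypothetical stand-ins for the stored dynamic problems 314, dynamic
# symbols 318, and dynamic values 320; the template and ranges are invented.
DYNAMIC_PROBLEMS = ["If {a} {op} b equals {c}, what is the value of b?"]
DYNAMIC_SYMBOLS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
DYNAMIC_VALUES = range(2, 10)

def instantiate_problem():
    """Build one problem instance with freshly drawn symbols and values."""
    a = random.choice(DYNAMIC_VALUES)
    b = random.choice(DYNAMIC_VALUES)
    op = random.choice(list(DYNAMIC_SYMBOLS))
    c = DYNAMIC_SYMBOLS[op](a, b)
    prompt = DYNAMIC_PROBLEMS[0].format(a=a, op=op, c=c)
    options = [b, b + 1, b - 1, b + 2]   # dynamic answer options 316
    random.shuffle(options)              # option order varies per instance
    return {"prompt": prompt, "options": options, "answer": b}

print(instantiate_problem())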
Each of the dynamic problems 314 can have multiple dynamic answer options 316. The value sets of different answer options can be changed dynamically. Accordingly, as the applicants 108 proceed with the online test, the test generator module 112 generates problem questions. Multiple answer options may be presented, the order of which can be dynamically varied. Visual options for any test can be handled through a set of stored images, notations, and/or values that can be selected for a specific question or answer response. The symbols or notations and the values assigned for the questions and answers in the online test(s) generated by the test generator module 112 can be stored in the dynamic symbols 318 and the dynamic values 320, respectively.

In one implementation, the predictive test generator 302 generates the predictive test that includes questions based on the academic qualification of the applicants 108. For example, if the academic qualification of an applicant, for example, the applicant 108-1, is related to legal services, then the applicant 108-1 will receive questions based on law. The predictive test can then be sent by each of the applicants 108 for evaluation. In one implementation, the applicants 108 are assessed based on the evaluation of the online application form. In an implementation, the evaluation module 114 assesses the online analyzed application form that includes questions pertaining to the predictive test filled in by the applicants 108. The predictive test evaluator 308 of the evaluation module 114 evaluates the answer choices of the predictive test selected by the applicants 108 in the online analyzed application form. The predictive test evaluator 308, as described earlier, evaluates the answer choices based on the consistency of patterns of the mental maps and not on the direct score of the predictive test. The mental map can represent an approach that an applicant has taken to solve a particular category of similar problems. For example, let a and b be two variables assigned any numerical values; the applicant 108-1 can interpret the values of a and b after an assignment like a = b in several ways, such as:
(a) the applicant 108-1 can assume that a takes the value of b, i.e., a <- b, or
(b) the applicant 108-1 can assume that b takes the value of a, i.e., b <- a.

A number of suitable applicants are identified from the applicants 108 based on an evaluation of the responses of each of the applicants 108 in the predictive test. In one implementation, the predictive test evaluator 308 identifies the suitable applicants based on the evaluation of mental maps. To this end, the predictive test evaluator 308 compares the mental maps evaluated for the applicants 108 with pre-defined categories of mental maps. The pre-defined categories of mental maps may include categorical mental maps of successful and unsuccessful applicants who previously appeared for the predictive test. After comparing the mental maps of each of the applicants 108 with the pre-defined categories, each of the applicants 108 can be identified as either a successful or an unsuccessful applicant. Successful applicants can then be filtered by the predictive test evaluator 308. For example, out of the applicants 108-1, 108-2, 108-3, ..., 108-n, let the applicants 108-1, 108-2, ..., 108-m (m

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 348-CHE-2008 FORM-1 04-06-2008.pdf 2008-06-04
2 Written submissions and relevant documents [26-06-2017(online)].pdf 2017-06-26
3 348-CHE-2008 CORRESPONDENCE OTHERS 04-06-2008.pdf 2008-06-04
4 348-CHE-2008-HearingNoticeLetter.pdf 2017-06-08
5 348-CHE-2008_EXAMREPORT.pdf 2016-07-02
6 348-CHE-2008 DESCRIPTION (COMPLETE) 10-02-2009.pdf 2009-02-10
7 Claims [18-04-2016(online)].pdf 2016-04-18
8 348-CHE-2008 CLAIMS 10-02-2009.pdf 2009-02-10
9 Correspondence [18-04-2016(online)].pdf 2016-04-18
10 348-CHE-2008 CLAIM 10-02-2009.pdf 2009-02-10
11 Description(Complete) [18-04-2016(online)].pdf 2016-04-18
12 348-CHE-2008 ABSTRACT 10-02-2009.pdf 2009-02-10
13 Examination Report Reply Recieved [18-04-2016(online)].pdf 2016-04-18
14 348-CHE-2008 FORM-18 02-02-2010.pdf 2010-02-02
15 OTHERS [18-04-2016(online)].pdf 2016-04-18
16 348-che-2008-form 3.pdf 2011-09-02
17 348-che-2008-form 1.pdf 2011-09-02
18 Correspondence [07-01-2016(online)].pdf 2016-01-07
19 348-che-2008-drawings.pdf 2011-09-02
20 Description(Complete) [07-01-2016(online)].pdf 2016-01-07
21 348-che-2008-description(provisional).pdf 2011-09-02
22 Examination Report Reply Recieved [07-01-2016(online)].pdf 2016-01-07
23 0348-che-2008 correspondence-others.pdf 2011-09-02
24 348-che-2008-correspondnece-others.pdf 2011-09-02
25 0348-che-2008 drawings.pdf 2011-09-02
26 348-che-2008-abstract.pdf 2011-09-02
27 0348-che-2008 form-3.pdf 2011-09-02
28 0348-che-2008 power of attorney.pdf 2011-09-02
29 0348-che-2008 form-5.pdf 2011-09-02