Abstract: A system and a method for populating questions in a question paper based on a pre-defined difficulty level of the questions are disclosed. The questions are selected from a question bank based on the probability that candidates from different categories are able to answer the questions correctly. Further, the questions are selected randomly such that varying sets of questions are presented to candidates depending on the number of candidates and categories. The probability of the candidates answering a particular question is captured and, in case the results show that a high number of candidates answered a question correctly, such most correctly answered questions are not presented to the candidates in the next question paper.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
SYSTEM AND METHOD FOR POPULATING QUESTIONS IN A QUESTION
PAPER
Applicant
Tata Consultancy Services Limited A Company Incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
0001 The present invention relates to the field of question paper generation. More
particularly, the present invention relates to populating questions in a question paper
based on a pre-defined difficulty level of the questions.
BACKGROUND OF THE INVENTION
2 An automatic question selecting method of an examination system on a large network generates questions from a question bank based on a pre-defined pattern. The questions are selected based on the best match with a question requirement to generate an examination paper instantly.
3 The automatic question selecting method requires attributes of the scope and difficulty of the paper, including selection of questions, to generate and present the paper to the candidates. The method may include integrating solutions of the questions, the difficulty of the questions at various levels, questions that have been used earlier, and the difficulty of the question paper in total, such that the question selection and the matching degree between the paper and the question selection requirement can be improved.
4 Further, automatic question selecting methods include adaptive testing methods. An adaptive testing method changes the problem difficulties by using a computer platform and a network system. The adaptive testing provides an online test in which the method automatically changes the difficulty of the problems based on the correct answering ratio of the candidates in the examination. Depending on the correct answering ratio of the candidates, the problems may be presented with a lower difficulty level, a higher difficulty level, or from another topic with varying questions.
5 Another approach in adaptive testing includes randomly choosing questions for candidates. In case the candidates correctly answer the questions, the system adopting adaptive testing evaluates the skill of the candidate using the difficulty level parameter of the question and the answer. The system analyzes the answer received and searches for the question having a difficulty level that is close to the estimated skill of the candidate. If the candidate answers all the questions correctly, the difficulty level of the next questions to be presented is set higher. Further, the difficulty level is reduced in case the rate of the candidate answering correctly is low.
SUMMARY OF THE INVENTION
6 This summary is provided to introduce concepts related to system and method for populating questions in a question paper based on a pre-defined difficulty level of the questions and the concepts are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
7 In one implementation, a system for populating questions in a question paper based on a pre-defined difficulty level of the questions is provided. The system comprises a processor. The system further comprises a memory coupled to the processor. The processor is capable of executing a plurality of modules stored in the memory. The plurality of modules comprises a calculation module configured to determine an average behavior of the question by using a calculative mechanism, wherein the average behavior further depicts a pattern of answering the question. The plurality of modules further comprises a selection module configured to randomly select a question to further populate the bank with varying questions with respect to the average behavior so determined. The plurality of modules further comprises an updating module configured to update in a regular manner a difficulty level of the questions in the question bank by using an updating mechanism.
8 In one implementation, a method for populating questions in a bank based on a pre-determined difficulty level of the question is provided. The method comprises
steps of determining an average behavior of the question by using a calculative mechanism, the average behavior further depicts a pattern of answering the question. The method further comprises randomly selecting a question to further populate the bank with varying questions with respect to the average behavior so determined. The method further comprises updating in a regular manner a difficulty level of the questions in a bank by using an updating mechanism. The determining, the selecting and the updating are performed by a processor.
BRIEF DESCRIPTION OF THE DRAWINGS
9 The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
10 Figure 1 illustrates a network implementation of a system for populating questions in a question paper, in accordance with an embodiment of the present subject matter.
11 Figure 2 illustrates the system, in accordance with an embodiment of the present subject matter.
12 Figure 3 illustrates a method for populating questions in a question paper, in accordance with an embodiment of the present subject matter.
13 Figure 4 illustrates loading and determining questions, in accordance with an embodiment of the present subject matter.
14 Figure 5 illustrates updating of difficulty level of question paper, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION OF THE INVENTION
15 Systems and methods for populating questions in a question bank based on a pre-determined difficulty level of the question are described. The questions are selected from a question bank based on the probability that candidates from different categories are able to answer the questions correctly. Further, the questions are selected randomly such that varying sets of questions are presented to candidates depending on the number of candidates and categories. The probability of the candidates answering a particular question is captured and, in case the results show that a high number of candidates answered a question correctly, such most correctly answered questions are not presented to the candidates in the next question paper.
16 The questions are generated while varying the difficulty level of the questions. Further, the difficulty level of the questions is captured and the difficulty level is continuously updated based on the answering pattern of the candidates. Further, the probability of the difficulty level of the questions to be generated is defined, and the required question papers with suitable questions are selected and presented to the candidates.
17 While aspects of described system and method for populating questions in a question bank based on a pre-determined difficulty level of the questions may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
18 Referring now to Figure 1, a network implementation 100 of a system 102 for populating questions in a question bank based on a pre-determined difficulty level of the question is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, the system selects questions randomly, based on the probability of the candidates answering the questions correctly or incorrectly. The
system receives the probability of a candidate answering the questions correctly and presents the questions accordingly.
19 The system generates questions with varying components of the questions. The system further updates the difficulty level of the questions continuously. The probability of the difficulty level of the questions to be generated is defined and required question papers with suitable questions are selected and presented to the candidates.
20 Although the present subject matter is explained by considering a scenario in which the system 102 is implemented as an application on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2... 104-N, collectively referred to as user 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106.
21 In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as an intranet, a local area network (LAN), a wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
22 Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
23 The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with a user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
24 The memory 206 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 230.
25 The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data
types. In one implementation, the modules 208 may include a calculation module 210, a selection module 212, an updating module 214, and other modules 216. The other modules 216 may include programs or coded instructions that supplement applications and functions of the system 102.
26 The data 230, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The data 230 may also include a system database 232 and other data 234. The other data 234 may include data generated as a result of the execution of one or more modules in the other modules 216.
27 In one embodiment of the invention, referring to figure 2 and figure 3, the system 102 comprises the calculation module 210 configured to determine an average behavior of the question. The average behavior of the candidates is determined (step 302) based on the candidates answering the question correctly or incorrectly. Further, the average behavior depicts a pattern of answering the question. The questions are presented to candidates based on concepts as defined for the examination. The questions are generated with varying components in the question paper such that the answers to the questions cannot be logically deduced from the previous questions presented to the candidates.
28 The system 102 populates questions in a question paper based on a pre-defined difficulty level of the questions. The difficulty level of questions is determined by the probability of candidates answering the question correctly. The difficulty level of the questions may be set as an easy level, a medium level or a hard level. The easy level difficulty question paper may comprise questions that the candidates are more likely to answer correctly. The medium level difficulty question paper may comprise questions that the candidates are moderately likely to answer correctly. The hard level difficulty question paper may comprise questions that the candidates are less likely to answer correctly.
29 In one embodiment, the pattern of answering the question further provides at least one of a probability of answering the question correctly, a probability of answering the question incorrectly, or a probability of not answering the question. The pattern of answering the question may be determined from previous results of the question papers answered by the candidates. Questions that the candidates are less likely to answer correctly when presented during the examination may be considered as hard level. Questions that the candidates are more likely to answer correctly may be considered as easy level. Questions that the candidates are moderately likely to answer correctly may be considered as medium level.
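As an illustration only, the mapping from answering probability to difficulty level may be sketched as follows; the threshold values are assumptions, since the specification only states that frequently answered questions are easy and rarely answered questions are hard:

```python
def classify_difficulty(p_correct, easy_threshold=0.7, hard_threshold=0.3):
    # Illustrative thresholds (assumed): questions most candidates answer correctly
    # are treated as easy, questions most candidates miss are treated as hard.
    if p_correct >= easy_threshold:
        return "easy"
    if p_correct <= hard_threshold:
        return "hard"
    return "medium"
```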
30 The calculation module 210 determines the average behavior of the questions by using the calculative mechanism. The calculative mechanism uses statistical analysis data of previously answered questions to further determine the average behavior of the question. The average behavior of the questions being answered may be calculated in batches or in real-time. The answers to the questions presented in batches are collected from the candidates of one or more groups appearing for the examination at different intervals. In one embodiment, the average behavior may be calculated from the answering pattern of the candidates in one or more groups to a particular question. In another embodiment, the average behavior may be calculated from the answering pattern of one or more groups of candidates to the question paper in total.
31 The statistical analysis data used to determine the average behavior of the question based on previously answered questions may comprise calculating a mean of the number of candidates answering the question and the number of candidates correctly answering the question. Similarly, the average behavior is calculated by considering a mean of the number of candidates answering the question and the number of candidates incorrectly answering the question. The statistical analysis of the data is also used to determine the mean number of candidates leaving a question unanswered.
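A minimal sketch of such a calculative mechanism is given below; it assumes each candidate response to a question is recorded as 'correct', 'incorrect' or 'unanswered', which is an assumed encoding rather than one given in the specification:

```python
from collections import Counter

def answering_pattern(responses):
    # responses: outcomes of one question across candidates, e.g.
    # ['correct', 'incorrect', 'unanswered', 'correct', ...] (assumed encoding).
    counts = Counter(responses)
    total = len(responses)
    return {
        "p_correct": counts["correct"] / total,
        "p_incorrect": counts["incorrect"] / total,
        "p_unanswered": counts["unanswered"] / total,
    }
```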
32 In one embodiment, the system 102 comprises the selection module 212 configured to randomly select the question (step 304, figure 3) to further populate the question bank with varying questions with respect to the average behavior so determined. The selection of questions from the question bank by the selection module 212 may be explained in the following description. In one embodiment, there may be different types of questions in the question paper. Each type of question has an associated probability of the candidate answering the question correctly and a probability of the candidate not answering the question.
33 Referring to figure 4, the questions from the question bank are loaded (step 402) into the system 102. All the types of questions, together with the probability of the candidate not attempting a question, the probability of answering the question correctly and the probability of the candidate not answering the question correctly, are loaded in the system. Further, the number of patterns corresponding to each type of question is also loaded in the system 102. For each type of question, a specification for the question paper is obtained (step 404). A mean answering probability is computed for each question (step 406). The question is added to the question bank with respect to the answering probability so computed (step 406). In the next step, the mean answering probability is compared with the specifications (step 408).
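The loading and comparison steps may be sketched as follows; the field names ('previous_responses', 'p_correct_range') and the shape of the specification are assumptions used only for illustration:

```python
def build_question_bank(raw_questions, specification):
    # Sketch of steps 402-408: load each question with its response history,
    # compute the mean probability of a correct answer, add the question to the
    # bank, and keep only the questions matching the paper specification.
    bank = []
    for q in raw_questions:                                   # step 402: load questions
        history = q["previous_responses"]                     # e.g. ['correct', 'incorrect', ...]
        p_correct = history.count("correct") / len(history)   # step 406: mean answering probability
        bank.append(dict(q, p_correct=p_correct))             # add with the computed probability
    lo, hi = specification["p_correct_range"]                 # step 404: specification for the paper
    return [q for q in bank if lo <= q["p_correct"] <= hi]    # step 408: compare mean with specification
```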
34 In one embodiment, the system 102 comprises the updating module 214 configured to update the difficulty level of the questions in a regular manner (step 306, figure 3) in the question bank by using an updating mechanism. The updating mechanism updates the difficulty level by deleting the question with a high probability of answering the question correctly and selecting a new question with respect to the predefined difficulty level. The updating mechanism is explained with the help of figure 5.
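A minimal sketch of such an updating pass is shown below; the cutoff of 0.8 and the 'difficulty' field are assumptions for illustration, not values stated in the specification:

```python
def update_question_bank(bank, replacements, target_difficulty, p_easy_cutoff=0.8):
    # Delete questions with a high probability of being answered correctly and
    # select new questions of the pre-defined difficulty level (sketch only).
    retained = [q for q in bank if q["p_correct"] < p_easy_cutoff]
    needed = len(bank) - len(retained)
    fresh = [q for q in replacements if q["difficulty"] == target_difficulty]
    return retained + fresh[:needed]
```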
35 Referring to figure 5, the updating module 214 updates the difficulty level of the question paper (step 502) based on the probability of the candidate answering the question correctly. For example, if the requirement is that only 50% of the candidates should pass, then the questions should be selected in such a way that the overall probability of any candidate getting more than the pass marks is 50%. The first question is selected randomly from the question bank. If this question has more than a 50% probability of being answered, then the next question is chosen such that the probability of answering both questions is less than 50%, by choosing a question with a lower probability of answering. This process is extended till either the required number of questions is found or there are no questions left to choose. The latter happens if, for example, all the questions in the bank are easy and can be answered by more than 50% of the students, or if all the questions in the bank are difficult so that the probability of any candidate answering a question is much less than 50%. The computation of the mean after adding each question uses "convolution" of distributions, a statistical procedure. In case the questions that the candidates are not likely to answer correctly are not complete, the questions are filled randomly and the questions are updated accordingly (step 504).
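The 50% example above may be sketched as follows; all-or-nothing scoring, the cutoff fraction and the tolerance value are assumptions, and the greedy strategy is one possible reading of the process described, not the specification's own algorithm:

```python
import random
import numpy as np

def question_pmf(marks, p_correct):
    # All-or-nothing scoring assumed: full marks with probability p_correct, else zero.
    pmf = np.zeros(marks + 1)
    pmf[0] = 1.0 - p_correct
    pmf[marks] = p_correct
    return pmf

def pass_probability(selected, cutoff_fraction):
    # Convolve the per-question mark distributions to get the distribution of the
    # total marks, then sum the tail at or above the cutoff.
    dist = np.array([1.0])
    for q in selected:
        dist = np.convolve(dist, question_pmf(q["marks"], q["p_correct"]))
    cutoff = int(np.ceil(cutoff_fraction * (len(dist) - 1)))
    return dist[cutoff:].sum()

def select_questions(bank, n_questions, cutoff_fraction=0.4, target_pass=0.5, tol=0.05):
    # Greedy sketch: start from a random question and keep steering the running
    # pass probability toward the target by adding harder or easier questions.
    pool = list(bank)
    random.shuffle(pool)
    selected = [pool.pop()]
    while len(selected) < n_questions and pool:
        pool.sort(key=lambda q: q["p_correct"])      # ascending: hardest first
        if pass_probability(selected, cutoff_fraction) > target_pass:
            selected.append(pool.pop(0))             # paper too easy so far: add a hard question
        else:
            selected.append(pool.pop())              # paper too hard so far: add an easy question
    if len(selected) < n_questions or abs(pass_probability(selected, cutoff_fraction) - target_pass) > tol:
        return None                                  # no feasible paper with this bank
    return selected
```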
36 The mean of the remaining questions is calculated (step 506) in order to select the remaining questions from the question bank. If the average mean required from the questions that remain to be selected is higher or lower than what the questions in the bank can provide, the question paper is declared infeasible. The system 102 further checks the feasibility of the remaining questions to be filled based on the probability determined. The system 102 checks the average mean in order to compute the mean for the remaining questions to be filled. The system 102 randomly selects the questions (step 508) from the questions found feasible. The last questions are selected by picking the questions that most closely match the targeted mean and the difficulty level. The selected questions are presented (step 510) to the candidate.
37 Considering an example of implementing the system 102, assume M types of questions in the question paper, such as type 1, 2, 3, and so on. Each type of question i may have a probability ri of the student not answering the question and a probability qi of the student answering the question correctly. The type of question i has a total of ni questions, of which zero or more questions may be selected in the question paper. For example, the question paper may have a total of N questions and the user may choose ki questions from type i, where ki is less than or equal to ni.
38 In order to determine the pass percentage P, which is a function of the cutoff marks c, a normal distribution is applied. P(c) is estimated using the normal distribution, where P(c) is the probability of the candidate obtaining at least the cutoff c in the examination. Further, Yk is the random variable defining the number of marks the candidate obtains in the test if only k questions are provided, and Y, equal to YN, is the random variable defining the number of marks the candidate gets in the full test.
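As a hedged illustration of this normal approximation (the specification does not state the formula explicitly), the pass percentage may be estimated as follows, where Φ is the standard normal cumulative distribution function, kj is the number of questions of type j in the paper, and μj and wj are the per-question mean and variance for type j:

```latex
P(c) \;=\; \Pr(Y \ge c) \;\approx\; 1 - \Phi\!\left(\frac{c - \mu}{\sigma}\right),
\qquad
\mu \;=\; \sum_{j=1}^{M} k_j\,\mu_j,
\qquad
\sigma^{2} \;=\; \sum_{j=1}^{M} k_j\,w_j .
```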
39 For question type j, qj is the probability of the candidate answering the question correctly, and rj is the probability of the candidate not answering. For the question type j, let μj and wj be the average mean and variance of the marks obtained by the candidate if a question of that type is set.
40 For populating questions in the question bank, a normal approximation is used. In order to determine the average behavior of the question, the probability of the pattern of answering the question is depicted using the cutoff c, the mean μ and the variance σ² of the previously answered pattern. The pattern of answering the question is determined by a probability of answering the question correctly, a probability of answering the question incorrectly or a probability of not answering the question. The pattern of answering the question may be determined from previous results of the question papers answered by the candidates. Further, for the question type j, the average mean μj and variance wj of the marks obtained by the candidate are computed if a question of type j is set. The mean μj is calculated using qj, i.e., the probability of the candidate answering the question correctly, and rj, i.e., the probability of the candidate not answering. Further, the variance wj is calculated using qj, i.e., the probability of the candidate answering the question correctly, rj, i.e., the probability of the candidate not answering, and the mean μj.
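One way to make the calculation of μj and wj concrete, assuming each question of type j carries mj marks for a correct answer, a penalty of nj marks for an incorrect answer and zero marks for an unanswered question (the marking scheme and the symbols mj, nj are assumptions, not stated in the specification), is:

```latex
\mu_j \;=\; m_j\,q_j \;-\; n_j\,(1 - q_j - r_j),
\qquad
w_j \;=\; m_j^{2}\,q_j \;+\; n_j^{2}\,(1 - q_j - r_j) \;-\; \mu_j^{2},
```

so that, under this assumed scheme, both the mean and the variance depend on qj, rj and μj, consistent with the description above.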
41 In case a large number of questions is required to be selected and the variation of wj is complex, the average mean of the variance is considered.
42 The choice of the remaining questions with more or less difficulty, based on the average behavior of the candidate answering the question correctly, may be defined by the average mean of the remaining questions in the question bank. The system 102 further checks the feasibility of the remaining questions to be filled based on the probability determined. The system 102 checks the average mean in order to compute the mean for the remaining questions to be filled. The system 102 randomly selects the questions from the questions found feasible. The last questions are selected by picking the questions that most closely match the targeted mean and the difficulty level. The selected questions are presented to the candidate. Further, the remaining questions may be selected randomly based on unsaturated types of questions, which have a mean alternately above or below the average mean, such that the feasibility of selecting the remaining questions is maintained, as sketched below.
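A sketch of this feasibility check and of the selection of the last question is given below; target_mean, the field names and the per-iteration feasibility test are assumptions used only to illustrate the described flow:

```python
import random

def fill_remaining(selected, bank, n_total, target_mean):
    # Sketch of steps 506-510: check feasibility, fill the remaining slots randomly,
    # and pick the last question closest to the still-needed mean. target_mean is
    # the desired average per-question mean for the whole paper (assumed input).
    pool = list(bank)
    while len(selected) < n_total:
        slots_left = n_total - len(selected)
        # Mean that each remaining question must contribute, on average.
        needed = (n_total * target_mean - sum(q["mean"] for q in selected)) / slots_left
        if not pool or needed < min(q["mean"] for q in pool) or needed > max(q["mean"] for q in pool):
            return None                                              # question paper declared infeasible
        if slots_left == 1:
            pick = min(pool, key=lambda q: abs(q["mean"] - needed))  # closest match for the last slot
        else:
            pick = random.choice(pool)                               # earlier slots: random selection
        pool.remove(pick)
        selected.append(pick)
    return selected
```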
44 The updating module 214 updates the difficulty level of the questions in a regular manner in the question bank by using an updating mechanism. The updating mechanism updates the difficulty level by deleting the question with a high probability of answering the question correctly and selecting a new question with respect to the predefined difficulty level.
45 The foregoing description of specific embodiments of the present disclosure has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical application, to thereby enable others skilled in the art to best utilize the disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the disclosure be defined by the claims appended hereto and their equivalents. The listing of steps within method claims does not imply any particular order of performing the steps, unless explicitly stated in the claim.
WE CLAIM:
1. A system for populating questions in a question paper based on a pre-defined difficulty
level of the questions, the system comprising:
a processor;
a memory coupled to the processor, wherein the processor is capable of executing a plurality of modules stored in the memory, and wherein the plurality of modules comprising:
a calculation module configured to determine an average behavior of the question by using a calculative mechanism, the average behavior further depicts a pattern of answering the question;
a selection module configured to randomly select a question to further populate the bank with varying questions with respect to the average behavior so determined; and
an updating module configured to update in a regular manner a difficulty level of the questions in the question bank by using an updating mechanism.
2. The system of claim 1, wherein the pre-defined difficulty level further comprises of an easy level, a medium level or a hard level.
3. The system of claim 1, wherein the pattern of answering the question further provides at least one of a probability of answering the question correctly or a probability of answering the question incorrectly or a probability of not answering the question.
4. The system of claim 1, wherein the calculative mechanism uses a statistical analysis data of previously answered questions to further determine the average behavior of the question.
5. The system of claim 1, wherein the updating mechanism updates the difficulty level by deleting the question with a high probability of answering the question correctly and selecting a new question with respect to the predefined difficulty level.
6. The system of claim 1, wherein randomly selected questions ensure a selection of varying question in one or more question banks.
7. A method for populating questions in a bank based on a pre-determined difficulty level of the question, the method comprising:
determining an average behavior of the question by using a calculative mechanism, the average behavior further depicts a pattern of answering the question;
randomly selecting a question to further populate the bank with varying questions with respect to the average behavior so determined; and
updating in a regular manner a difficulty level of the questions in a bank by using an updating mechanism;
wherein the determining, the selecting and the updating are performed by a processor.
8. The method of claim 7, wherein pre-defined difficulty level further comprises of an easy level, a medium level or a hard level.
9. The method of claim 7, wherein the pattern of answering the question further provides at least one of a probability of answering the question correctly or a probability of answering the question incorrectly.
10. The method of claim 7, wherein the calculative mechanism uses a statistical analysis
data of previously answered questions to further determine the average behavior of the
question.
11. The method of claim 7, wherein the updating mechanism updates the difficulty level by deleting the question with a high probability of answering the question correctly and selecting a new question with respect to the predefined difficulty level.
12. The method of claim 7, wherein randomly selected questions ensure a selection of varying question in one or more question banks.