Abstract: A descriptive answer sheet evaluation system (100) based on AI/ML is disclosed. The system (100) comprises a memory (108) and at least one processor (106) configured to: receive a first input from a user through a user interface (104) installed within a computing unit (102); analyse and compare the first input to a user database to determine a teacher profile or a student profile; receive a second input from the user via the user interface (104); receive a third input from the user via the user interface (104); convert and upload scanned images of the attempted answers into digital text using computer vision; analyse the digital text using a trained machine learning/artificial intelligence model (110) to evaluate the attempted answers; and assign scores based on the evaluated answers to generate a report and feedback for students.
Description: DESCRIPTIVE ANSWER SHEET EVALUATION SYSTEM AND METHOD USING ARTIFICIAL INTELLIGENCE/MACHINE LEARNING (AI/ML)
FIELD OF THE INVENTION
[0001] This invention generally relates to the field of educational technology and, in particular, to a system and method for evaluating descriptive answer sheets based on artificial intelligence/machine learning (AI/ML).
BACKGROUND
[0002] The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also correspond to implementations of the claimed technology.
[0003] The manual method of grading handwritten answer scripts is a time-consuming and labour-intensive process that often involves multiple steps, including reading each answer carefully, assessing the correctness and relevance of the content, and assigning marks accordingly. Further, the process may be particularly challenging when dealing with a large volume of scripts, leading to potential delays in the evaluation process. Additionally, human graders, despite their best efforts, are susceptible to errors and inconsistencies due to fatigue, subjective judgment, and cognitive biases. Further, these factors may result in discrepancies and inaccuracies in the grading, affecting the fairness and reliability of the evaluation. Therefore, the traditional manual grading system, while thorough, is prone to inefficiencies and errors that could impact the overall assessment quality.
[0004] Furthermore, the conventionally known Optical Mark Recognition (OMR) sheet evaluation system, while faster than manual grading, has its own set of drawbacks. Further, the OMR-based evaluation system is primarily limited to multiple-choice questions, restricting the diversity of question types that may be assessed. Further, OMR sheets require precise marking, and any ambiguity or stray marks may lead to errors in reading the responses, potentially resulting in incorrect scoring. Additionally, this system necessitates specialized scanning equipment and software, which may be costly and require regular maintenance. The reliance on technology also poses a risk of technical failures or malfunctions, which could disrupt the evaluation process and delay results.
[0005] A patent application "US11790641B2" titled "Answer evaluation method, answer evaluation system, electronic device, and medium" discloses an answer evaluation method, an answer evaluation system, an electronic device, and a medium. The method comprises: acquiring an answer image for a test paper answered by a user; classifying the answer image based on a pre-trained test question classification model, so as to obtain an objective question answer area and a subjective question answer area; identifying at least one objective question in the objective question answer area and an objective question answer for each of the at least one objective question; identifying at least one subjective question in the subjective question answer area and a subjective question answer for each of the at least one subjective question; and determining a total score value of the test paper based on the objective question, the objective question answer, the subjective question and the subjective question answer.
[0006] A patent application "JP2003178171A" titled "Answer evaluating system" discloses an answer evaluating system. A manager hands over the answer sheet to an answerer after first recording identification information for specifying each answer sheet on the answer sheet, by a bar code or the like. After receiving the answer sheet from the answerer, the manager identifies the answer sheet by reading the identification information from it. The manager allocates in advance an amount to be processed to a predetermined answer processing person, determines which answer sheet to distribute to the predetermined answer processing person according to the allocated amount while re-reading the identification data of the received answer sheet, and distributes it to the answer processing person. The answer processing person sends an evaluated result for each answer sheet to the manager via a computer network and sends the evaluated answer sheet back to the manager. The manager approves all evaluation processes of the answer sheets and tabulates the evaluations of the received answer sheets.
[0007] However, the conventionally known Optical Mark Recognition (OMR) sheet evaluation system may have several drawbacks. Further, the OMR-based evaluation system is primarily limited to multiple-choice questions, restricting the diversity of question types that may be assessed. Further, OMR sheets require precise marking, and any ambiguity or stray marks may lead to errors in reading the responses, potentially resulting in incorrect scoring.
OBJECTIVES OF THE INVENTION
[0008] An objective of the invention is to provide a descriptive answer sheet evaluation system based on artificial intelligence/machine learning (AI/ML).
[0009] Another objective of the invention is to provide a method for evaluating a descriptive answer sheet based on artificial intelligence/machine learning (AI/ML).
[0010] Furthermore, an objective of the present invention is to provide the descriptive answer sheet evaluation system based on artificial intelligence/machine learning (AI/ML) that is capable of converting handwritten text into a digital format.
[0011] Furthermore, an objective of the present invention is to provide the descriptive answer sheet evaluation system based on artificial intelligence/machine learning (AI/ML) that is capable of removing unnecessary characters.
[0012] Furthermore, an objective of the present invention is to provide the descriptive answer sheet evaluation system based on artificial intelligence/machine learning (AI/ML) that is capable of assigning scores to the evaluated answers based on predefined rubrics.
SUMMARY
[0014] According to an aspect, a descriptive answer sheet evaluation system based on artificial intelligence/machine learning (AI/ML) is disclosed. Further, the system comprises a memory. Further, at least one processor may be communicatively coupled with the memory. Further, the at least one processor is configured to receive a first input from a user through a user interface installed within a computing unit, wherein the first input comprises login credentials. Further, the at least one processor is configured to analyse and compare the first input to a user database to determine a teacher profile or a student profile, and to receive a second input from the user via the user interface. Further, the second input comprises subject details, test questions and relevant answer keys, in case the user profile corresponds to a teacher. Further, the at least one processor is configured to receive a third input from the user via the user interface. Further, the third input corresponds to an attempted test, along with locking of answers. Further, the at least one processor is configured to convert and upload scanned images of the attempted answers into digital text using computer vision, wherein the computer vision may correspond to a MathPix API. Further, the at least one processor is configured to analyse the digital text using a trained machine learning/artificial intelligence (AI/ML) model to evaluate the attempted answers based on the answer keys using predefined rubrics, and to assign scores based on the evaluated answers to generate a report and feedback for students.
[0015] According to an aspect, a method for evaluating a descriptive answer sheet based on an AI/ML model is disclosed. Further, the method comprises receiving, via at least one processor, a first input from a user through a user interface installed within a computing unit. Further, the method comprises analysing and comparing, via the at least one processor, the first input to a user database to determine a teacher profile or a student profile. Further, the method comprises receiving, via the at least one processor, a second input from the user via the user interface. Further, the second input comprises subject details, test questions and relevant answer keys, in case the user profile corresponds to a teacher. Further, the method comprises receiving, via the at least one processor, a third input from the user via the user interface. Further, the third input corresponds to an attempted test, along with locking of answers. Further, the method comprises converting and uploading, via the at least one processor, scanned images of the attempted answers into digital text using computer vision. Further, the method comprises analysing, via the at least one processor, the digital text using a trained machine learning/artificial intelligence (AI/ML) model to evaluate the attempted answers based on the answer keys using predefined rubrics. Further, the method comprises assigning, via the at least one processor, scores based on the evaluated answers to generate a report and feedback for students.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The accompanying drawings illustrate various embodiments of systems, methods, and embodiments of various other aspects of the invention. Any person with ordinary skill in the art will appreciate that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one example of the boundaries. It may be that in some examples one element may be designed as multiple elements or that multiple elements may be designed as one element. In some examples, an element shown as an internal component of one element may be implemented as an external component in another, and vice versa. Furthermore, elements may not be drawn to scale. Non-limiting and non-exhaustive descriptions are described with reference to the following drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating principles.
[0017] FIG. 1 illustrates a block diagram of a descriptive answer sheet evaluation system based on AI/ML, according to an embodiment of the present invention;
[0018] FIG. 2 illustrates a flowchart of the system, according to an embodiment of the present invention;
[0019] FIG. 3 illustrates a detailed flowchart of the system, according to an embodiment of the present invention;
[0020] FIG. 4 illustrates an architecture diagram of the system, according to an embodiment of the present invention; and
[0021] FIG. 5 illustrates a flowchart of a method for operating the descriptive answer sheet evaluation system based on AI/ML, according to an embodiment of the present invention.
DETAILED DESCRIPTION
[0023] Some embodiments of this invention, illustrating all its features, will now be discussed in detail. The words “comprising,” “having,” “containing,” and “including,” and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[0024] Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the preferred systems and methods are now described. Embodiments of the present invention will be described more fully hereinafter with reference to the accompanying drawings, in which like numerals represent like elements throughout the several figures, and in which example embodiments are shown. Embodiments of the claims may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. The examples set forth herein are non-limiting examples and are merely examples among other possible examples.
[0025] The present invention discloses a descriptive answer sheet evaluation system based on AI/ML that is capable of evaluating descriptive answers against answer keys by using an AI/ML model for assigning scores and generating feedback based on the evaluation.
[0026] FIG. 1 illustrates a block diagram of a descriptive answer sheet evaluation system (100) based on AI/ML, according to an embodiment of the present invention.
[0027] In some embodiments, the system (100) comprises a computing unit (102) installed with a user interface (104). Further, the system (100) comprises at least one processor (106), a memory (108), an AI/ML model (110), input/output circuitry (112), communication circuitry (114), and a computing device (116).
[0028] In one embodiment, a user interface (104) is installed within a computing unit (102). The computing unit (102) may include, but is not limited to, a mobile phone, a tablet, or the like. The computing unit (102) may be accessed by a user to perform one or more operations. Further, the one or more operations may comprise at least one of providing a medium to input data/information, communicating with one or more other external devices, displaying images, and providing various outputs.
[0029] Further, the system (100) comprises the user interface (104). Further, the user interface (104) may be installed within the computing unit (102). Further, the user interface (104) may correspond to at least one of a graphical user interface (GUI), a command-line interface (CLI), or an application programming interface (API). In at least one example, the user interface (104) may comprise a homepage, one or more redirection pages, and one or more tabs linked with the homepage.
[0030] Further, the user interface (104) may comprise a homepage that is accessed by the user to enter login credentials. Further, the user interface (104) may facilitate the creation or registration of accounts, login processes, and updates of personal information. Further, the at least one processor (106) is configured to fetch and compare the login credentials with a plurality of user profiles stored in a database to verify the authenticity of the user.
[0031] In one embodiment, the at least one processor (106) may be communicatively coupled to the memory (108). The at least one processor (106) may include suitable logic, input/output circuitry, and communication circuitry (114) that are operable to execute one or more instructions stored in the memory (108) to perform predetermined operations. In one embodiment, the at least one processor (106) may be configured to decode and execute any instructions received from one or more other electronic devices or server(s). The at least one processor (106) may be configured to execute one or more computer-readable program instructions, such as program instructions to carry out any of the functions described in this description. Further, the at least one processor (106) may be implemented using one or more processor technologies known in the art. Examples of the at least one processor (106) include, but are not limited to, one or more general purpose processors and/or one or more special purpose processors.
[0032] In one embodiment, the memory (108) may be configured to store a set of instructions and data executed by the at least one processor (106). Further, the memory (108) may include the one or more instructions that are executable by the at least one processor (106) to perform specific operations.
[0033] In some embodiments, the database is a structured collection of organized data stored electronically that is accessed and managed through a database management system (DBMS). Further, the at least one processor (106) is configured to compare the login credentials with the plurality of user profiles pre-fed in the database to ensure the accuracy and legitimacy of the login credentials provided by the user, thereby enhancing security and mitigating the risk of fraudulent activities.
[0034] In some embodiments, the at least one processor (106) may be configured to analyse and compare the input from the user to the database to determine whether the user corresponds to a teacher profile or a student profile. In an example embodiment, the teacher profile may comprise professional credentials, teaching experience, subject expertise, and possibly references to instructional responsibilities or educational leadership roles. In an example embodiment, the student profile may comprise details such as academic level, enrolled courses, grades, and perhaps participation in extracurricular activities. Further, the at least one processor (106) may be configured to compare the data stored in the user database to identify patterns and characteristics for distinguishing the teacher profile from the student profile.
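By way of a non-limiting illustration, the following Python sketch shows one way the profile-determination step could be realized. The record structure, the in-memory stand-in for the user database, and the function names are assumptions introduced only for illustration and are not part of the disclosure.

```python
# Illustrative sketch only: a role field on each stored profile decides
# whether the logged-in user is treated as a teacher or a student.
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserRecord:
    username: str
    password_hash: str   # credentials are stored hashed, never in plaintext
    role: str            # "teacher" or "student"

# Hypothetical in-memory stand-in for the user database.
USER_DB = {
    "t.sharma": UserRecord("t.sharma", "<hash>", "teacher"),
    "s.gupta": UserRecord("s.gupta", "<hash>", "student"),
}

def determine_profile(username: str) -> Optional[str]:
    """Compare the first input against stored profiles; return the role."""
    record = USER_DB.get(username)
    return record.role if record else None

print(determine_profile("t.sharma"))  # -> "teacher"
```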
[0035] In some embodiments, the at least one processor (106) may be configured to receive a second input from the user via the user interface (104). Further, the second input may comprise subject details, test questions and relevant answer keys, in case the user profile corresponds to a teacher. Further, the second input may comprise subject outlines, detailed test questions, and answer keys, which confirms that the user is the teacher, as these elements are typically prepared by educators for instructional and assessment purposes. Further, teachers are responsible for designing course content, creating assessments, and providing correct answers for grading purposes.
[0036] In some embodiments, the at least one processor (106) may be configured to receive a third input from the user via the user interface (104). Further, the third input is provided by the user in response to the second input. Further, the user that provides the third input may correspond to the student. Further, the third input may comprise the attempted test, along with locking of the attempted answers. Further, the student interacts with tests by answering questions and locking in their responses for evaluation.
[0037] Further, the at least one processor (106) may be configured to fetch the third input received from the user via the user interface (104). Further, the at least one processor (106) may be configured to convert and upload scanned images of the attempted answers into digital text using computer vision. Further, the process involves capturing high-resolution images of submitted answer sheets and employing optical character recognition (OCR) algorithms to accurately translate these images into editable and searchable digital text.
[0038] In some embodiments, the at least one processor (106) may be configured to analyse the digital text using a trained AI/ML model (110) to evaluate the attempted answers based on the answer keys using predefined rubrics. Further, the at least one processor (106) may utilize artificial intelligence and machine learning algorithms to interpret and assess the digitized responses. Further, the at least one processor (106) may be configured to compare the digitized responses with established answer keys and grading criteria. Further, the AI/ML model (110) may be trained on large datasets of previous assessments to accurately understand context, relevance, and correctness, ensuring a fair and consistent evaluation. Further, by utilizing the predefined rubrics, the at least one processor (106) may assign a score to each response, providing detailed feedback on various aspects. Further, the detailed feedback may be stored in the database.
[0039] It may be noted that the input/output circuitry (112) may act as a medium to transmit input from the communication device to and from the system (100). In some embodiments, the input/output circuitry (112) may refer to the hardware and software components that facilitate the exchange of information between the user interface (104) and the system (100). The input/output circuitry (112) may include various input devices, such as keyboards, barcode scanners, and a GUI, for the user to provide data, and various output devices, such as displays and printers, for the user to receive data.
[0040] For example, the communication device may include N number of user devices. In some embodiments, the communication device may include a graphical user interface (GUI) as input circuitry to allow the user to input or receive data. In some embodiments, the communication device may comprise at least one of one or more mobile phones, laptops, or the like.
[0041] In one embodiment, the communication circuitry (114) may allow the system (100) and the communication device to exchange data or information with other systems or apparatuses. Further, the system (100) may be communicatively coupled with a network interface via one or more protocols and software modules for sending and receiving data or information. For example, the communication circuitry (114) may include Ethernet ports, Wi-Fi adapters, or communication protocols for connecting the system (100) with the computing device (116).
[0042] FIG. 2 illustrates a flowchart (200) of the system, in accordance with an embodiment of the present invention. FIG. 3 illustrates a detailed flowchart (300) of the system, in accordance with an embodiment of the present invention. FIGS. 2-3 are described in conjunction with FIG. 1.
[0043] At operation 202, the at least one processor (106) may receive the first input from a user through the user interface (104) installed within the computing unit (102). Further, the first input may comprise the login credentials of the user. Further, the user may access the user interface (104). Further, the user interface (104) may facilitate the user to enter the login credentials. Further, the login credentials may comprise a username and password.
[0044] At operation 204, the at least one processor (106) may analyse and compare the first input to the user database to determine the teacher profile or the student profile. Further, the at least one processor (106) analyses and compares the first input with stored data in the user database. Further, the at least one processor (106) may analyse and compare associated roles, permissions, previous interactions, and specific identifiers linked to each profile. Further, the at least one processor (106) may accurately classify the user. In an example embodiment, if the login credentials match records with administrative rights, curriculum development roles, or instructional duties, the at least one processor (106) identifies the user as the teacher. Further, if the login credentials correspond to records indicating enrollment in courses, submission of assignments, or participation in assessments, the at least one processor (106) may identify the user as a student.
[0045] At operation 206, the at least one processor (106) may receive the second input from the user via the user interface (104). Further, the second input comprises subject details, test questions and relevant answer keys, in case the user profile corresponds to a teacher. Further, upon determining that the user is a teacher, the at least one processor (106) may enable the user interface (104) to accept detailed educational content inputs. Further, the second input comprises the subject specifics, comprehensive test questions designed to assess student knowledge, and the corresponding answer keys that may be used for evaluation. Further, the teacher may efficiently upload and manage assessment materials via the user interface (104), facilitating organized and structured test administration. Further, the user may upload the subject details, as illustrated in step 302 of FIG. 3. Further, at operation 208, the user may upload a plurality of questions for the test. Further, the plurality of questions is relevant to the uploaded subject details. Further, at operation 210, the user may upload the relevant answer keys corresponding to the questions uploaded by the user. Further, the at least one processor (106) secures the second input data, which is readily accessible for subsequent analysis, grading, and feedback processes, streamlining the overall workflow for educators. Further, at operation 212, the at least one processor may be configured to halt processing until receiving a student response.
[0046] At operation 214, the user interface (104) may facilitate the user to open the assigned test, in case the user corresponds to the student. At operation 216, the at least one processor (106) may receive the third input from the user via the user interface (104). Further, the third input corresponds to the attempted test, along with locking of answers, in case the user profile corresponds to the student. Upon determining the user as the student, the at least one processor (106) may be configured to begin the test. Further, the at least one processor (106) may be configured to record the responses to each question in real-time through the user interface (104). At operation 218, upon completion of the test, the student locks their answers, signalling that they have finished the test. Further, the at least one processor (106) may securely capture and store the final responses, ensuring that no further changes are made.
[0047] At operation 220, the at least one processor (106) may utilize advanced scanning technology. Further, the advanced scanning technology may be configured to capture high-resolution images of the handwritten or printed answer sheets. Further, optical character recognition (OCR) may be employed to convert the high-resolution images into digital text. Further, the digitized content may be uploaded to the database via the at least one processor (106), ensuring that the students' responses are accurately preserved and readily accessible for analysis and grading.
[0048] At operation 222, the at least one processor (106) may utilize a MathPix API (304), as illustrated in FIG. 3, to convert handwritten text, equations, or formulas into a digital format. Further, the conversion process may ensure accuracy in capturing complex mathematical notation, allowing for seamless integration into educational platforms, assessment tools, or digital repositories.
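As a non-limiting illustration, a hedged sketch of this conversion call is given below. The endpoint, header names, and response field follow MathPix's publicly documented v3 text-recognition API, but should be verified against current MathPix documentation; the credentials and file path are placeholders.

```python
# Illustrative sketch of operation 222: OCR of a scanned answer sheet via
# a MathPix-style HTTP API. Verify endpoint/fields against current docs.
import json
import requests

def scan_to_text(image_path: str, app_id: str, app_key: str) -> str:
    """Convert a scanned answer-sheet image into digital text."""
    with open(image_path, "rb") as f:
        response = requests.post(
            "https://api.mathpix.com/v3/text",
            headers={"app_id": app_id, "app_key": app_key},
            files={"file": f},
            data={"options_json": json.dumps(
                {"math_inline_delimiters": ["$", "$"]})},
        )
    response.raise_for_status()
    # The recognized text (including LaTeX for any equations) is carried
    # in the "text" field of the JSON response.
    return response.json().get("text", "")
```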
[0049] At operation 224, the at least one processor (106) is configured to pre-process the digital text to eliminate unnecessary characters. Further, the process involves cleaning the converted text by removing extraneous symbols, correcting typographical errors, and ensuring consistent formatting. Further, a text normalization process (308), as illustrated in FIG. 3, includes converting all text to a uniform case, expanding contractions, and standardizing punctuation and spacing. Further, the text standardization process (310), as illustrated in FIG. 3, may also involve aligning terminology and abbreviations with predefined norms to ensure coherence and uniformity across the document. Further, the step of pre-processing may enhance the accuracy and readability of the digital text, making it more suitable for subsequent analysis, evaluation, and integration into educational databases.
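The following minimal sketch illustrates the kind of cleaning, normalization, and standardization described for operation 224. The contraction table and the character whitelist are illustrative assumptions; a production system would tune these to the OCR output it actually receives.

```python
# Illustrative pre-processing sketch: uniform case, expanded contractions,
# stray symbols removed, whitespace standardized (cf. 308-310 in FIG. 3).
import re

CONTRACTIONS = {"can't": "cannot", "won't": "will not", "it's": "it is"}

def preprocess(text: str) -> str:
    text = text.lower()                               # uniform case
    for short, full in CONTRACTIONS.items():          # expand contractions
        text = text.replace(short, full)
    # Keep word characters, whitespace, basic punctuation, and common
    # math symbols (so LaTeX from the OCR step survives); drop the rest.
    text = re.sub(r"[^\w\s.,;:?!$\\^{}()+=/*-]", "", text)
    return re.sub(r"\s+", " ", text).strip()          # standardize spacing

print(preprocess("It's  F = ma!!  ###"))  # -> "it is f = ma!!"
```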
[0050] At operation 226, the at least one processor (106) may analyse the digital text using a trained machine learning/artificial intelligence (AI/ML) model (110) to evaluate the attempted answers based on the answer keys using predefined rubrics. Further, the AI/ML model (110) may correspond to a large language model (LLM) (312), as illustrated in FIG. 3. Further, the LLM (312) is trained on vast amounts of data and may accurately interpret and assess the student responses by comparing them against the answer keys and applying the predefined grading criteria. Further, this allows precise and consistent evaluation, providing objective scoring and detailed feedback, thus enhancing the reliability and efficiency of the grading process.
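A hedged sketch of this LLM-based grading step is given below. The `call_llm` parameter stands in for whichever LLM API the deployment uses; the prompt wording and the JSON response schema are assumptions made for illustration, not the disclosed prompt.

```python
# Illustrative sketch of operation 226: ask an LLM to grade a digitized
# answer against the teacher's answer key under a predefined rubric.
import json
from typing import Callable

def evaluate_answer(question: str, answer_key: str, student_answer: str,
                    rubric: str, call_llm: Callable[[str], str]) -> dict:
    prompt = (
        "You are an examiner. Grade the student answer against the key.\n"
        f"Question: {question}\n"
        f"Answer key: {answer_key}\n"
        f"Rubric: {rubric}\n"
        f"Student answer: {student_answer}\n"
        'Reply with JSON only: {"score": <0-10>, "feedback": "<text>"}'
    )
    # The model's JSON reply carries the score and per-answer feedback.
    return json.loads(call_llm(prompt))
```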
[0051] At operation 228, the at least one processor (106) may assign scores based on the evaluated answers. At operation 230, the at least one processor (106) may be configured to generate a report and feedback for students. Further, the at least one processor (106) may be configured to assign scores based on the evaluated answers to generate reports and feedback for students by performing step marking (314), as illustrated in FIG. 3. Further, the step marking (314) involves breaking down each question into specific components or steps, assigning partial credit for partially correct answers, and providing detailed insights into each step's correctness. Further, the method (200) ensures a comprehensive evaluation of the students' understanding and approach to the problem. Further, the at least one processor (106) may generate reports of detailed marks distribution that highlight strengths and areas for improvement, offering personalized feedback to help students learn and progress. Further, the reports of detailed marks distribution may be viewed (316) and downloaded (318) by the student, as illustrated in FIG. 3.
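A minimal sketch of the step-marking idea follows. Here each question carries weighted rubric steps and partial credit accrues per satisfied step; the naive keyword predicate is a stand-in assumption, since in the described system the LLM judges each step.

```python
# Illustrative step marking (314): award partial credit per rubric step.
def step_mark(answer: str, steps: list[tuple[str, float]]) -> float:
    """steps: (required_phrase, marks) pairs; returns marks awarded."""
    def step_satisfied(ans: str, phrase: str) -> bool:
        return phrase.lower() in ans.lower()   # naive stand-in check
    return sum(marks for phrase, marks in steps
               if step_satisfied(answer, phrase))

# Example: 2 marks for stating the formula, 3 for the final value.
total = step_mark("Using F = ma, the force is 20 N",
                  [("f = ma", 2.0), ("20 n", 3.0)])
print(total)  # 5.0
```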
[0052] FIG. 4 illustrates an architecture diagram (400) of the system (100), according to an embodiment of the present invention.
[0053] In some embodiments, the system (100) may be connected to one or more other subsystems. Further, the one or more other subsystems may comprise the user interface (104), a backend server (402), external services (406) and a database (408). Further, the user interface (104) may comprise a teacher tablet interface and a student tablet interface. Further, the backend server (402) may comprise an authentication module, a subject and test management module, an answer sheet upload module, a text conversion module, a text processing module, an evaluation module, and a scoring module. Further, the external services (406) may comprise a MathPix API (304) and a Large Language Model (LLM) (312) API. Further, the database (408) may comprise a user database, an answer sheet database, and an evaluation results database.
[0054] In some embodiments, the user interface (104) is installed within the computing unit (102). Further, the user interface (104) may comprise a teacher tablet interface and a student tablet interface. Further, the teacher tablet interface may be accessed by the teacher and the student tablet interface may be accessed by the student. Further, both user interfaces (104) may facilitate the users to enter login credentials. Further, the at least one processor (106) analyses and compares the login credentials by using an authentication module. Further, the at least one processor (106) receives and verifies the provided login credentials against stored data pre-fed in the user database. The authentication module employs encryption and other security measures to protect sensitive information such as usernames and passwords. Further, the at least one processor (106) may be configured to check the validity of the login credentials, ensuring that only authorized users gain access to the user interface (104).
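The sketch below illustrates one conventional way such an authentication module could protect credentials: salted password hashing with a constant-time comparison. It uses only the Python standard library and is an illustrative assumption; the disclosure does not specify the exact scheme, and production systems would typically prefer a dedicated KDF such as bcrypt or argon2.

```python
# Illustrative authentication sketch: verify credentials against salted
# PBKDF2 hashes so plaintext passwords are never stored.
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
    salt = salt or os.urandom(16)                      # fresh salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_login(password: str, salt: bytes, stored_digest: bytes) -> bool:
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)  # constant-time compare

salt, stored = hash_password("s3cret")
print(verify_login("s3cret", salt, stored))   # True
print(verify_login("wrong", salt, stored))    # False
```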
[0055] In some embodiments, upon successful login of the user, the at least one processor (106) may be configured to receive the second input, which may comprise subject details, test questions and answer keys, in case the user corresponds to a teacher. Further, this input may be managed by a subject and test management module. Further, the subject and test management module may facilitate organized storage, retrieval, and modification of the input, ensuring efficient preparation and administration of educational assessments.
[0056] In some embodiments, in case the user corresponds to the student, the at least one processor (106) may be configured to receive the third input, which comprises answering of questions as well as locking of the answered questions. Further, the at least one processor (106) may be configured to upload answer sheets using an answer sheet uploading module. Further, the answer sheet uploading module may facilitate the efficient transfer of scanned or digital answer sheets into the system (100) for evaluation and processing. The at least one processor (106) ensures that uploaded answer sheets are securely stored in an answer sheets database.
[0057] In some embodiments, the at least one processor (106) may be configured to convert the uploaded written texts of the answer sheet to digital text by using a text conversion module. Further, the text conversion module may correspond to the MathPix API (304). Further, the text conversion module employs optical character recognition (OCR) technology to accurately scan and interpret the content from scanned images or digital files of answer sheets. Further, the at least one processor (106) facilitates easier storage, retrieval, and analysis of student responses by converting the written texts into an editable and searchable digital format.
[0058] Further, the at least one processor (106) may be configured to pre-process the digital text by using a text pre-processing module. Further, the text pre-processing module is designed to refine and enhance the quality of the digitized text before further analysis or evaluation. Further, the text pre-processing involves various steps such as removing unnecessary characters, correcting typographical errors, standardizing formatting, and normalizing text to ensure consistency and accuracy.
[0059] Further, the at least one processor (106) may be configured to analyse the digital text using an evaluation model. Further, the evaluation model may comprise the AI/ML model (110), which corresponds to the LLM (312), to evaluate the attempted answers based on the answer keys. Further, the at least one processor (106) may utilize the capabilities of the AI/ML model (110) to interpret and compare the digital student responses against the correct answers provided in the answer keys. Further, by applying advanced natural language processing techniques and semantic understanding, the evaluation model (110) may accurately score and provide feedback on the correctness, completeness, and quality of the answers. Further, the feedback and scores are stored in an evaluation results database.
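One conventional way to realize the semantic comparison mentioned here is to embed both the key answer and the student answer and score their similarity. The sketch below shows that idea with a placeholder `embed` function standing in for any sentence-embedding model; the linear marks scaling is an assumption for illustration.

```python
# Illustrative semantic scoring: cosine similarity between embeddings of
# the answer key and the student answer, scaled to the question's marks.
import math
from typing import Callable

def cosine(u: list[float], v: list[float]) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def semantic_score(key: str, answer: str,
                   embed: Callable[[str], list[float]],
                   full_marks: float) -> float:
    """Scale marks by embedding similarity between key and answer."""
    sim = cosine(embed(key), embed(answer))
    return round(max(0.0, sim) * full_marks, 2)
```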
[0060] FIG. 5 illustrates a flowchart of a method for operating the descriptive answer sheet evaluation system (100) based on AI/ML, according to an embodiment of the present invention.
[0061] At operation 502, the at least one processor (106) may receive a first input from a user through a user interface (104) installed within the computing unit (102). Further, the first input may comprise the login credentials of the user. Further, the user interface (104) may facilitate the user to enter the login credentials. Further, the login credentials may comprise a username and password.
[0062] At operation 504, the at least one processor (106) may analyse and compare the first input to a user database to determine a teacher profile or a student profile. Further, the at least one processor (106) analyses and compares the first input with stored data in the user database. Further, the at least one processor (106) may analyse and compare associated roles, permissions, previous interactions, and specific identifiers linked to each profile. Further, the at least one processor (106) may accurately classify the user.
[0063] At operation 506, the at least one processor (106) may receive a second input from the user via the user interface (104), wherein the second input comprises subject details, test questions and relevant answer keys, in case the user profile corresponds to the teacher. Further, the teacher may efficiently upload and manage assessment materials via the user interface (104), facilitating organized and structured test administration. Further, the at least one processor (106) secures the second input data, which is readily accessible for subsequent analysis, grading, and feedback processes, streamlining the overall workflow for educators.
[0064] At operation 508, the at least one processor (106) may receive a third input from the user via the user interface (104), wherein the third input corresponds to the attempted test, along with locking of answers, in case the user profile corresponds to the student. Further, the at least one processor (106) may be configured to record the responses to each question in real-time through the user interface (104).
[0065] At operation 510, the at least one processor (106) may convert and upload scanned images of the attempted answers into digital text using computer vision. Further, the computer vision may correspond to a MathPix API (304). Further, the advanced scanning technology may be configured to capture high-resolution images of the handwritten or printed answer sheets. Further, optical character recognition (OCR) may be employed to convert the high-resolution images into digital text. Further, the digitized content may be uploaded to the database via the at least one processor (106), ensuring that the students' responses are accurately preserved.
[0066] At operation 512, the at least one processor (106) may analyse the digital text using a trained machine learning/artificial intelligence (AI/ML) model (110) to evaluate the attempted answers based on the answer keys using predefined rubrics. Further, the LLM (312) is trained on vast amounts of data and may accurately interpret and assess the student responses by comparing them against the answer keys and applying the predefined grading criteria.
[0067] At operation 514, the at least one processor (106) may assign scores based on the evaluated answers to generate a report and feedback for students. Further, the at least one processor (106) may be configured to assign scores based on the evaluated answers to generate reports and feedback for students by performing step marking (314). Further, the at least one processor (106) may generate detailed reports that highlight strengths and areas for improvement, offering personalized feedback to help students learn and progress.
[0068] The descriptive answer sheet evaluation system (100) based on AI/ML has thus been described. The descriptive answer sheet evaluation system (100) based on AI/ML could in any case undergo numerous modifications and variants, all of which are covered by the same innovative concept; moreover, all of the details can be replaced by technically equivalent elements. In practice, the components used, as well as the numbers, shapes, and sizes of the components, can be varied according to the technical requirements. The scope of protection of the invention is therefore defined by the attached claims.
Dated this 5th Day of August, 2024
Ishita Rustagi (IN-PA/4097)
Agent for Applicant

Claims: CLAIMS
We Claim:
1. A descriptive answer sheet evaluation system (100) based on artificial intelligence/machine learning (AI/ML), the system (100) comprising:
a memory (108);
at least one processor (106) communicatively coupled with the memory (108), wherein the at least one processor (106) is configured to:
receive a first input from a user through a user interface (104) installed within a computing unit (102),
analyse and compare the first input to a user database to determine a teacher profile or a student profile,
receive a second input from the user via the user interface (104), wherein the second input comprises subject details, test questions and relevant answer keys, in case the user profile corresponds to a teacher,
receive a third input from the user via the user interface (104), wherein the third input corresponds to an attempted test, along with locking of answers,
convert and upload scanned images of the attempted answers into digital text using computer vision,
analyse the digital text using a trained machine learning/artificial intelligence (AI/ML) model (110) to evaluate the attempted answers based on the answer keys using predefined rubrics, and
assign scores based on the evaluated answers to generate a report and feedback for students.
2. The system (100) as claimed in claim 1, wherein the first input comprises login credentials.
3. The system (100) as claimed in claim 1, wherein the at least one processor (106) is configured to pre-process the digital text to eliminate unnecessary characters.
4. The system (100) as claimed in claim 1, wherein the at least one processor (106) is configured to standardize and normalize the digital text to break down data into individual tokens.
5. The system (100) as claimed in claim 1, wherein the computer vision corresponds to a MathPix API.
6. The system (100) as claimed in claim 1, wherein the memory (108) further comprises a plurality of security protocols that are configured to secure the input data entered by the user.
7. The system (100) as claimed in claim 1, wherein the at least one processor (106) may be configured to store feedback and reports of the user in the database that is further accessed by the user.
8. A method (200) comprising:
receiving, via at least one processor (106), a first input from a user through a user interface (104) installed within a computing unit (102);
analysing and comparing, via the at least one processor (106), the first input to a user database to determine a teacher profile or a student profile;
receiving, via the at least one processor (106), a second input from the user via the user interface (104), wherein the second input comprises subject details, test questions and relevant answer keys, in case the user profile corresponds to the teacher;
receiving, via the at least one processor (106), a third input from the user via the user interface (104), wherein the third input corresponds to an attempted test, along with locking of answers, in case the user profile corresponds to the student;
converting and uploading, via the at least one processor (106), scanned images of the attempted answers into digital text using computer vision;
analysing, via the at least one processor (106), the digital text using a trained machine learning/artificial intelligence (AI/ML) model (110) to evaluate the attempted answers based on the answer keys using predefined rubrics; and
assigning, via the at least one processor (106), scores based on the evaluated answers to generate a report and feedback for students.
Dated this 5th Day of August, 2024
Ishita Rustagi (IN-PA/4097)
Agent for Applicant