
System And Method For Detecting Fraudulent Behavior During Examination

Abstract: A system (100) for detecting fraudulent behavior during an examination is disclosed. The system (100) includes at least one video capturing device (102) to capture real-time video data of examinees during an exam session, and a processing unit (106). The system (100) is configured to analyze the captured video data using a computer vision model (108) to detect behavioral indicators of potential fraud. The system (100) generates one or more attention heatmaps indicating areas of visual focus or activity for each examinee. The system (100) collects exam response data and conducts statistical analysis to identify anomalies, including answer pattern irregularities, statistical outliers, time-based anomalies, and unexpected similarities. A fraud risk score is computed for each examinee by correlating the behavioral indicators and analyzed exam response data. The system (100) triggers a real-time alert indicating suspected fraudulent activity based on the computed fraud risk score. Claims: 10, Figures: 3


Patent Information

Application #
Filing Date
19 May 2025
Publication Number
23/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

SR University
SR University, Ananthasagar, Warangal Telangana India 506371 patent@sru.edu.in 08702818333

Inventors

1. Mrs. E. Divya
Ph. D Scholar, SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371
2. Dr. V. Shobha Rani
Assistant Professor (CS&AI), SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371

Specification

Description: BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a fraud detection system and particularly to a system and method for detecting fraudulent behavior during an examination.
Description of Related Art
[002] Examination fraud is a persistent problem in academic and professional testing environments, compromising the fairness and credibility of assessment processes. Conventional approaches to monitoring exams typically include human invigilation, surveillance cameras, and software-based proctoring tools. While these methods provide some level of oversight, they often fail to detect sophisticated forms of cheating, such as subtle behavioral cues, use of hidden communication devices, or collusion among examinees.
[003] Existing solutions are further limited in their ability to process and analyze behavior in real time or provide clear, interpretable indicators of potential misconduct. Many systems are either overly invasive or generate excessive false positives, leading to a poor testing experience and inefficient fraud detection. Furthermore, traditional techniques rarely correlate behavioral data with response pattern anomalies, which reduces their effectiveness in identifying coordinated or intelligent cheating strategies.
[004] There is thus a need for an improved and advanced fraud detection system that can address the aforementioned limitations in a more efficient, more accurate, non-intrusive, and data-driven manner.
SUMMARY
[005] Embodiments in accordance with the present invention provide a system and method for detecting fraudulent behavior during an examination. The system comprises a video capturing device adapted to capture real-time video data of examinees during an exam session and a processing unit connected to the video capturing device. The processing unit is configured to analyze the captured video data using a computer vision model to detect behavioral indicators of potential fraud. The behavioral indicators may comprise an abnormal gaze direction, an unusual head orientation, a frequent or erratic body movement, an interaction with unauthorized objects, and a combination thereof. The processing unit is further configured to generate, using a machine learning-based attention mechanism, one or more attention heatmaps that indicate areas of visual focus or activity associated with each examinee during the examination. The system also collects exam response data from the examinees and conducts statistical analysis on the collected data. This analysis includes detecting anomalies in answer patterns, statistical outliers, time-based irregularities, unexpected similarities between responses, and so forth. By correlating the behavioral indicators with the statistical patterns in the exam response data, the processing unit computes a fraud risk score for each examinee. This score is then compared to a threshold score. When the computed fraud risk score exceeds the threshold, the system triggers a real-time alert for potential fraud detection.
[006] Embodiments in accordance with the present invention further provide a method for detecting fraudulent behavior during an examination. The method comprises the steps of capturing real-time video data of examinees using a video capturing device, analyzing the video data using a computer vision model to detect behavioral indicators of potential fraud such as abnormal gaze direction, unusual movement patterns, or interactions with unauthorized objects. The method further includes generating one or more attention heatmaps using a machine learning-based attention mechanism to indicate areas of visual focus or activity associated with the examinees. The method proceeds by collecting exam response data from the examinees and conducting statistical analysis to detect anomalies including irregular answer patterns, statistical outliers, timing anomalies, or similarities between answers of different examinees. A fraud risk score is computed by correlating the behavioral indicators with the analyzed exam response data. The computed fraud risk score is compared to a defined threshold, and if the score exceeds this threshold, a real-time alert is triggered to notify exam administrators of potential fraud.
[007] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a system and method for detecting fraudulent behavior during an examination that enables real-time behavioral monitoring using artificial intelligence-powered visual analysis.
[008] Next, embodiments of the present application may provide a system that integrates visual behavioral data with exam response pattern analysis for improved detection of fraudulent activity.
[009] Next, embodiments of the present application may provide a system that uses attention heatmaps to visually highlight suspicious activity zones during an examination.
[0010] Next, embodiments of the present application may provide a system that reduces false positives by correlating multiple data sources and leveraging advanced machine learning algorithms for better decision-making.
[0011] Next, embodiments of the present application may provide a scalable and non-intrusive solution suitable for both remote and in-person examinations.
[0012] Next, embodiments of the present application may provide an automated tool that can continuously learn and improve its detection accuracy over time through retraining on labeled data.
[0013] These and other advantages will be apparent from the present application of the embodiments described herein.
[0014] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor an exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0016] FIG. 1 depicts a block diagram of a system for detecting fraudulent behavior during an examination, according to an embodiment of the present invention;
[0017] FIG. 2 illustrates components of a processing unit for the system for detecting fraudulent behavior during the examination, according to an embodiment of the present invention; and
[0018] FIG. 3 illustrates a flowchart for a method for detecting fraudulent behavior during the examination, according to an embodiment of the present invention.
[0019] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0020] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0021] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0022] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0023] FIG. 1 depicts a block diagram of a system 100 for detecting fraudulent behavior during an examination, according to an embodiment of the present invention. The system 100 may be adapted to monitor examinees in real-time during an examination session to detect behavioral indicators of potential fraud. The system 100 may further be adapted to analyze captured video data using artificial intelligence-based models and generate attention heatmaps to localize areas of interest or unusual activity. The system 100 may be adapted to correlate behavioral indicators with statistical anomalies in exam response data to compute a fraud risk score and trigger real-time alerts when necessary.
[0024] According to the embodiments of the present invention, the system 100 may incorporate non-limiting hardware components to enhance processing speed and efficiency. The system 100 may comprise a video capturing device 102, a visual interface 104, a processing unit 106, a computer vision model 108, a database 110, computing devices 112, a user device 114, and a communication unit 116. The hardware components may work in coordination to perform visual monitoring, data collection, real-time analysis, and fraud risk scoring. In an embodiment of the present invention, the hardware components of the system 100 may be integrated with computer-executable instructions for overcoming the challenges and limitations of the existing systems. These instructions may be stored in memory modules and executed by processors to facilitate continuous fraud detection and decision-making without human intervention.
[0025] In an embodiment of the present invention, the video capturing device 102 may be adapted to continuously capture high-resolution real-time video data of the examinees. The video capturing device 102 may be adapted to capture the video data of the examinees from various angles during the examination session. In an embodiment of the present invention, the video capturing device 102 may be arranged over and/or in proximity to seats of the examinees. In another embodiment of the present invention, the video capturing device 102 may be coupled with Closed-Circuit Television (CCTV) cameras of an examination center.
[0026] In yet another embodiment of the present invention, the video capturing device 102 may be embedded within monitors, display units or writing tables used by the examinees to ensure unobtrusive monitoring. In a further embodiment of the present invention, the video capturing device 102 may be mounted on drones (not shown) or movable stands (not shown) to provide dynamic and adjustable surveillance coverage during the examination. In another embodiment of the present invention, the video capturing device 102 may include infrared or low-light capabilities to function effectively under varying lighting conditions.
[0027] The video capturing device 102 may include features that may be, but not limited to, an autofocus, a motion detection, infrared support for low-light environments, a frame synchronization, and so forth, to ensure accurate monitoring. Embodiments of the present invention are intended to include or otherwise cover any suitable features of the video capturing device 102, including known, related art, and/or later developed technologies. The captured video data may be transmitted to the processing unit 106 for further analysis.
[0028] In an embodiment of the present invention, the visual interface 104 may be adapted to display the generated attention heatmaps, statistical summaries, and alert notifications. The visual interface 104 may provide exam proctors or administrators with annotated video segments highlighting suspicious behavior along with timestamps. The visual interface 104 may further include user interaction capabilities for reviewing flagged incidents, filtering by risk score, and exporting reports for post-exam analysis.
[0029] In an embodiment of the present invention, the processing unit 106 may be adapted to receive the captured video data from the video capturing device 102. The processing unit 106 may further be configured to execute computer-executable instructions to generate an output relating to the system 100. Upon execution of the computer-executable instructions, the processing unit 106 may be configured to transmit the output relating to the system 100 to the visual interface 104.
[0030] In an embodiment of the present invention, the processing unit 106 may be adapted to process the captured video data and the exam response data in real-time using artificial intelligence algorithms. The processing unit 106 may include programming modules (as shown in FIG. 2) for executing instructions related to the output of the system 100, such as running convolutional neural networks, generating the attention heatmaps, performing statistical analysis, computing fraud risk scores, managing thresholds for alerts, and so forth. Embodiments of the present invention are intended to include or otherwise cover any suitable output of the processing unit 106, including known, related art, and/or later developed technologies.
[0031] The processing unit 106 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 106, including known, related art, and/or later developed technologies. In an embodiment of the present invention, the processing unit 106 may further be explained in conjunction with FIG. 2.
[0032] In an embodiment of the present invention, the processing unit 106 may be located on a cloud server (not shown). In an exemplary embodiment of the present invention, the cloud server may be a public cloud server. In another exemplary embodiment of the present invention, the cloud server may be a private cloud server. In yet another embodiment of the present invention, the cloud server may be a dedicated cloud server. According to embodiments of the present invention, the cloud server may be, but not limited to, a Microsoft Azure cloud server, an Amazon AWS cloud server, a Google Compute Engine (GCE) cloud server, an Amazon Elastic Compute Cloud (EC2) cloud server, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the cloud server, including known, related art, and/or later developed technologies.
[0033] In an embodiment of the present invention, the processing unit 106 may employ the computer vision model 108. In an embodiment of the present invention, the computer vision model 108 may be a convolutional neural network (CNN) that may be trained to detect a head orientation, an eye gaze direction, a hand movement of the examinees, and so forth. The computer vision model 108 may be adapted to analyze video frames and detect behavioral indicators of fraudulent activity.
[0034] The behavioral indicators may comprise an abnormal gaze direction, an unusual head orientation, a frequent or erratic body movement, an interaction with unauthorized objects, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the behavioral indicators, including known, related art, and/or later developed technologies.
[0035] The computer vision model 108 may be trained using labeled datasets comprising examples of normal and suspicious behaviors. The computer vision model 108 may further include a self-attention mechanism that may enable the system 100 to focus on temporally relevant segments of the captured video to generate the attention heatmaps for the examinee.
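The behavioral detection described above may be sketched, in greatly simplified form, as follows. This is an illustrative heuristic standing in for the trained CNN of the computer vision model 108; the function name, frame fields, and angle thresholds are hypothetical, not taken from the specification.

```python
# Illustrative sketch: flagging behavioral indicators from per-frame pose
# estimates. Thresholds, field names, and object labels are hypothetical;
# the specification describes a trained CNN, which this heuristic only
# stands in for.

def flag_behavioral_indicators(frames, gaze_limit_deg=30.0, head_limit_deg=45.0):
    """Return the set of indicator labels triggered across video frames.

    Each frame is a dict with gaze/head angles (degrees from screen
    centre) and a list of detected object labels.
    """
    indicators = set()
    for f in frames:
        if abs(f["gaze_deg"]) > gaze_limit_deg:
            indicators.add("abnormal_gaze_direction")
        if abs(f["head_deg"]) > head_limit_deg:
            indicators.add("unusual_head_orientation")
        if any(o in {"phone", "notes"} for o in f.get("objects", [])):
            indicators.add("unauthorized_object_interaction")
    return indicators

frames = [
    {"gaze_deg": 5.0, "head_deg": 10.0, "objects": []},
    {"gaze_deg": 42.0, "head_deg": 50.0, "objects": ["phone"]},
]
print(sorted(flag_behavioral_indicators(frames)))
# → ['abnormal_gaze_direction', 'unauthorized_object_interaction', 'unusual_head_orientation']
```

In practice the per-frame estimates would come from the trained model rather than fixed angle cut-offs, and the indicator set would feed into the heatmap and risk-scoring stages described below.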
[0036] In an embodiment of the present invention, the database 110 may be adapted to store the generated attention heatmaps for the examinees. The database 110 may store associated metadata, including time-stamped behavioral event logs, computed fraud risk scores, examinee identification data, exam session identifiers, statistical response analyses, raw and processed video data, and alert history. The database 110 may also be configured to store training datasets for machine learning models, configuration files for parameters of the system 100, and user access logs for audit and compliance purposes.
[0037] The database 110 may be, for example but not limited to, a distributed database, a personal database, an end-user database, a commercial database, a Structured Query Language (SQL) database, a non-Structured Query Language (NoSQL) database, an operational database, a relational database, an object-oriented database, a graph database, and so forth. In a preferred embodiment of the present invention, the database 110 may be a cloud database. Embodiments of the present invention are intended to include or otherwise cover any type of the database 110, including known, related art, and/or later developed technologies. Further, the database 110 may be stored in the cloud server, in an embodiment of the present invention, to enable scalable storage, remote access, real-time synchronization across modules, and high-availability failover support.
[0038] The computing devices 112 may be hidden devices that may be arranged over and/or in proximity to the seats of the examinees, in an embodiment of the present invention. In another embodiment of the present invention, the computing devices 112 may be integrated with the monitors, the display units and/or the writing tables of the examinees that the examinees may be using for giving the examination. In an embodiment of the present invention, the exam response data may be collected from the computing devices 112 of each of the examinees in real-time. The computing devices 112 may be synchronized with the system 100 and may be configured to log keystrokes, timestamps, interaction events, and user responses during the examination session. This data may be automatically transmitted to the processing unit 106 for further analysis.
[0039] In another embodiment of the present invention, the exam response data may be collected from physical copies of answer sheets that may be scanned using an optical scanner or similar input device. The scanned documents may then be processed using optical character recognition (OCR) and natural language processing (NLP) techniques to extract relevant answer information. The extracted data may subsequently be normalized, digitized, and stored in the database 110 for the statistical analysis and correlation with behavioral indicators. The system 100 may be configured to handle both digital and physical formats of the exam response data to provide flexibility in deployment across various examination environments.
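The normalization step for OCR-extracted answers may be sketched as follows. This is a minimal assumption-laden example: the confusion table and cleanup rules shown are hypothetical illustrations of the kind of normalization the specification mentions, not its actual OCR/NLP pipeline.

```python
# Illustrative sketch of normalising multiple-choice answers extracted by
# OCR from scanned answer sheets before storage in the database for
# statistical analysis. The cleanup rules (common digit-for-letter OCR
# confusions, whitespace, case, trailing punctuation) are hypothetical.

OCR_FIXES = {"0": "O", "1": "I", "8": "B"}  # digit-for-letter confusions

def normalize_answer(raw):
    """Map a raw OCR token for a multiple-choice answer to a canonical
    single letter, or None if unrecognisable."""
    token = raw.strip().upper().rstrip(".)")
    token = OCR_FIXES.get(token, token)
    return token if token in {"A", "B", "C", "D"} else None

print([normalize_answer(r) for r in [" a) ", "8", "C.", "??"]])
# → ['A', 'B', 'C', None]
```

A real deployment would layer this over an OCR engine's output and extend it with NLP handling for free-text answers, as the paragraph above contemplates.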
[0040] In an embodiment of the present invention, the user device 114 may be configured to receive the generated alert in real time via the communication unit 116. The user device 114 may be, for example, but not limited to, a proctor’s dashboard, an administrative workstation, a mobile device, a tablet, or any other network-connected electronic device capable of rendering alerts. The user device 114 may further be configured to display, log, or act upon the received alert depending on configurations and/or administrative privileges of the system 100. Additionally, the user device 114 may be configured to access and review the associated behavioral heatmaps, the exam response anomalies, and metadata transmitted with the alert to aid in timely decision-making and appropriate intervention.
[0041] In an embodiment of the present invention, the communication unit 116 may be configured to facilitate secure transmission of the generated alerts and supporting data from the system 100 to the user device 114. The communication unit 116 may include network interfaces, secure communication protocols, and message handling mechanisms to ensure data integrity, confidentiality, and timely delivery. The communication unit 116 may further be configured to support multiple alert formats, enabling customizable delivery channels such as pop-up notifications, electronic mail, push messages, audio-visual signals, or vibration-based cues. In a preferred embodiment, the communication unit 116 may be integrated with alert prioritization logic to distinguish between critical, moderate, and low-risk fraud alerts for appropriate routing and escalation.
[0042] FIG. 2 illustrates components of the processing unit 106 for the system 100 for detecting the fraudulent behavior during the examination, according to an embodiment of the present invention. The processing unit 106 may comprise computer-executable instructions in the form of programming modules including a data capturing module 200, a behavioral analysis module 202, an attention heatmap generation module 204, an exam response analysis module 206, a fraud risk computation module 208, an alert triggering module 210, and a user interface control module 212.
[0043] In an embodiment of the present invention, the data capturing module 200 may be configured to receive, process, and synchronize streams of the real-time video data from the video capturing device 102 and the exam response data entered by examinees during the examination session. The data capturing module 200 may also be configured to maintain temporal alignment between behavioral activity and answer inputs to enable accurate cross-referencing during analysis.
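The temporal alignment mentioned above may be sketched as pairing each answer event with the nearest video frame by timestamp. This is an illustrative sketch; the event tuples and frame-timestamp representation are assumptions, not the module's actual data layout.

```python
# Illustrative sketch: aligning answer-submission events to the closest
# video frame by timestamp, as the data capturing module's temporal
# alignment might work. Data shapes are hypothetical.
import bisect

def align_events(frame_ts, events):
    """Pair each (timestamp, answer) event with the index of the closest
    video frame timestamp. frame_ts must be sorted ascending."""
    aligned = []
    for ts, answer in events:
        i = bisect.bisect_left(frame_ts, ts)
        # pick whichever neighbouring frame is closer in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_ts)]
        best = min(candidates, key=lambda j: abs(frame_ts[j] - ts))
        aligned.append((best, answer))
    return aligned

frame_ts = [0.0, 0.5, 1.0, 1.5]        # frame capture times (seconds)
events = [(0.6, "A"), (1.4, "C")]      # answer submissions
print(align_events(frame_ts, events))  # → [(1, 'A'), (3, 'C')]
```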
[0044] Upon capturing of the video data and the exam response data, the data capturing module 200 may be configured to generate an analysis signal and may be configured to transmit the generated analysis signal to the behavioral analysis module 202.
[0045] In an embodiment of the present invention, the behavioral analysis module 202 may be configured to process the captured video data using a computer vision model 108 to detect the behavioral indicators of potential fraud. The behavioral indicators may be, but not limited to, an abnormal gaze direction, an unusual head orientation, a frequent or erratic body movement, an interaction with unauthorized objects, and so forth. Embodiments of the present invention are intended to include or otherwise cover any suitable behavioral indicators, including known, related art, and/or later developed technologies. The computer vision model 108 employed by the behavioral analysis module 202 may include a convolutional neural network trained on annotated behavioral datasets to detect such suspicious activities accurately.
[0046] Upon analyzing the captured video data, the behavioral analysis module 202 may be configured to generate a heatmap generation signal and may be configured to transmit the generated heatmap generation signal to the attention heatmap generation module 204.
[0047] The attention heatmap generation module 204 may be configured to be activated upon receiving the generated heatmap generation signal from the behavioral analysis module 202. In an embodiment of the present invention, the attention heatmap generation module 204 may further be configured to generate the attention heatmaps using a machine learning-based attention mechanism, such as a self-attention neural network. The self-attention neural network may be configured to process temporal video sequences to generate dynamic heatmaps.
[0048] The generated heatmaps may indicate areas of visual focus or physical activity associated with each examinee during the examination. The generated dynamic heatmaps may also aid in identifying regions of high activity or concentration by the examinees, which may suggest potential collaboration, distraction, or engagement with unauthorized materials. These regions may include frequent visual attention towards neighboring examinees, repetitive glances outside the screen, or concentrated interaction with specific desk areas. The annotated heatmaps may be stored in the database 110 and may be made accessible via the user interface control module 212 for review of an examiner. This visual representation may enhance the transparency, interpretability, and auditability of the fraud detection process by providing examiners with contextual behavioral evidence.
[0049] The attention heatmap generation module 204 may further be configured to annotate the heatmaps with time-stamped behavioral events, allowing examiners to easily interpret and correlate suspicious events.
[0050] The attention heatmap generation module 204 may be configured to compile the generated heatmaps and forward the compiled heatmaps to the user interface control module 212 for visualization and/or to the fraud risk computation module 208 for correlation with response data and behavioral analysis results. The attention heatmap generation module 204 may further be configured to store the generated attention heatmaps in the database 110 along with associated time stamps, examinee identification, and related behavioral metadata. This enables historical review and training of future fraud detection models.
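The heatmap data structure can be sketched as a coarse grid accumulating gaze points over time. This is only an illustration of the output format; the specification's self-attention network is far richer, and the grid size and normalised coordinates here are assumptions.

```python
# Illustrative sketch: accumulating normalised gaze points into a coarse
# attention grid, showing the heatmap data structure only. The
# specification's self-attention mechanism over temporal video sequences
# is not reproduced here; grid size and coordinates are hypothetical.

def gaze_heatmap(points, rows=3, cols=3):
    """Count normalised (x, y) gaze points (each in 0..1) per grid cell."""
    grid = [[0] * cols for _ in range(rows)]
    for x, y in points:
        r = min(int(y * rows), rows - 1)
        c = min(int(x * cols), cols - 1)
        grid[r][c] += 1
    return grid

points = [(0.1, 0.1), (0.15, 0.05), (0.9, 0.9), (0.5, 0.5)]
print(gaze_heatmap(points))
# → [[2, 0, 0], [0, 1, 0], [0, 0, 1]]
```

Cells with high counts would correspond to the "regions of high activity" the module flags, e.g. repeated glances toward a neighbouring examinee's position.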
[0051] Upon generation of the heatmaps, the attention heatmap generation module 204 may be configured to generate an exam response analysis signal, and may be configured to transmit the generated exam response analysis signal to the exam response analysis module 206.
[0052] The exam response analysis module 206 may be configured to be activated upon receiving the generated exam response analysis signal from the attention heatmap generation module 204. The exam response analysis module 206 may be further configured to conduct the statistical analysis and/or a pattern-based analysis on the collected exam response data. The statistical analysis of the exam response data may include the stylometric analysis for written answers, answer timing sequence profiling for multiple-choice or interactive questions, and so forth. Embodiments of the present invention are intended to include or otherwise cover any suitable statistical analysis, including known, related art, and/or later developed technologies.
[0053] The exam response analysis module 206 may be configured to detect anomalies such as irregular timing sequences, statistical outliers, patterns suggestive of collusion, stylometric inconsistencies in written responses, or unexpected similarities in multiple-choice answers. The exam response analysis module 206 may be configured to use rule-based analysis as well as machine learning techniques to identify irregularities.
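One of the anomaly tests above, unexpected similarity between examinees' answers, can be sketched with a shared-wrong-answer fraction, a common collusion signal. The function, data layout, and any threshold applied to its output are hypothetical illustrations, not the module's actual analysis.

```python
# Illustrative sketch of one similarity anomaly test: of the questions two
# examinees both answered incorrectly, what fraction did they answer with
# the same wrong option? Data layout is hypothetical.

def shared_wrong_fraction(ans_a, ans_b, key):
    """Return the fraction of commonly-wrong questions where both
    examinees chose the identical wrong option."""
    both_wrong = [q for q in key if ans_a[q] != key[q] and ans_b[q] != key[q]]
    if not both_wrong:
        return 0.0
    same = sum(1 for q in both_wrong if ans_a[q] == ans_b[q])
    return same / len(both_wrong)

key = {1: "A", 2: "B", 3: "C", 4: "D"}   # answer key
ex1 = {1: "B", 2: "B", 3: "D", 4: "A"}   # examinee 1's answers
ex2 = {1: "B", 2: "B", 3: "D", 4: "C"}   # examinee 2's answers
print(shared_wrong_fraction(ex1, ex2, key))  # → 0.6666666666666666
```

A high value over many questions is statistically unlikely for independent examinees and would feed the fraud risk computation as one anomaly signal among several.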
[0054] Upon completing the statistical analysis of the exam response data, the exam response analysis module 206 may be configured to transmit a response analysis signal to the fraud risk computation module 208. This signal may include structured data representing detected anomalies, response timing, similarity indices, and classification probabilities based on learned models.
[0055] The fraud risk computation module 208 may be configured to be activated upon receiving the response analysis signal from the exam response analysis module 206.
[0056] The fraud risk computation module 208 may be configured to compute a fraud risk score for each examinee by correlating data from the behavioral analysis module 202 and the exam response analysis module 206. The fraud risk computation module 208 may be configured to apply weighted scoring algorithms, decision trees, or ensemble learning models trained on labeled datasets to determine the likelihood of fraudulent behavior. This risk score serves as the primary metric for determining whether a student's behavior warrants further review or immediate action.
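A minimal weighted-scoring sketch of the correlation above, including the threshold comparison, might look as follows. The weights, signal names, and threshold are hypothetical; the specification equally contemplates decision trees and ensemble models in place of this linear combination.

```python
# Illustrative sketch: a weighted fraud risk score combining behavioral
# and response-analysis signals, with a threshold check. Weights, signal
# names, and the 0.5 threshold are hypothetical assumptions.

WEIGHTS = {
    "abnormal_gaze": 0.30,
    "unusual_head_orientation": 0.20,
    "unauthorized_object": 0.35,
    "response_similarity": 0.15,
}

def fraud_risk_score(signals):
    """Weighted sum of normalised (0..1) signal strengths."""
    return sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS)

def should_alert(signals, threshold=0.5):
    """True when the computed score exceeds the alert threshold."""
    return fraud_risk_score(signals) > threshold

signals = {"abnormal_gaze": 1.0, "unauthorized_object": 1.0}
print(round(fraud_risk_score(signals), 2), should_alert(signals))
# → 0.65 True
```

In the system described, the score computed here would be passed to the alert triggering module, which performs the threshold comparison and assembles the alert metadata.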
[0057] Upon generating the fraud risk score for each examinee, the fraud risk computation module 208 may be configured to generate an alert signal, and may transmit the generated alert signal to the alert triggering module 210.
[0058] In an embodiment of the present invention, the alert triggering module 210 may be configured to be activated upon receiving the generated alert signal from the fraud risk computation module 208. In an embodiment of the present invention, the alert triggering module 210 may further be configured to compare the computed fraud risk score to a threshold score. When the fraud risk score exceeds this threshold, the alert triggering module 210 may be configured to generate a real-time alert. The alert may include relevant metadata such as examinee identification, annotated heatmaps, behavioral flags, and response anomalies to assist human examiners in decision-making.
[0059] The generated alert may further be transmitted to the user device 114 via the communication unit 116. The alert received on the user device 114 may be in a pre-defined form, in an embodiment of the present invention. The pre-defined form of the alert received on the user device 114 may be, but not limited to a pop-up alert, a flash alert, a ringer alert, a silent notification, a push alert, a hidden alert, an electronic mail alert, a Short Message Service (SMS) alert, an always on-screen alert, a vibrational alert, a haptic alert, an audio-visual alert, a voice-based alert, and so forth. Embodiments of the present invention are intended to include or otherwise cover any pre-defined form of the alert that may be received on the user device 114, including known, related art, and/or later developed technologies.
[0060] In an embodiment of the present invention, the user interface control module 212 may be configured to manage the outputs of the system 100 and/or presentation of the outputs of the system 100 via the visual interface 104. The user interface control module 212 may be configured to overlay the attention heatmaps on the examinee’s video feed, display fraud risk scores, list flagged behavioral events, and provide playback functionality for review. The user interface control module 212 may also be configured to offer options for filtering, exporting, and securely logging data for administrative review and audit compliance.
[0061] FIG. 3 illustrates a flowchart for a method for detecting fraudulent behavior during the examination, according to an embodiment of the present invention.
[0062] At step 302, the system 100 may be configured to capture the real-time video data of examinees using the video capture device 102. The captured video data may include facial expressions, gaze direction, head and body movements, interactions with the surrounding environment during the examination session, and so forth.
[0063] At step 304, the system 100 may be configured to analyze the captured video data using the computer vision model 108 to detect behavioral indicators of potential fraud. The computer vision model 108 may comprise the convolutional neural network trained to recognize and classify such suspicious behaviors.
[0064] At step 306, the system 100 may be configured to generate the attention heatmaps using the machine learning-based attention mechanism. The attention mechanism may process the temporal video sequence of video frames to produce heatmaps indicating regions of visual focus and/or physical activity associated with each examinee. These heatmaps may further be annotated with time-stamped behavioral events.
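As a non-limiting illustration of the self-attention mechanism contemplated for heatmap generation, the toy sketch below computes attention weights over a grid of frame patches. The grid size, feature dimension, random features, and projections are all illustrative assumptions, not parameters of the claimed system.

```python
import numpy as np

# Toy self-attention over patch features of a single frame. The 4x4 patch
# grid, 8-dim features, and random projections are illustrative only.
rng = np.random.default_rng(0)
H, W, d = 4, 4, 8                       # 4x4 grid of patches, 8-dim features
patches = rng.normal(size=(H * W, d))

q = patches @ rng.normal(size=(d, d))   # query projection
k = patches @ rng.normal(size=(d, d))   # key projection

# Scaled dot-product attention with a numerically stable softmax.
scores = q @ k.T / np.sqrt(d)
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)

# Average attention received by each patch -> coarse per-frame heatmap.
heatmap = attn.mean(axis=0).reshape(H, W)
print(heatmap.shape)  # -> (4, 4)
```

A production mechanism would attend across the temporal sequence of frames as well, and the resulting per-frame heatmaps would be annotated with the time-stamped behavioral events described above.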
[0065] At step 308, the system 100 may be configured to collect the exam response data from the examinees. The response data may be obtained either digitally via the computing devices 112 or through physical answer sheets that are scanned and processed using Optical Character Recognition and Natural Language Processing techniques.
[0066] At step 310, the system 100 may be configured to conduct the statistical analysis on the collected exam response data. This analysis may include detecting anomalies such as irregular answer timing, statistical outliers, unexpected response similarities between examinees, and stylometric inconsistencies in written responses.
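Two of the anomaly checks named above may be sketched, by way of non-limiting illustration, as follows. The z-score cutoff, the sample timings, and the answer strings are hypothetical examples.

```python
from statistics import mean, stdev

def timing_outliers(times, z=2.0):
    """Flag question indices whose timing deviates strongly from the mean."""
    mu, sd = mean(times), stdev(times)
    return [i for i, t in enumerate(times) if abs(t - mu) > z * sd]

def answer_similarity(a, b):
    """Fraction of identical answers between two examinees' responses."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

times = [42, 45, 40, 44, 5, 43]             # seconds per question; 5 s stands out
print(timing_outliers(times))               # -> [4]
print(answer_similarity("ABCDA", "ABCDB"))  # -> 0.8
```

Stylometric analysis of written answers would complement these checks with features such as vocabulary and sentence-length profiles, compared against an examinee's own baseline.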
[0067] At step 312, the system 100 may be configured to compute a fraud risk score for each examinee by correlating the detected behavioral indicators with the results of the exam response analysis. The computation may employ weighted algorithms, decision trees, or ensemble learning models trained on labeled datasets.
[0068] At step 314, the system 100 may be configured to compare the computed fraud risk score for each examinee to the threshold score. If the fraud risk score exceeds the threshold score, the system 100 may proceed to step 316. Otherwise, the method 300 may return to step 304.
[0069] At step 316, the system 100 may be configured to trigger the real-time alert. The real-time alert may be transmitted to a user device 114 via the communication unit 116.
[0070] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0071] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

Claims:
CLAIMS
I/We Claim:
1. A system (100) for detecting fraudulent behaviour during an examination, the system (100) comprising:
a video capturing device (102) adapted to capture real-time video data of examinees during an exam session; and
a processing unit (106) connected to the video capturing device, characterized in that the processing unit (106) is configured to:
analyze the captured video data using a computer vision model (108) to detect behavioural indicators of potential fraud, wherein the behavioural indicators are selected from an abnormal gaze direction, an unusual head orientation, a frequent or erratic body movement, an interaction with unauthorized objects, and a combination thereof;
generate, using a machine learning-based attention mechanism, one or more attention heatmaps indicating areas of visual focus or activity associated with each of the examinees during the exam period;
collect exam response data from the examinees;
conduct statistical analysis on the collected exam response data, wherein the statistical analysis comprises detecting anomalies in answer patterns, statistical outliers, time-based irregularities, unexpected similarities between responses, or a combination thereof;
compute a fraud risk score for each of the examinees by correlating the detected behavioural indicators with the analyzed exam response data;
compare the computed fraud risk score for each of the examinees to a threshold score; and
trigger a real-time alert when the computed fraud risk score of any of the examinees exceeds the threshold score.
2. The system (100) as claimed in claim 1, wherein the computer vision model (108) is a convolutional neural network (CNN) trained to detect a head orientation, an eye gaze direction, a hand movement of the examinees, or a combination thereof.
3. The system (100) as claimed in claim 1, wherein the processing unit (106) is configured to transmit the real-time alert to a user device (114).
4. The system (100) as claimed in claim 1, wherein the processing unit (106) is further configured to store the attention heatmaps, the behavioural indicators, the exam response data, and the computed fraud risk score in a database (110).
5. The system (100) as claimed in claim 1, wherein the machine learning-based attention mechanism comprises a self-attention neural network configured to process temporal video sequences to generate the attention heatmaps.
6. The system (100) as claimed in claim 1, wherein the generated attention heatmaps are overlaid on a visual interface (104) and annotated with time-stamped behavioural events for review by an examiner.
7. The system (100) as claimed in claim 1, wherein the statistical analysis of the exam response data includes stylometric analysis for written answers, answer timing sequence profiling for multiple-choice or interactive questions, or a combination thereof.
8. The system (100) as claimed in claim 1, wherein the fraud risk score is computed using a weighted scoring algorithm, a decision tree, an ensemble learning model trained on labelled exam behaviour data, or a combination thereof.
9. A method (300) for detecting fraudulent behaviour during an examination, the method comprising:
capturing real-time video data of examinees using a video capturing device (102);
analysing the captured video data using a computer vision model (108) to detect behavioural indicators of potential fraud, wherein the behavioural indicators are selected from an abnormal gaze direction, an unusual head orientation, a frequent or erratic body movement, an interaction with unauthorized objects, and a combination thereof;
generating, using a machine learning-based attention mechanism, one or more attention heatmaps indicating areas of visual focus or activity associated with each of the examinees during the exam period;
collecting exam response data from the examinees;
conducting statistical analysis on the collected exam response data, wherein the statistical analysis comprises detecting anomalies in answer patterns, statistical outliers, time-based irregularities, unexpected similarities between responses, or a combination thereof;
computing a fraud risk score for each of the examinees by correlating the detected behavioural indicators with the analyzed exam response data;
comparing the computed fraud risk score for each of the examinees to a threshold score; and
triggering a real-time alert when the computed fraud risk score of any of the examinees exceeds the threshold score.
10. The method as claimed in claim 9, comprising a step of transmitting the real-time alert to a user device (114).
Date: May 15, 2025
Place: Noida

Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant

Documents

Application Documents

# Name Date
1 202541047942-STATEMENT OF UNDERTAKING (FORM 3) [19-05-2025(online)].pdf 2025-05-19
2 202541047942-REQUEST FOR EARLY PUBLICATION(FORM-9) [19-05-2025(online)].pdf 2025-05-19
3 202541047942-POWER OF AUTHORITY [19-05-2025(online)].pdf 2025-05-19
4 202541047942-OTHERS [19-05-2025(online)].pdf 2025-05-19
5 202541047942-FORM-9 [19-05-2025(online)].pdf 2025-05-19
6 202541047942-FORM FOR SMALL ENTITY(FORM-28) [19-05-2025(online)].pdf 2025-05-19
7 202541047942-FORM 1 [19-05-2025(online)].pdf 2025-05-19
8 202541047942-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [19-05-2025(online)].pdf 2025-05-19
9 202541047942-EDUCATIONAL INSTITUTION(S) [19-05-2025(online)].pdf 2025-05-19
10 202541047942-DRAWINGS [19-05-2025(online)].pdf 2025-05-19
11 202541047942-DECLARATION OF INVENTORSHIP (FORM 5) [19-05-2025(online)].pdf 2025-05-19
12 202541047942-COMPLETE SPECIFICATION [19-05-2025(online)].pdf 2025-05-19