Abstract: An assistive tutoring device, comprising a housing 101 with multiple motorized omnidirectional wheels 102 for mobility, a touch enabled display panel 103 to present learning material, a speaker 104 for audio output, an artificial intelligence-based imaging unit 105 that analyzes user facial expressions to detect learning difficulty, a plate 106 extended by sliding units 107 that allows submission of written work to an OCR (optical character recognition) sensor 109 for text extraction, an analysis module that identifies errors in the extracted text, a dual axis lead screw arrangement 110 with an articulated L-shaped telescopic arm 111 that annotates these errors, a plurality of clippers 112 to secure material on the plate 106, a holographic projection unit 113 that offers writing guidance, a microphone 114 that enables vocal responses, a detection module that analyzes the database for probable learning disabilities, and a user interface on a computing unit that controls sessions, material, and data access.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to an assistive tutoring device that is capable of providing personalized educational support by delivering multimodal learning and feedback, adapting to a user's learning pace and potential difficulties, and offering support for both instruction and practice in various learning domains, including writing and comprehension.
BACKGROUND OF THE INVENTION
[0002] Tutoring offers personalized support, adapting to individual learning styles and paces, which is often lacking in traditional educational settings. Tutoring provides focused attention, clarifies doubts immediately, and reinforces understanding, leading to improved academic performance and increased confidence. However, tutoring faces challenges like high costs limiting accessibility, potential student dependence hindering independent learning, and the time commitment required for both tutor and student. Finding the right tutor-student match regarding personality and teaching style is also difficult, and sometimes the expected results do not materialize quickly enough, leading to discouragement.
[0003] Traditionally used devices for tutoring include physical textbooks, workbooks, whiteboards, and in-person human tutors. Problems associated with these in the context of automation include the lack of personalized, real-time feedback that adapts dynamically to the student's progress. Human tutors, while offering nuanced understanding, are not scalable or consistently available. Physical materials require manual grading and lack automated progress tracking. Automating these aspects is challenging as traditional methods are inherently manual and do not generate easily quantifiable data for automated analysis and adaptation.
[0004] US5597312A discloses a computer based intelligent method and system for tutoring a student in an interactive application. The method and system include a computer system for selecting a mode for an adjustable teaching parameter, generating a student model, and monitoring a student interactive task based upon the teaching parameter and the student model. The method and system also include a computer system for generating an updated student model based upon a student response to the student interactive task generated, and monitoring a student interactive task based upon the teaching parameter and the updated student model.
[0005] US5283865A discloses a computerized system that provides a salesperson with assistance related to training and sales of parts corresponding to particular products. More particularly, a computerized system incorporating a data storage device, a display apparatus, a part selection device and a user interface mechanism enhances the efforts of a parts salesman. The data storage device electronically stores graphic and textual parts-related information including specifications, features and customer benefits. The display apparatus electronically displays portions of the graphic and textual information in order to provide training and sales assistance related to part features and customer benefits. The part selection device electronically selects a particular part by navigating through part choice menus based on stored part specifications. The user interface controls the operation of the display apparatus and the part selection device so that each of the respective system parts is operatively coupled and related to the others.
[0006] Conventionally, many devices have been available in the market for offering tutoring services. However, these existing devices lack the capability of providing multimodal learning experiences and dynamically adapting to user learning difficulties through real-time facial expression analysis. In addition, these existing devices also fail to offer automated feedback on written work with physical annotation, and to provide mobility for flexible learning environments, all within a single, user-friendly device.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device capable of providing a comprehensive and adaptive tutoring solution that integrates multimodal instruction, real-time user feedback, assessment with physical annotation capabilities, and enhanced portability for a more engaging and effective learning experience.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a device that is capable of delivering a learning platform dedicated to each user's needs that adapts content and pace based on their individual progress and identified challenges.
[0010] Another object of the present invention is to develop a device that offers users immediate feedback on their work and provides helpful guidance without constant external assistance.
[0011] Another object of the present invention is to develop a device that is capable of aiding individuals facing learning challenges by proactively recognizing moments of difficulty and offering customized support to improve understanding and retention.
[0012] Yet another object of the present invention is to develop a device that is capable of monitoring a user's learning journey, recording their progress and identifying recurring patterns to continuously improve educational material and how it is presented.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to an assistive tutoring device that is capable of delivering personalized, multimodal instruction and feedback, adapting to individual learning needs, identifying potential difficulties, supporting writing practice with autonomous error detection and guidance, and offering mobility for flexible learning environments.
[0015] According to an embodiment of the present invention, an assistive tutoring device comprises a housing having a plurality of motorised omnidirectional wheels installed underneath the housing for locomotion of the housing, a touch enabled display panel mounted over the housing for displaying instructions and educational material, a speaker installed on the housing to produce audio associated with the instructions and educational content displayed by the display panel, an artificial intelligence-based imaging unit installed on the housing and integrated with a processor for recording and processing images in a vicinity of the housing, which captures facial expressions of the user to determine difficulty in learning and triggers a microcontroller to record, onto a connected memory, the portions of the study material for which difficulty was indicated and to accordingly actuate the display panel and the speaker to repeat the portion, a plate attached within the housing by means of a pair of sliding units and extended outwards from the housing via an opening in the front surface of the housing to enable a user to place written material onto the plate for submission, an OCR (optical character recognition) sensor installed within the housing for extracting text from the submitted written material, and an analysis module configured with the microcontroller that receives the extracted text from the OCR sensor to determine errors.
[0016] According to another embodiment of the present invention, the device further comprises a dual axis lead screw arrangement installed within the housing, with an articulated L-shaped telescopic arm having a writing instrument at an end to annotate the written material based on detected errors for the reference of the user, a plurality of clippers arranged over the plate to secure the written material over the plate, a database linked with the microcontroller to store the errors and to determine the change in frequency of each type of error committed over time, a holographic projection unit installed over the housing to project images onto the paper being practiced on by the user for training of writing, to provide guidance to the user, a microphone mounted over the housing to enable the user to provide vocal responses against questions displayed by the display panel, a detection module configured with the microcontroller to analyse the database and determine a probable learning disability suffered by the user, and a user interface adapted to be installed onto a computing unit to enable communication with a communication unit provided in the housing, to initiate and halt tutoring sessions, select and modify study material, and access the database and the results generated by the detection module.
[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of an assistive tutoring device.
DETAILED DESCRIPTION OF THE INVENTION
[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0022] The present invention relates to an assistive tutoring device that provides individualized educational assistance through a combination of interactive display, audio guidance, and analysis of user engagement and performance, further incorporating features for writing practice and mobility to enhance the learning experience.
[0023] Referring to Figure 1, an isometric view of an assistive tutoring device is illustrated, comprising a housing 101 having a plurality of motorised omnidirectional wheels 102 installed underneath the housing 101, a touch enabled display panel 103 mounted over the housing 101, a speaker 104 installed on the housing 101, an artificial intelligence-based imaging unit 105 installed on the housing 101, a plate 106 attached within the housing 101 by means of a pair of sliding units 107.
[0024] Figure 1 further illustrates an opening 108 in the front surface of the housing 101, an OCR (optical character recognition) sensor 109 installed within the housing 101, a dual axis lead screw arrangement 110 installed within the housing 101 with an articulated L-shaped telescopic arm 111, a plurality of clippers 112 arranged over the plate 106, a holographic projection unit 113 installed over the housing 101, a microphone 114 mounted over the housing 101, and a writing instrument 115 at an end of the arm 111.
[0025] The device disclosed herein includes a housing 101 that is developed to be positioned on a flat surface in proximity to a user. The housing 101 is cuboidal in shape to ensure stability. The housing 101 herein incorporates all the components of the device required for assisting users who are affected by learning disorders such as dyslexia and autism.
[0026] The housing 101 is installed with a push button, accessed by the user to activate the device for performing the required operations. When the user presses the push button, the electrical circuit is completed, which in response turns the device on. The push button is integrated with an actuator and a spring, which are automatically activated when pressed. They work together to move the internal contact, completing the circuit and allowing electrical current to flow, thereby activating an inbuilt microcontroller.
[0027] The microcontroller associated with the device is pre-fed to detect the signal and actuate/activate the required component of the device. The microcontroller used herein is pre-fed using artificial intelligence and machine learning protocols to coordinate the working of the device. Further, the microcontroller activates a communication module linked with the microcontroller for establishing a wireless connection between the microcontroller and a computing unit (including, but not limited to, a smartphone, tablet, or laptop) inbuilt with a user interface that is accessed by the user to provide input commands regarding learning-specific instructions and educational material.
[0028] The communication module used herein includes, but is not limited to, a Wi-Fi (Wireless Fidelity) module, a Bluetooth module, or a GSM (Global System for Mobile Communication) module. The communication module used herein is preferably a Wi-Fi module, a hardware component that enables the microcontroller to connect wirelessly with the computing unit. The Wi-Fi module works by utilizing radio waves to transmit and receive data over short distances. The core functionality relies on the IEEE 802.11 standards, which define the protocols for wireless local area networking (WLAN). Once connected, the module allows the microcontroller to send and receive data through data packets.
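As a non-limiting illustration of the packet-based data exchange described above, the following Python sketch models a simple length-prefixed message framing scheme. The 2-byte header and the function names are assumptions made purely for illustration; the disclosure does not specify a wire format.

```python
import struct


def frame_packet(payload: bytes) -> bytes:
    """Prefix a payload with a 2-byte big-endian length header
    (an illustrative framing convention, not the disclosed protocol)."""
    if len(payload) > 0xFFFF:
        raise ValueError("payload too large for 2-byte length header")
    return struct.pack(">H", len(payload)) + payload


def unframe_packet(data: bytes) -> bytes:
    """Recover the payload from a framed packet, validating the declared length."""
    (length,) = struct.unpack(">H", data[:2])
    payload = data[2:2 + length]
    if len(payload) != length:
        raise ValueError("truncated packet")
    return payload
```

Any scheme that lets the microcontroller delimit commands (e.g. "start session", "select material") inside a byte stream would serve the same role.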
[0029] After receiving the input commands from the computing unit, the microcontroller processes the input commands and activates a touch enabled display panel 103 installed over the housing 101 to display the user-specified instructions and educational material. The touch enabled display panel 103 mentioned herein is typically an LCD (Liquid Crystal Display) screen that presents output in visible form regarding the instructions and educational material and enables direct user interaction. The screen is equipped with touch-sensitive technology, allowing the user to interact directly with the display using their fingers. When the user touches the screen, a touch controller IC processes the resulting analog signals. The touch controller is typically connected to the microcontroller through various interfaces, which may include but are not limited to SPI (Serial Peripheral Interface) or I2C (Inter-Integrated Circuit). Upon receiving the input commands from the user, the display panel 103 sends the inputs to the microcontroller in the form of electrical signals.
[0030] While the display panel 103 presents instructions and educational content, the microcontroller simultaneously activates a speaker 104 housed within the housing 101 to produce corresponding audio signals. This speaker 104 functions by converting electrical signals into audible sound. The speaker 104 comprises a diaphragm (a cone-shaped component) connected to a voice coil, which is positioned within a magnetic field created by two magnets. When the microcontroller sends an electrical signal to the voice coil, the coil generates a fluctuating magnetic field. This varying magnetic field interacts with the permanent magnets, causing the diaphragm to vibrate back and forth. This mechanical movement of the diaphragm displaces the surrounding air, generating sound waves that mirror the characteristics of the electrical signal used to produce the audio relevant to the displayed instructions and educational material.
[0031] While guiding the user through the instructions and educational material via the display panel 103 and speaker 104, the microcontroller activates an artificial intelligence-based imaging unit 105 mounted on the housing 101 to capture multiple images in proximity of the housing 101, to analyze the user's facial expressions and identify potential difficulties the user encounters in understanding the material. The imaging unit 105 comprises an image capturing module including a set of lenses that captures multiple images of the user's face, and the captured images are stored within the memory of the imaging unit 105 in the form of optical data.
[0032] The imaging unit 105 also comprises a processor embedded with artificial intelligence protocols, such that the processor processes the optical data and extracts the required data from the captured images. The extracted data is further converted into digital pulses and bits and transmitted to the microcontroller. The microcontroller processes the received data and determines potential difficulties the user encounters in understanding the material, such as confusion, frustration, or lack of engagement, through facial expressions, for determining the effectiveness of the tutoring session. Once the portion with which the user encounters difficulty is determined through their facial expressions, the microcontroller records that portion onto a connected memory linked with the microcontroller for future reference and re-directs the display panel 103 and the speaker 104 to repeat the portion of the study material the user found difficult.
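A minimal, non-limiting sketch of the difficulty-detection and recording logic described above follows. The per-frame expression labels are assumed to come from an upstream facial-expression classifier (hypothetical here), and the 40% threshold is an illustrative assumption, not a value from the disclosure.

```python
def detect_difficulty(frame_labels, threshold=0.4):
    """Flag learning difficulty when the fraction of frames classified as
    'confused' or 'frustrated' exceeds the threshold."""
    difficult = {"confused", "frustrated"}
    if not frame_labels:
        return False
    hits = sum(1 for label in frame_labels if label in difficult)
    return hits / len(frame_labels) > threshold


def record_difficult_portion(memory, portion_id, frame_labels):
    """Store the identifier of the current study-material portion for later
    repetition when difficulty is detected; return whether it was recorded."""
    if detect_difficulty(frame_labels):
        memory.append(portion_id)
        return True
    return False
```

On a recorded portion, the microcontroller would then re-drive the display panel 103 and speaker 104 to replay that content.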
[0033] Once the user finishes the education session, the microcontroller actuates a pair of sliding units 107 installed within the housing 101 to extend and position a plate 106 configured with the sliding units 107, allowing the user to place the written material onto the plate 106 for submission. The sliding units 107 consist of a pair of sliding rails fabricated with grooves in which the wheels of a slider are positioned; the slider is further connected with a bi-directional motor via a shaft. The microcontroller actuates the bi-directional motor to rotate in a clockwise or anti-clockwise direction, which rotates the shaft; the motor converts electrical energy into rotational energy, allowing the wheels to translate over the sliding rails with a firm grip on the grooves. The movement of the sliding units results in the translation of the plate 106 outward from the housing 101 via an opening 108 carved in the front surface of the housing 101.
[0034] Once the user places the written material on the plate 106, as detected via the imaging unit 105, the microcontroller actuates multiple clippers 112 (preferably in the range of 4-6) installed over the plate 106 to secure the placed written material over the plate 106. The clippers 112 are operated by a pair of handles which are alternately squeezed together and released for gripping or releasing the written material and are driven by a motor which makes the blades of the clip oscillate from side to side. Upon actuation of the motorized clippers 112 by the microcontroller, the motor oscillates the blades to grip the written material, securing it over the plate 106. Once the written material is firmly held on the plate 106 via the clippers 112, the microcontroller initiates the sliding units 107 to retract the plate 106 into the housing 101 for additional operations.
[0035] Once the written material is successfully submitted, the microcontroller activates an OCR (optical character recognition) sensor 109 arranged within the housing 101 to extract text from the submitted written material. The OCR sensor 109 works by capturing an image of the printed or handwritten text of the submitted material using a camera. The sensor 109 then processes the image using embedded protocols that detect patterns, shapes, and characters. The sensor 109 first isolates the text from the background, then identifies individual characters based on their shapes and compares them to a database of known characters that is linked with the microcontroller. The recognized characters are then processed by the microcontroller via an analysis module configured with the microcontroller to determine errors.
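The shape-comparison step described above ("compares them to a database of known characters") can be sketched, purely for illustration, as a nearest-template match over tiny bitmaps. The 3x3 glyph templates below are toy assumptions; a real OCR sensor would use far larger templates or a trained model.

```python
def hamming(a: str, b: str) -> int:
    """Count mismatched cells between two equal-size flattened bitmaps."""
    return sum(x != y for x, y in zip(a, b))


# Toy "database of known characters": 3x3 bitmaps flattened row-by-row
# to 9-character strings ('#' = ink, '.' = blank). Illustrative only.
KNOWN = {
    "I": "###.#.###",
    "L": "#..#..###",
    "O": "####.####",
}


def recognize(bitmap: str) -> str:
    """Return the known character whose stored shape is closest to the
    isolated glyph, mimicking the described comparison step."""
    return min(KNOWN, key=lambda ch: hamming(KNOWN[ch], bitmap))
```

Even a glyph with one damaged cell still resolves to the nearest stored shape, which is the practical point of template matching on noisy handwriting.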
[0036] Upon receiving the extracted text from the OCR sensor 109, the analysis module is activated by the microcontroller to initiate error detection through several stages. First, the module conducts a lexical check, comparing each word against a dictionary to identify misspellings. Next, syntactic analysis examines the grammatical structure, flagging issues like incorrect word order or verb agreement. Semantic analysis then delves into the meaning, detecting contextually inappropriate word usage or logical inconsistencies by leveraging knowledge bases and language protocols. This multi-layered approach allows the module to identify a wide range of errors, from simple typos to more complex semantic mistakes, ensuring a more accurate interpretation of the extracted text. By combining these techniques, the analysis module provides comprehensive error detection. Once the errors in the submitted material are detected, the microcontroller stores the detected errors in a database linked with the microcontroller, to determine the change in frequency of each type of error committed over time.
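The first two stages described above can be sketched as follows, as a non-limiting illustration. The tiny dictionary and the single subject-verb agreement rule are toy assumptions standing in for a full lexicon and grammar; the disclosure does not specify either.

```python
import re

# Toy lexicon for the lexical (spelling) stage -- illustrative only.
DICTIONARY = {"the", "cat", "cats", "sat", "sit", "sits", "on", "mat"}


def lexical_errors(text):
    """Stage 1: flag words absent from the dictionary as misspellings."""
    words = re.findall(r"[a-z]+", text.lower())
    return [w for w in words if w not in DICTIONARY]


def syntactic_errors(text):
    """Stage 2: one toy agreement rule ('cat sit' flagged as a singular
    subject with a plural verb form) standing in for full grammar analysis."""
    errors = []
    words = re.findall(r"[a-z]+", text.lower())
    for prev, cur in zip(words, words[1:]):
        if prev == "cat" and cur == "sit":
            errors.append(f"{prev} {cur}")
    return errors
```

A semantic stage would sit on top of these, consulting a knowledge base for contextually inappropriate usage; each flagged error, tagged by stage, is what gets written to the error-frequency database.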
[0037] When errors are identified in the submitted written material, the microcontroller activates an articulated L-shaped telescopic arm 111 installed within the housing 101 and equipped with a writing instrument 115 to annotate the material, highlighting the detected errors for the user's reference. The arm's 111 extension and retraction are controlled by a pneumatic unit comprising an air compressor, cylinders, valves, and pistons. The microcontroller manages this unit by actuating valves to direct compressed air from the compressor into the cylinders. The resulting air pressure moves the pistons, which are mechanically linked to the arm 111, causing the arm 111 to extend. To retract the arm 111, the microcontroller closes the valves, allowing the pistons to retract. This precise control over the pneumatic unit enables the microcontroller to accurately position the writing instrument 115 for annotating the identified errors on the written material.
[0038] While marking the errors on the written material, the microcontroller simultaneously actuates a dual axis lead screw arrangement 110 arranged within the housing 101, on which the arm 111 is installed, to enable precise movement of the arm 111. The dual-axis lead screw arrangement 110 comprises two lead screws, each controlling movement along one axis (typically horizontal and vertical). As the lead screws rotate, they drive nuts along their threads, causing the arm 111 to move smoothly in both directions. The dual-axis control allows fine-tuned positioning of the arm 111, enabling the arm 111 to annotate the written material precisely. Once the written material is annotated successfully, the microcontroller re-actuates the sliding units 107 to move the plate 106 out from the housing 101 to allow the user to access the annotated written material for taking notes.
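The lead-screw positioning described above reduces to converting a target annotation point into per-axis motor steps. The sketch below is illustrative only: the 8 mm screw lead and 200 steps per revolution are assumed values, not figures from the disclosure.

```python
def mm_to_steps(distance_mm, lead_mm=8.0, steps_per_rev=200):
    """Convert linear travel on one axis into motor steps: one full
    revolution advances the nut by one screw lead (assumed 8 mm)."""
    return round(distance_mm / lead_mm * steps_per_rev)


def move_to(target_xy, current_xy, lead_mm=8.0, steps_per_rev=200):
    """Return (x, y) step commands moving the arm from its current position
    to the target annotation point; negative values mean reverse rotation."""
    return tuple(
        mm_to_steps(t - c, lead_mm, steps_per_rev)
        for t, c in zip(target_xy, current_xy)
    )
```

With these assumed parameters the resolution is 8 mm / 200 = 0.04 mm per step, which is why lead screws are a common choice for fine pen positioning.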
[0039] Upon identification of errors and annotation of the written material, the microcontroller activates a detection module linked with the microcontroller to analyze the database to determine the probability of the user having a learning disability, including but not limited to dyslexia or autism, by examining a range of data points. The examined data includes learning history, interaction patterns with educational materials, assessment results, behavioral observations, and demographic information. The module extracts key features relevant to specific learning disabilities, such as phonological processing difficulties for dyslexia or social interaction patterns for autism. The module then employs pattern recognition techniques, including statistical analysis and machine learning protocols trained on labeled data, to identify deviations from typical learning profiles and patterns associated with these conditions. Based on this analysis, the module generates a probability score or risk assessment, indicating the likelihood of the learning disability. This output serves as a flag for potential concern, strongly recommending a comprehensive evaluation by qualified professionals for an accurate diagnosis. The module's effectiveness relies on robust data, ethical handling of sensitive information, and recognition that its output is a probabilistic indicator, not a definitive diagnosis. In case the analysis determines a probable learning disability, the microcontroller notifies the user by sending an alert, in the form of an electrical signal, to the computing unit.
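As a non-limiting sketch of the probability score described above, the following logistic model combines a few error-frequency features into a value in [0, 1]. The feature names, weights, and bias are hand-picked assumptions for illustration; a deployed module would learn them from labeled data, and the output remains a screening flag, not a diagnosis.

```python
import math

# Illustrative, hand-picked weights over assumed error-frequency features
# (e.g. rate of letter reversals per page) -- not learned, not from the disclosure.
WEIGHTS = {"reversal_rate": 2.5, "spelling_rate": 1.8, "response_delay": 1.2}
BIAS = -3.0


def disability_probability(features):
    """Logistic (sigmoid) score combining the weighted features; intended
    only as a flag recommending professional evaluation."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))
```

The negative bias means that with few or no errors the score stays well below 0.5, so only a sustained pattern of difficulty-associated errors raises the flag.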
[0040] If the user selects writing training via the display panel 103, the user places a blank paper on the plate 106 so that a template can be drawn for the user's writing practice. As the blank paper is detected via the imaging unit 105, the microcontroller actuates the sliding units 107 and clippers 112 to secure the paper and retract the plate 106, enabling the arm 111, in conjunction with the lead screw arrangement 110, to draw the template on the paper for the user's writing training. Once the template is prepared, the microcontroller directs the sliding units 107 again to extend the plate 106 and provide the template to the user for practice.
[0041] While the user practices on the template, as detected via the imaging unit 105, the microcontroller activates a holographic projection unit 113 mounted over the housing 101 to project images onto the paper on which the user is practicing, to provide writing guidance to the user. The holographic projection unit 113 uses a combination of laser light and digital modeling to project a 3D image onto the paper. The unit 113 generates a hologram of the design using a laser light source and a series of optical elements like beam splitters and mirrors. The hologram is projected onto the paper, creating a 3D image. Based on the surface contours of the paper, the unit 113 adjusts the hologram for alignment. The projected image serves as a guide, directing the user to follow the holographic design while practicing writing.
[0042] A microphone 114 is installed on the housing 101 and activated by the microcontroller during a tutoring session to enable the user to provide vocal responses against questions displayed on the display panel 103. The microphone 114 contains a small diaphragm connected to a moving coil. When the sound waves of the user's voice hit the diaphragm, the coil vibrates. This causes the coil to move back and forth in the magnet's field, generating an electrical current. The resulting signals are sent to the microcontroller for processing the user's vocal responses to the questions. Once the user has completed the tutoring session, the user halts the session via the computing unit.
[0043] In addition, if the user needs to relocate the housing 101, the user provides commands regarding relocation of the housing 101 via the computing unit. Upon processing the user's input command from the computing unit, the microcontroller actuates multiple motorised omnidirectional wheels 102 (preferably in the range of 4-6) arranged underneath the housing 101 to move the housing 101 towards the user-desired location. The omnidirectional wheels 102 each consist of a wheel connected to a motor via a shaft and are engineered to allow movement in any direction without altering the housing's 101 orientation, providing exceptional maneuverability. When the microcontroller actuates the wheels 102, the motors rotate either clockwise or counter-clockwise, transferring motion through the shafts to the wheels. This enables the housing 101 to move smoothly in any direction, making these wheels 102 highly effective for relocating and precisely positioning the housing 101.
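Omnidirectional motion of the kind described above is conventionally achieved by mixing a desired chassis velocity into individual wheel speeds. The sketch below uses the standard four-wheel mecanum-style mixing as a non-limiting illustration; the disclosure does not specify the wheel type or geometry, so this mapping is an assumption.

```python
def wheel_speeds(vx, vy, omega):
    """Standard mecanum-style mixing: map desired chassis motion
    (forward vx, strafe vy, rotation omega) to four wheel speeds
    ordered (front-left, front-right, rear-left, rear-right)."""
    return (
        vx + vy + omega,  # front-left
        vx - vy - omega,  # front-right
        vx - vy + omega,  # rear-left
        vx + vy - omega,  # rear-right
    )
```

Pure forward motion drives all four wheels equally, while a pure sideways (strafe) command drives diagonal wheel pairs in opposition, which is what lets the housing translate without changing orientation.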
[0044] In an embodiment of the present invention, multiple telescopically operated rods (preferably in the range of 4-6) are configured between the wheels 102 and the housing 101, and are controlled by the microcontroller on the command of the user via the computing unit to extend and retract as needed. The rods herein are powered by the pneumatic unit associated with the device and extend/retract in the same manner as the articulated L-shaped telescopic arm 111 described earlier.
[0045] Lastly, a battery (not shown in figure) is associated with the device to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode. The battery uses oxidation/reduction chemical reactions to do work on charge and produce a voltage between the anode and the cathode, thus producing the electrical energy used to do work in the device.
[0046] In an exemplary embodiment of the present invention, the device works in a manner where, at the beginning of a study session, the microcontroller actuates the speaker 104 to issue a command such as "please take out your notebook" and activates the imaging unit 105 for monitoring the time duration the user takes in comprehending and responding to the instruction displayed on the display panel 103. In case the response time is significantly delayed or the user appears to be struggling to understand the instructions and educational material, the microcontroller re-activates the display panel 103 and speaker 104 to repeat the portion the user is struggling with, in a slower and more accessible manner to enhance comprehension. On a daily basis, the microcontroller provides a series of small, interactive verbal questions via the speaker 104 and records the user's responses to the questions via the microphone 114 for analyzing the user's response time and accuracy. For enhancing the user's response time and cognitive development, the microcontroller engages the user in a fun-based activity such as a simple command-response game, including but not limited to sit-and-stand. This activity not only reinforces command comprehension but also makes the learning experience enjoyable and motivating for users with learning difficulties. For enhancing clarity and ensuring better understanding, the microcontroller provides visuals, such as animated or symbolic representations, on the display panel 103, in synchronization with the verbal instructions via the speaker 104. Thus, the user with learning difficulties comprehends the visual and verbal instructions more easily, thereby ensuring a multimodal approach and improving engagement and learning outcomes for users with learning difficulties such as dyslexia or autism.
[0047] The present invention works best in the following manner, where the housing 101, as disclosed in the invention, incorporates the motorized omnidirectional wheels 102 for locomotion. The touch-enabled display panel 103 presents instructions and educational material, accompanied by the speaker 104, which produces the associated audio. The artificial intelligence-based imaging unit 105 captures user facial expressions to determine learning difficulty. Upon detecting difficulty, the imaging unit 105 triggers the microcontroller to record the relevant study material portions onto the connected memory and subsequently prompts the display panel 103 and the speaker 104 to repeat that content. The plate 106, attached within the housing 101 via the pair of sliding units 107, extends outward via the opening 108 to allow placement of written material for submission. The OCR (optical character recognition) sensor 109, housed internally, extracts text from the submitted material. The analysis module, configured with the microcontroller, receives this extracted text to identify errors.
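The analysis-module step above (receiving OCR-extracted text and identifying errors) can be sketched minimally. The dictionary-based spelling check, the word list, and the `identify_errors` name are assumptions chosen for illustration; the specification does not prescribe a particular error-detection technique.

```python
# Minimal sketch of the analysis module of [0047]: take OCR-extracted text
# and flag words absent from a known word list. The word list and the
# (index, word) error format are illustrative assumptions.

KNOWN_WORDS = {"the", "cat", "sat", "on", "mat"}

def identify_errors(extracted_text):
    """Return (index, word) pairs for tokens not found in the word list."""
    errors = []
    for i, word in enumerate(extracted_text.lower().split()):
        token = word.strip(".,!?")  # ignore trailing punctuation
        if token and token not in KNOWN_WORDS:
            errors.append((i, token))
    return errors
```

For instance, `identify_errors("The cat zat on the mat")` flags the misspelled third word as `[(2, "zat")]`.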
[0048] In continuation, the dual-axis lead screw arrangement 110 features the articulated L-shaped telescopic arm 111 with the writing instrument 115 for annotating the written material based on detected errors. The device also includes the clippers 112 on the plate 106 to secure the written material. The database is linked to the microcontroller for tracking error frequency, based on which the microcontroller re-actuates the lead screw arrangement 110 and the arm 111 to draw writing templates on blank paper. The holographic projection unit 113 provides writing guidance. The microphone 114 captures vocal responses to questions displayed on the display panel 103, and the detection module analyzes the database to identify probable learning disabilities. The user interface inbuilt in the computing unit enables remote communication with the housing’s 101 communication unit for session control, material management, and access to the database and detection module results.
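The error-frequency database and the detection module described above can be sketched as follows. The per-session error counters, the `letter_reversal` category, and the persistence threshold are illustrative assumptions; the specification leaves the actual detection criteria unspecified.

```python
# Minimal sketch of the error-frequency database and detection module
# of [0048]. Error categories and the threshold are assumed for illustration.

from collections import Counter

class ErrorDatabase:
    """Stores one counter of error types per tutoring session."""

    def __init__(self):
        self.sessions = []

    def record_session(self, error_types):
        self.sessions.append(Counter(error_types))

    def frequency_trend(self, error_type):
        """Per-session counts of one error type, tracking change over time."""
        return [c[error_type] for c in self.sessions]

def detect_probable_disability(db, error_type="letter_reversal", threshold=3):
    """Flag a probable difficulty when one error type persists across sessions."""
    trend = db.frequency_trend(error_type)
    persistent = len(trend) >= 2 and all(n >= threshold for n in trend)
    return "probable difficulty: " + error_type if persistent else None
```

A real detection module would likely use richer statistics over the stored trend, but the sketch shows the data flow: sessions feed the database, and the detection module reads the database rather than raw inputs.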
[0049] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) An assistive tutoring device, comprising:
i) a housing 101 having a plurality of motorised omnidirectional wheels 102 installed underneath the housing 101 for a locomotion of the housing 101;
ii) a touch enabled display panel 103 mounted over the housing 101 for displaying instructions and educational material;
iii) a speaker 104 installed on the housing 101 to produce audio associated with the instructions and educational content displayed by the display panel 103;
iv) an artificial intelligence-based imaging unit 105, installed on the housing 101 and integrated with a processor for recording and processing images in a vicinity of the housing 101, which captures facial expressions of the user to determine difficulty in learning, and triggers a microcontroller to record onto a connected memory the portions of the study material for which said difficulty was indicated, to accordingly actuate the display panel 103 and the speaker 104 to repeat said portions;
v) a plate 106 attached within the housing 101 by means of a pair of sliding units 107, extended outwards from the housing 101 via an opening 108 in the front surface of the housing 101, to enable a user to place written material onto the plate 106 for submission;
vi) an OCR (optical character recognition) sensor 109 installed within the housing 101, for extracting text from the submitted written material;
vii) an analysis module configured with the microcontroller, which receives the extracted text from the OCR sensor 109 to determine errors; and
viii) a dual axis lead screw arrangement 110 installed within the housing 101, with an articulated L-shaped telescopic arm 111 having a writing instrument 115 at an end to annotate the written material based on detected errors for reference of the user.
2) The device as claimed in claim 1, wherein a plurality of clippers 112 are arranged over the plate 106 to secure the written material over the plate 106.
3) The device as claimed in claim 1, wherein a database is linked with the microcontroller to store the errors and determine the change in frequency of each type of error committed over time.
4) The device as claimed in claim 1, wherein based on a command via the display panel 103, the lead screw arrangement 110 and the arm 111 are actuated to draw a template onto a blank paper for imparting training of writing to the user.
5) The device as claimed in claim 1, wherein a holographic projection unit 113 installed over the housing 101 to project images onto the paper being practiced on by the user for training of writing, to provide guidance to the user.
6) The device as claimed in claim 1, wherein a microphone 114 is mounted over the housing 101 to enable the user to provide vocal responses against questions displayed by the display panel 103.
7) The device as claimed in claim 1, wherein a detection module is configured with the microcontroller to analyse the database and determine a probable learning disability suffered by the user.
8) The device as claimed in claim 1, wherein a user interface is adapted to be installed onto a computing unit to enable communication with a communication unit provided in the housing 101, to initiate and halt tutoring sessions, select and modify study material, and access the database and the results generated by the detection module.
| # | Name | Date |
|---|---|---|
| 1 | 202521050909-STATEMENT OF UNDERTAKING (FORM 3) [27-05-2025(online)].pdf | 2025-05-27 |
| 2 | 202521050909-REQUEST FOR EXAMINATION (FORM-18) [27-05-2025(online)].pdf | 2025-05-27 |
| 3 | 202521050909-REQUEST FOR EARLY PUBLICATION(FORM-9) [27-05-2025(online)].pdf | 2025-05-27 |
| 4 | 202521050909-PROOF OF RIGHT [27-05-2025(online)].pdf | 2025-05-27 |
| 5 | 202521050909-POWER OF AUTHORITY [27-05-2025(online)].pdf | 2025-05-27 |
| 6 | 202521050909-FORM-9 [27-05-2025(online)].pdf | 2025-05-27 |
| 7 | 202521050909-FORM FOR SMALL ENTITY(FORM-28) [27-05-2025(online)].pdf | 2025-05-27 |
| 8 | 202521050909-FORM 18 [27-05-2025(online)].pdf | 2025-05-27 |
| 9 | 202521050909-FORM 1 [27-05-2025(online)].pdf | 2025-05-27 |
| 10 | 202521050909-FIGURE OF ABSTRACT [27-05-2025(online)].pdf | 2025-05-27 |
| 11 | 202521050909-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [27-05-2025(online)].pdf | 2025-05-27 |
| 12 | 202521050909-EVIDENCE FOR REGISTRATION UNDER SSI [27-05-2025(online)].pdf | 2025-05-27 |
| 13 | 202521050909-EDUCATIONAL INSTITUTION(S) [27-05-2025(online)].pdf | 2025-05-27 |
| 14 | 202521050909-DRAWINGS [27-05-2025(online)].pdf | 2025-05-27 |
| 15 | 202521050909-DECLARATION OF INVENTORSHIP (FORM 5) [27-05-2025(online)].pdf | 2025-05-27 |
| 16 | 202521050909-COMPLETE SPECIFICATION [27-05-2025(online)].pdf | 2025-05-27 |
| 17 | Abstract.jpg | 2025-06-12 |