
Multi Modal Typing Assistive System

Abstract: A multi-modal typing assistive system comprises a housing 101 with a hinged lid 102, housing a typewriter 103 on a pair of sliding units 104 for access; a pneumatic pusher 105 for locking; a vibration motor 106 for haptic feedback; a braille printing unit 107 with a grid of pneumatic pins 107b that prints braille; a flap 109 with a pair of clamps 110 that holds a document; a microphone 111 that receives voice commands to move the typewriter 103; a plurality of rubber pads 112 that prevent slippage; a pair of articulated telescopic bars 114 with C-shaped plates 113 that guide wrist position; a chamber 115 that stores paper, fetched and inserted by a robotic arm 116; a slider 117 with an articulated telescopic limb 118 and suction cup 119 that flips pages; and an artificial intelligence based camera 120 that captures the document and the typed text to detect errors, activating a speaker 122 for notification.


Patent Information

Application #
Filing Date
29 May 2025
Publication Number
25/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

Marwadi University
Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. Ronit Motivaras
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
2. Chandrasinh D Parmar
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
3. Nishith Kotak
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description: FIELD OF THE INVENTION

[0001] The present invention relates to a multi-modal typing assistive system that is capable of aiding users in text input through various methods, including both standard and tactile formats, and providing real-time feedback and error correction.

BACKGROUND OF THE INVENTION

[0002] Diverse challenges are faced by individuals with impairments. Visual impairments hinder the ability to read source text and typed content, leading to errors. Motor disabilities make precise keystrokes difficult, resulting in mistyping. Traditional typing methods often lack adequate feedback and error correction for these users. A multi-modal approach addresses these issues by offering alternative input methods such as voice commands, tactile feedback through Braille and haptic vibrations, and error detection. Challenges include designing intuitive interfaces, ensuring accurate text recognition across varying conditions, and providing seamless integration of multiple modalities to improve user experience and independence in text creation.

[0003] Traditionally, impaired individuals use adapted typewriters with features like key guards to prevent accidental presses or Braille overlays on keys for tactile typing. However, these systems lack automation. Referencing documents requires manual placement and page turning. Error detection relies solely on the user's limited senses or external assistance, without real-time feedback or automated correction. Generating Braille output necessitates separate means or manual Braille typewriters. These manual methods are often slow, prone to errors, and lack the integrated assistance for referencing, error correction, and multi-format output that an automated means provides, significantly hindering efficiency and independence.

[0004] US7706509B2 discloses a keyboard for blind people comprising a body carrying a plurality of keys and interface means for its connection to an external unit. The keys have a number of portions, each having an associated function of the key, with that function represented thereon in Braille characters. Moreover, the keys are rotatably connected to the body in discrete steps so as to select one of the portions of the key, and are also pressable in order to activate the function associated with the selected portion, so that each function is selected through rotation and subsequent pressure of the corresponding key portion.

[0005] US3611586 discloses a typewriter capable of being used as an instructional device for correlating visual images with printed words, by means of which a student can type the word identifying a visual image and thereafter compare the printed word associated with the image with the word the student has typed to verify the accuracy of the typed word.

[0006] Conventionally, many systems are available for assisting visually impaired individuals in typing. However, these systems fail to integrate multiple assistive features into a single, automated device, often requiring users to rely on separate tools for document referencing, error correction, and Braille output. This lack of a unified approach leads to a fragmented and less efficient typing experience.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that is capable of providing a comprehensive and automated typing assistance solution for impaired individuals, integrating functionalities for document handling, error detection, multi-modal feedback, and versatile output options.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a system that improves typing accuracy by detecting the intended text, preventing incorrect key presses, and guiding correct key selection during typing.

[0010] Another object of the present invention is to develop a system that is capable of assisting visually impaired users by enabling braille printing directly from the typed text, thus allowing easy access to printed material in braille format.

[0011] Another object of the present invention is to develop a system that is capable of enhancing user comfort and maintaining proper wrist alignment by supporting correct posture while typing, especially during extended periods of continuous use.

[0012] Yet, another object of the present invention is to develop a system that is capable of improving user convenience by allowing voice commands and automating paper loading and handling, thus reducing the need for manual effort during typing tasks.

[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0014] The present invention relates to a multi-modal typing assistive system that improves text entry for users through a combination of visual and tactile modalities, offering features such as document referencing, error detection, and output in both standard and accessible formats. Thus, it provides a comprehensive and user-friendly typing experience.

[0015] According to an embodiment of the present invention, a multi-modal typing assistive system is disclosed comprising: a housing configured with a hinged lid, having a typewriter positioned within the housing; a pair of sliding units arranged within the housing for inward and outward sliding of the typewriter for access; a pressure sensor embedded on each of the keys of the typewriter to detect keypresses; a pneumatic pusher provided underneath each of the key caps of the typewriter for automated locking of the keys; a text correction module configured with a control unit to detect the text being typed via the pressure sensors and predict upcoming text, so as to actuate the pushers to extend and lock keys unrelated to the predicted text to prevent mistyping; a vibration motor configured with each of the keys to provide haptic feedback to the user upon pressing of the keys; a braille printing unit installed with the typewriter by means of a support frame, comprising a printing bar having a grid of pneumatic pins to print a braille pattern onto the paper in a braille mode, in response to the detected text being typed based on readings of the pressure sensors; and a flap attached to a lateral portion of the housing, having a pair of clamps mounted on the flap for gripping a document placed on the flap for typing from.

[0016] According to another embodiment of the present invention, the present system further comprises: a microphone provided over the housing, connected with the control unit to receive voice commands from a user to actuate the sliding units to translate the typewriter inwards and outwards; a plurality of rubber pads arranged over the sliding units to prevent slippage of the typewriter; a pair of C-shaped plates attached to a front edge of the housing by means of articulated telescopic bars, to guide the user's wrist into a correct position while typing, based on the position of the user's wrist detected by the camera; a chamber provided at a rear portion of the housing to store papers for typing onto; a robotic arm installed at the rear portion of the housing to fetch the paper from the chamber and insert it into the typewriter; a slider installed horizontally over the flap with an articulated telescopic limb mounted on the slider, having a suction cup at an end to flip pages of the document; an artificial intelligence based camera installed on the housing by means of an articulated telescopic arm and configured with an OCR (optical character recognition) module to capture the text of the document from which the user is typing and the text being typed by the user, in order to determine errors; and a speaker mounted on the housing to generate an audio alert regarding a detected error.

[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a multi-modal typing assistive system.

DETAILED DESCRIPTION OF THE INVENTION

[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0022] The present invention relates to a multi-modal typing assistive system that facilitates text input by enabling users to reference documents, receive real-time feedback on typed content, and generate output in multiple formats, thereby enhancing efficiency and accessibility in text creation.

[0023] Referring to Figure 1, an isometric view of a multi-modal typing assistive system is illustrated, comprising: a housing 101 configured with a hinged lid 102, having a typewriter 103 positioned within the housing 101; a pair of sliding units 104 arranged within the housing 101; a pneumatic pusher 105 provided underneath each of the key caps of the typewriter 103; a vibration motor 106 configured with each of the keys; a braille printing unit 107 installed with the typewriter 103 by means of a support frame 108, comprising a printing bar 107a having a grid of pneumatic pins 107b; a flap 109 attached to a lateral portion of the housing 101, having a pair of clamps 110 mounted on the flap 109; a microphone 111 provided over the housing 101; a plurality of rubber pads 112 arranged over the sliding units 104; a pair of C-shaped plates 113 attached to a front edge of the housing 101 by means of articulated telescopic bars 114; a chamber 115 provided at a rear portion; a robotic arm 116 installed at the rear portion of the housing 101; a slider 117 installed horizontally over the flap 109 with an articulated telescopic limb 118 mounted on the slider 117, having a suction cup 119; an artificial intelligence based camera 120 installed on the housing 101 by means of an articulated telescopic arm 121; and a speaker 122 mounted on the housing 101.

[0024] The system disclosed herein includes a housing 101 that is developed to be positioned on a flat surface in proximity to a user. The housing 101 herein incorporates all the components of the system to enable visually impaired users to type in both standard and Braille languages onto a paper.

[0025] The housing 101 is installed with a push button, accessed by the user to activate the system for performing the required operations. When the user presses the push button, the electrical circuit is completed, which in response turns the system on. The push button is integrated with an actuator and a spring, which are automatically activated when pressed. They work together to move the internal contact, completing the circuit and allowing electrical current to flow, thereby activating a control unit including, but not limited to, a microcontroller.

[0026] The microcontroller associated with the system is pre-fed to detect signals and actuate or activate the required components of the system. The microcontroller used herein is pre-fed using artificial intelligence and machine learning protocols to coordinate the working of the system. Further, the microcontroller activates a pair of clamps 110 located on a flap 109 installed at the side of the housing 101 to secure a document placed there by the user for typing reference.

[0027] The clamps 110 are powered by a DC motor, which the microcontroller activates by supplying the required electric current. Inside the motor, a coil transforms this electrical energy into a magnetic field, thereby generating mechanical force. This mechanical force is then used by the clamps 110 to firmly grip the document placed by the user on the flap 109, ensuring it remains stable for typing reference.

[0028] Upon securing the user's reference document over the flap 109, the microcontroller actuates an articulated telescopic arm 121 installed with the housing 101 to position an artificial intelligence based camera 120, configured with the arm, above the document for source text understanding. The arm has articulated joints, which allow for flexible movement and precise positioning of the camera 120. The telescopic arm 121 is linked to a pneumatic unit, including an air compressor, air cylinders, air valves, and a piston, which work in collaboration to aid in extension and retraction of the arm. The pneumatic unit is operated by the microcontroller, such that the microcontroller actuates a valve to allow passage of compressed air from the compressor into the cylinder; the compressed air develops pressure against the piston, pushing and extending it. The piston is connected with the arm, so the applied pressure extends the arm; similarly, the microcontroller retracts the telescopic arm 121 by closing the valve, resulting in retraction of the piston. Thus, the microcontroller regulates the extension and retraction of the arm in order to position the camera 120 over the document.

[0029] Once the camera 120 is positioned above the document, the microcontroller activates the camera 120, which is configured with an OCR (optical character recognition) module to capture multiple images of the text of the document from which the user is typing. This allows the system to understand the source text and detect errors in the text being typed by the user.

[0030] The camera 120 comprises an image capturing module, featuring a set of lenses that acquire multiple images of the document. These captured images are stored as optical data within the camera's memory. Crucially, the camera 120 also incorporates a processor that is not only programmed with artificial intelligence protocols but also integrated with an Optical Character Recognition (OCR) module. This processor analyzes the stored optical data, leveraging the artificial intelligence and OCR capabilities to identify and extract the textual information from the images. The extracted text is then transformed into digital pulses and bits, which are further transmitted to the microcontroller. The microcontroller subsequently processes this received digital data to perform its designated functions, such as error detection in the user's typing based on the recognized source text. This integration of image capture, OCR processing, and digital conversion enables efficient text recognition and data transfer within the system.

[0031] The Optical Character Recognition (OCR) sensor functions by processing the captured images through a series of steps to identify and convert the visual representation of characters into machine-readable text. Initially, the image undergoes preprocessing to improve its quality, which might involve adjusting contrast, reducing noise, or correcting skew. Next, the OCR protocol analyzes the image to locate regions containing text. Once text areas are identified, the protocol segments the text into individual lines and then into individual characters. Character recognition is the core of the process, where protocols compare the shape and features of each segmented character against a database of known characters that is linked with the microcontroller. This comparison often involves feature extraction, where unique characteristics of each character (like loops, lines, and curves) are identified and matched. Finally, the recognized characters are assembled into words and sentences, generating a digital text output that is further processed or utilized by the system.
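By way of illustration only, the character-matching step described above can be sketched as follows. This is a toy example, not the claimed implementation: each segmented character is reduced to a tiny binary bitmap and compared against a database of known glyphs by counting differing pixels, with the closest glyph winning. The 3x3 glyph shapes are invented stand-ins, not real font data.

```python
# Toy sketch of OCR character matching: compare a segmented glyph
# bitmap against a small database of known characters and pick the
# closest match. Bitmaps are 3 rows of 3 bits each (invented shapes).

KNOWN_GLYPHS = {
    "I": (0b010, 0b010, 0b010),   # vertical stroke
    "L": (0b100, 0b100, 0b111),   # vertical stroke with a base
    "T": (0b111, 0b010, 0b010),   # top bar with a stem
}

def hamming(a, b):
    """Count differing pixels between two 3-row binary bitmaps."""
    return sum(bin(ra ^ rb).count("1") for ra, rb in zip(a, b))

def recognize(bitmap):
    """Return the known character whose bitmap differs the least."""
    return min(KNOWN_GLYPHS, key=lambda ch: hamming(KNOWN_GLYPHS[ch], bitmap))

# A slightly noisy "T" (one pixel flipped in the bottom row) still
# matches "T", since only one pixel differs versus three or more for
# the other glyphs:
assert recognize((0b111, 0b010, 0b011)) == "T"
```

A production OCR module would operate on much larger bitmaps and use learned features rather than raw pixel distance, but the compare-against-a-database structure is the same as described in the paragraph above.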

[0032] A slider 117 is provided horizontally over the flap 109, with an articulated telescopic limb 118 configured on the slider 117 having a suction cup 119 at an end, and is actuated by the microcontroller to flip the pages of the document, enabling scanning of the entire document. The extension and retraction of the telescopic limb 118 work in the same manner as disclosed above and are regulated by the microcontroller so that the suction cup 119 flips the pages of the document when needed.

[0033] The silicone rubber suction cup 119 efficiently creates a vacuum seal by expelling air, ensuring a firm, slip-resistant grip on the page. Engineered for easy attachment and release, the suction cup 119 maintains its holding power, facilitating convenient page flipping as needed.

[0034] In conjunction with the movement of the limb 118, the microcontroller controls the slider 117 for accurate page flipping. The slider 117 comprises a pair of grooved sliding rails that guide the wheels of a sliding unit. This unit is connected to a bi-directional motor via a shaft. The microcontroller drives the motor to rotate clockwise or counter-clockwise, which in turn rotates the shaft. The motor converts electrical energy into rotational motion of the shaft, causing the wheels to move along the grooves of the sliding rails with a secure grip. Consequently, the movement of the sliding unit translates the articulated telescopic limb 118 across the flap 109.

[0035] After processing the source text, the microcontroller activates a microphone 111 integrated into the housing 101 to receive voice commands from the user for presenting the typewriter 103 positioned within the housing 101. This microphone 111 contains a diaphragm connected to a moving coil. When the user's voice waves strike the diaphragm, the coil vibrates within a magnetic field, generating an electrical current signal. This signal is then transmitted to the microcontroller, which processes the user's voice command to provide the typewriter 103.

[0036] Upon processing the user’s voice command, the microcontroller actuates a hinged lid 102 on the housing 101 to open its front. This hinge consists of a pair of leaves, each screwed to the lid and the housing 101. These leaves are connected by a cylindrical member integrated with a shaft, which is coupled to a DC motor to power the hinge's movement. Clockwise and counter-clockwise rotation of the shaft causes the hinge to open and close the lid, respectively. Therefore, the microcontroller actuates the hinge, which in turn moves the lid to open the front of the housing 101.

[0037] With the housing 101 open, the microcontroller actuates a pair of sliding units 104 installed inside the housing 101 to move the typewriter 103 inward and outward. The sliding units 104 work in the same manner as the slider 117 disclosed above and are actuated by the microcontroller to translate the typewriter 103 outward from the housing 101. Multiple rubber pads 112 (preferably in the range of 4-6) are arranged on the sliding units 104 to prevent slippage of the typewriter 103.

[0038] Based on the user's wrist position detected by the camera 120, the microcontroller actuates a pair of articulated telescopic bars 114 installed at the front edge of the housing 101, each configured with a C-shaped plate 113, to guide the wrist into a correct typing position. The telescopic bars 114 work in the same manner as the telescopic arm 121 disclosed above and are controlled by the microcontroller to position the plates 113 in contact with the user's wrists to guide them into the correct position while typing.

[0039] As the user presses the typewriter's keys, the microcontroller activates a pressure sensor embedded in each key to detect the keypresses. The pressure sensor comprises a sensing element, known as a diaphragm, that experiences the force exerted by the user's finger on the keys of the typewriter 103 while typing. This force deflects the diaphragm; the deflection is measured by the sensor and converted into an electrical signal, which is sent to the microcontroller for detecting the keypresses.

[0040] Upon the user pressing a key, the pressure sensor detects the input, and the microcontroller immediately actuates a corresponding vibration motor 106 configured with each of the keys to provide haptic feedback. This vibration motor 106, including but not limited to an eccentric rotating mass (ERM) motor or a linear resonant actuator (LRA), generates mechanical vibrations upon receiving an electrical current. The microcontroller selectively activates these motors 106 in real time based on detected keypresses, sending electrical pulses to the appropriate motor 106. The resulting vibration on the pressed key serves as tactile confirmation of a successful input.

[0041] When a key is pressed by the user, the microcontroller activates a text correction module configured with the microcontroller to analyze the typed text and predict upcoming text. The text correction module analyzes the text being typed by the user in real time by employing various natural language processing (NLP) techniques. The text correction module typically maintains a buffer of recently typed words and characters. As new input arrives, the module compares it against a vast lexicon and statistical language models. These language models, often trained on massive datasets of text, capture the probabilities of word sequences and the likelihood of certain words following others. By identifying potential misspellings based on dictionary lookups and calculating the statistical probability of the current word sequence, the module suggests corrections for misspelled words. Furthermore, to predict upcoming text, the module leverages the same language models. Based on the context of the already typed words, it analyzes the probabilities of various words that might logically follow. The module considers grammatical rules, common phrases, and contextual relevance to offer suggestions for the next word or even a short phrase, aiming to improve typing speed and accuracy. The module is also capable of providing personalized predictions based on the user's typing history and patterns by employing machine learning protocols.
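The statistical prediction described above can be illustrated with a bigram language model, the simplest form of the word-sequence probabilities mentioned: count which words follow which in a corpus, then rank candidates for the next word by frequency. This is a hedged sketch; the training sentence is an invented placeholder, and a real module would use far larger models.

```python
# Illustrative bigram next-word predictor: count word-to-next-word
# transitions in a corpus, then rank candidates by observed frequency.

from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Map each word to a Counter of the words observed after it."""
    follows = defaultdict(Counter)
    words = corpus.lower().split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(follows, word, k=2):
    """Return up to k most likely words to follow `word`."""
    return [w for w, _ in follows[word.lower()].most_common(k)]

model = train_bigrams(
    "the quick fox jumps over the quick dog and the quick fox sleeps"
)
# "quick" was followed by "fox" twice and "dog" once in the corpus:
assert predict_next(model, "quick") == ["fox", "dog"]
```

The module in the specification layers dictionary lookups, grammar rules, and per-user adaptation on top of this core frequency-ranking idea.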

[0042] Based on the prediction from the text correction module, the microcontroller actuates a pneumatic pusher 105 provided underneath each of the key caps of the typewriter 103 to extend and lock keys that are unrelated to the predicted text, aiming to prevent mistyping. The extension and retraction of the pneumatic pushers 105 work in the same manner as the telescopic arm 121, by employing the pneumatic unit controlled by the microcontroller, to lock the unrelated keys and prevent mistakes. Simultaneously, the camera 120 also captures the text being typed by the user and compares this typed text with the source text captured earlier to determine errors.
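The two decisions in the paragraph above, which keys to lock and where the typed text diverges from the source, can be sketched as follows. This is an assumption-laden toy: the letter-key layout and the use of a sequence comparison are illustrative choices, not the patented mechanism.

```python
# Sketch of (a) computing the lock set: every letter key not needed
# for the predicted text gets its pusher extended; and (b) flagging
# errors by comparing the typed text against the captured source text.

import difflib

ALL_KEYS = set("abcdefghijklmnopqrstuvwxyz")  # assumed letter-key layout

def keys_to_lock(predicted_text):
    """Lock every letter key that does not appear in the predicted text."""
    allowed = {c for c in predicted_text.lower() if c in ALL_KEYS}
    return ALL_KEYS - allowed

def find_errors(source, typed):
    """Return (source_fragment, typed_fragment) pairs that differ."""
    matcher = difflib.SequenceMatcher(None, source, typed)
    return [(source[i1:i2], typed[j1:j2])
            for tag, i1, i2, j1, j2 in matcher.get_opcodes()
            if tag != "equal"]

# If the module predicts "cat", only the c, a, and t keys stay free:
assert keys_to_lock("cat") == ALL_KEYS - {"c", "a", "t"}
# Typing "cot" against source "cat" flags the single wrong character:
assert find_errors("cat", "cot") == [("a", "o")]
```

In the system itself, the lock set would drive the pneumatic valves and the error list would trigger the speaker 122; here both are returned as plain Python values for clarity.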

[0043] Upon detection of errors, the microcontroller actuates a speaker 122 installed on the housing 101 to generate an audio alert notifying the user of the detected error. The speaker 122 works by converting an electrical signal into an audio signal. The speaker 122 consists of a cone, known as a diaphragm, attached to a coil of wire placed between two magnets. When the electric signal is passed through the voice coil, a varying magnetic field is generated by the coil that interacts with the magnets, causing the diaphragm to move back and forth. The movement of the diaphragm pushes and pulls air, creating sound waves corresponding to the electrical signal received, which are used to notify the user of the detected error.

[0044] When the user specifies a braille typing mode via the microphone 111, the microcontroller actuates a robotic arm 116 installed at the rear portion of the housing 101 to lock the carriage of the typewriter 103, ensuring proper braille alignment and spacing. The robotic arm 116 comprises a robotic link and a clamp attached to the link. The robotic link is made of several segments that are attached together by joints, also referred to as axes. Each joint contains a stepper motor that rotates and allows the robotic link to complete a specific motion of the arm. Upon actuation of the robotic arm 116 by the microcontroller, the motors drive the movement of the clamp to lock the carriage of the typewriter 103.

[0045] Once the carriage of the typewriter 103 is locked via the robotic arm 116, the microcontroller activates a braille printing unit 107 installed with the typewriter 103 by means of a support frame 108 to print the braille pattern onto the paper. The braille printing unit 107 comprises a printing bar 107a having a grid of pneumatic pins 107b to print the braille pattern onto the paper, based on the text detected from the user's typing as read by the pressure sensors. The extension and retraction of the pneumatic pins 107b work in the same manner as the telescopic arm 121 disclosed above, by employing the pneumatic unit actuated by the microcontroller to print the braille pattern on the paper.

[0046] A chamber 115 is provided at a rear portion of the housing 101 to store papers for typing onto. The paper is fetched by the robotic arm 116 from the chamber 115 and inserted into the typewriter 103.

[0047] In an embodiment of the present invention, multiple suction cups (preferably in the range of 4-6) are installed underneath the housing 101 for adhering the housing 101 to the surface. These suction cups work in the same manner as the suction cup 119 disclosed above, securing the housing 101 to the surface.

[0048] Lastly, a battery (not shown in figure) is associated with the system to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode. The battery uses oxidation/reduction chemical reactions to do work on charge and produce a voltage between the anode and cathode, thereby producing the electrical energy used to do work in the system.

[0049] The present invention works best in the following manner, where the housing 101 is configured with the hinged lid 102 and includes the typewriter 103 mounted on the pair of sliding units 104 to enable inward and outward translation for ease of access. Each key of the typewriter 103 is embedded with the pressure sensor to detect keypresses and integrated with the pneumatic pusher 105 for automated locking of the keys based on text predictions generated by the text correction module. The vibration motor 106 is configured beneath each key to provide haptic feedback during typing. The system features the braille printing unit 107 installed on the support frame 108 with the typewriter 103, comprising the printing bar 107a having the grid of pneumatic pins 107b to print braille patterns on paper in the braille mode. The robotic arm 116 locks the carriage of the typewriter 103 when the braille mode is activated by the user. The flap 109 attached laterally to the housing 101 includes the pair of clamps 110 for holding the document while typing. The microphone 111 connected to the control unit receives voice commands to actuate the sliding units. The rubber pads 112 are arranged over the sliding units 104 to prevent typewriter 103 slippage. The pair of C-shaped plates 113 is attached to the front edge of the housing 101 through articulated telescopic bars 114 to guide the user's wrist into the correct typing position based on wrist location detected by the camera 120. The chamber 115 stores paper, and the robotic arm 116 fetches and loads paper into the typewriter 103. The horizontally installed slider 117 over the flap 109 includes the articulated telescopic limb 118 with the suction cup 119 to flip document pages. The camera 120 configured with the OCR module detects typed and source text, triggering the speaker 122 to alert the user of typing errors.

[0050] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:

1) A multi-modal typing assistive system, comprising:

i) a housing 101 configured with a hinged lid 102, having a typewriter 103 positioned within the housing 101;

ii) a pair of sliding units 104 arranged within the housing 101 for an inward and outward sliding of the typewriter 103 for access;

iii) a pressure sensor embedded on each of the keys of the typewriter 103, to detect keypresses;

iv) a pneumatic pusher 105 provided underneath each of the key caps of the typewriter 103 for automated locking of the keys;

v) a text correction module configured with a control unit to detect text being typed via the pressure sensors and predict upcoming text, so as to actuate the pushers 105 to extend and lock keys unrelated to the predicted text to prevent mistyping;

vi) a vibration motor 106 configured with each of the keys to provide haptic feedback to the user upon pressing of the keys;

vii) a braille printing unit 107 installed with the typewriter 103 by means of a support frame 108, comprising a printing bar 107a having a grid of pneumatic pins 107b to print a braille pattern onto the paper in a braille mode, in response to the detected text being typed based on readings of the pressure sensors, wherein in the braille mode a robotic arm 116 is actuated to lock the carriage of the typewriter 103; and

viii) a flap 109 attached to a lateral portion of the housing 101, having a pair of clamps 110 mounted on the flap 109 for gripping a document placed on the flap 109 for typing from.

2) The system as claimed in claim 1, wherein a microphone 111 is provided over the housing 101, connected with a control unit to receive voice commands from a user to actuate the sliding unit to translate the typewriter 103 inwards and outwards.

3) The system as claimed in claim 1, wherein a plurality of rubber pads 112 are arranged over the sliding units 104 to prevent slippage of the typewriter 103.

4) The system as claimed in claim 1, wherein a pair of C-shaped plates 113 is attached to a front edge of the housing 101 by means of articulated telescopic bars 114, to guide the user's wrist into a correct position while typing, based on the position of the user's wrist detected by the camera 120.

5) The system as claimed in claim 1, wherein a chamber 115 is provided at a rear portion of the housing 101 to store papers for typing onto.

6) The system as claimed in claim 1, wherein a robotic arm 116 is installed at the rear portion of the housing 101 to fetch the paper from the chamber 115 and insert it into the typewriter 103.

7) The system as claimed in claim 1, wherein a slider 117 is installed horizontally over the flap 109, with an articulated telescopic limb 118 mounted on the slider 117 having a suction cup 119 at an end to flip pages of the document.

8) The system as claimed in claim 1, wherein an artificial intelligence based camera 120 is installed on the housing 101 by means of an articulated telescopic arm 121 and configured with an OCR (optical character recognition) module to capture the text of the document from which the user is typing and the text being typed by the user to determine errors, and to actuate a speaker 122 mounted on the housing 101 to generate an audio alert regarding the detected error.

Documents

Application Documents

# Name Date
1 202521052017-STATEMENT OF UNDERTAKING (FORM 3) [29-05-2025(online)].pdf 2025-05-29
2 202521052017-REQUEST FOR EXAMINATION (FORM-18) [29-05-2025(online)].pdf 2025-05-29
3 202521052017-REQUEST FOR EARLY PUBLICATION(FORM-9) [29-05-2025(online)].pdf 2025-05-29
4 202521052017-PROOF OF RIGHT [29-05-2025(online)].pdf 2025-05-29
5 202521052017-POWER OF AUTHORITY [29-05-2025(online)].pdf 2025-05-29
6 202521052017-FORM-9 [29-05-2025(online)].pdf 2025-05-29
7 202521052017-FORM FOR SMALL ENTITY(FORM-28) [29-05-2025(online)].pdf 2025-05-29
8 202521052017-FORM 18 [29-05-2025(online)].pdf 2025-05-29
9 202521052017-FORM 1 [29-05-2025(online)].pdf 2025-05-29
10 202521052017-FIGURE OF ABSTRACT [29-05-2025(online)].pdf 2025-05-29
11 202521052017-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-05-2025(online)].pdf 2025-05-29
12 202521052017-EVIDENCE FOR REGISTRATION UNDER SSI [29-05-2025(online)].pdf 2025-05-29
13 202521052017-EDUCATIONAL INSTITUTION(S) [29-05-2025(online)].pdf 2025-05-29
14 202521052017-DRAWINGS [29-05-2025(online)].pdf 2025-05-29
15 202521052017-DECLARATION OF INVENTORSHIP (FORM 5) [29-05-2025(online)].pdf 2025-05-29
16 202521052017-COMPLETE SPECIFICATION [29-05-2025(online)].pdf 2025-05-29
17 Abstract.jpg 2025-06-17