Abstract: A multimodal ergonomic input device comprises a handheld unit 101 with an ergonomic shape conforming to a user’s palm and featuring at least one clickable button 102; a platform 103 that interfaces with the handheld unit 101 and includes suction units 104 at its base for stable surface adherence; an imaging unit 105 that monitors screen activity and user behavior in real time; at least one solenoid actuator 106, coupled with the click buttons, that automates clicking; a thermal regulation unit 109 with a temperature sensor 109a and a Peltier unit 109b that maintains optimal cushion padding 110 temperature; a ball transfer conveyor 111, integrated into the platform 103, with rotating ball elements and touch sensors for cursor control; a motorized gear train that scrolls via gestures; a health monitoring unit 113 that provides real-time analytics; and a thermocouple that triggers a cooling fan 115 based on heat feedback.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to peripheral devices, and in particular to a multimodal ergonomic input device developed to ease user interaction with computing means while limiting usage according to the type of user.
BACKGROUND OF THE INVENTION
[0002] In today’s digital environment, users frequently engage with computing devices for extended periods, leading to physical strain, reduced efficiency, and increased risk of repetitive stress injuries. Traditional input devices often lack adaptability to user behaviour, emotional state, or environmental context, resulting in discomfort and inefficient interaction. The growing need for intuitive, health-conscious, and responsive means has led to demand for multimodal ergonomic input devices that combine physical comfort with automated interaction. However, users face challenges such as limited gesture recognition, poor adaptability to stress indicators, and a lack of real-time feedback. Addressing these issues requires the integration of multiple input modes, behavioural analysis, and adaptive feedback means to improve comfort, safety, and overall interaction quality.
[0003] Traditionally, input devices such as standard computer mice, trackpads, and basic ergonomic controllers offer limited functionality focused primarily on physical comfort or cursor control. While some models include basic gesture recognition or voice input, they often lack real-time behavioral analysis, adaptive response means, or integrated health monitoring. Devices like vertical mice or stylus-based means reduce wrist strain but fail to address cognitive load or user stress. Others with gesture support struggle with accuracy in dynamic environments. Additionally, most devices do not integrate multiple modalities—such as voice, gesture, and touch—into a seamless interface. These limitations highlight the need for a more comprehensive, automated input solution that improves interaction while supporting the user’s physical and mental well-being.
[0004] US6556150B1 discloses a computer input device that comfortably supports the hand of the user while the thumb and fingers are associated with buttons, a trackball, and a scrolling wheel carried on the device. The overall configuration of the device and the arrangement of these actuators permit the user to easily and effectively operate all of the functions provided by the actuators while the hand and arm of the user are in a comfortable position. The device includes a housing having an asymmetrical dividing ridge forming a “thumb-side surface,” for placement of the thumb, and a “finger-side surface,” for placement of the remaining four fingers. The finger-side surface includes a ball rotatably mounted in the housing and further includes first and second input buttons disposed to the side of the ball which may be used as “forward” and “back” buttons. The first and second input buttons adjacent to the ball may be replaced by a linearly slidable or pivotally movable actuator. The movable member is preferably of the three-position-type enabling the user to activate the movable member in a forward or a backward direction for transmission of respective signals to the computer. The thumb-side surface includes a horizontally-disposed scrolling wheel and third and fourth buttons on opposite sides of the scrolling wheel that can be used as “primary” and “secondary” buttons. The primary button has a curved bottom portion that serves as a thumb rest, and may be used as such without activating the button as the button is movable laterally towards the fingers.
[0005] US4862165A discloses an ergonomically-shaped hand controller, of the type commonly referred to as a "mouse", that is specially configured to prevent or reduce hand muscle fatigue despite continuous use over a protracted period of time. A housing for receiving the anterior surface of the hand comprises an arched metacarpophalangeal support surface, distal phalange support surfaces for the volar pads of the thumb and forefinger, and a medial ledge for supporting the remaining three ulnar fingers in a partially wrapped configuration with flexion of the distal, middle and proximal phalanges.
[0006] Conventionally, many devices available in the market provide multimodal and ergonomic input means. However, these devices are not compatible with each and every computing device. These devices not only require substantial changes in the existing computing devices, i.e. laptops, computers, etc., but also involve extensive manufacturing and programming cost.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device that is capable of enabling dynamic user interaction with the computing devices without any structural or software based changes in the existing computing devices, thus promoting industrial application.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a device that is capable of offering an input means that ensures smooth, accurate, and efficient control during computing tasks while reducing physical strain through ergonomic design and adaptive interaction features.
[0010] Another object of the present invention is to develop a device that continuously observes user behavior and screen activity in real time, enabling the device to respond appropriately and improve both user engagement and digital safety.
[0011] Another object of the present invention is to develop a device that is capable of enabling automatic control of device functions based on detected user inputs or conditions, improving operational accuracy, responsiveness, and ease of use during extended usage.
[0012] Yet another object of the present invention is to develop a device that is capable of regulating ambient conditions to provide convenience to the user while operating the device.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to a multimodal ergonomic input device that is capable of adapting to user needs, ensuring comfort, automating interactions, and actively monitoring both user behavior and digital content for improved safety and efficiency.
[0015] According to an embodiment of the present invention, a multimodal ergonomic input device comprises: a handheld unit having an ergonomic shape conforming to a user’s palm, configured as a user-operable input means for a computing unit, the handheld unit comprising at least one clickable button; a platform adapted to interface with the handheld unit, the platform having suction units disposed at its bottom portion for adherence to a stable surface; an imaging unit linked with an analysis module integrated with the handheld unit and configured to monitor screen activity and user behavior in real time; at least one actuator coupled with the click buttons of the handheld unit to automate clicking operations upon receiving input from at least one of a voice command module and a gesture recognition module linked with the imaging unit, and to lock clicking operation as per requirement; a gesture controlled scrolling means arranged on the handheld unit, actuated by user gestures and modulated by a deep learning protocol, wherein the analysis module is trained to detect inappropriate screen content and, upon detection, arrests the actuators for user interaction and transmits an alert to a guardian’s interface wirelessly linked with the device; and a thermal regulation unit comprising a temperature sensor and a Peltier unit, arranged within the handheld unit to maintain optimal operational temperature of the cushion padding fabricated over the handheld unit for providing comfort to the user during usage of the handheld unit.
[0016] According to another embodiment of the present invention, the device further comprises: a ball transfer conveyor integrated with the platform that enables cursor movement when the handheld unit reaches a physical boundary of the platform, as detected via at least one touch sensor installed along the physical boundary of the platform, the ball transfer conveyor comprising rotating ball elements embedded in the platform surface and configured to interface with the handheld unit; a scrolling means comprising a motorized gear train arrangement located between the click buttons and operable through hand or finger gestures recognized by the imaging unit; a rotation sensing unit electrically connected to the ball transfer conveyor and comprising at least one rotary encoder or Hall-effect sensor configured to detect angular displacement and rotation speed of individual ball elements, thereby enabling precise tracking of cursor movement or device position; a health monitoring unit embedded within the handheld unit and configured to monitor at least heart rate, blood oxygen levels, and skin conductance via corresponding biosensors to provide real-time health analytics on a projection unit installed with the handheld unit; the gesture controlled scrolling means comprising a deep learning based control module trained to dynamically adapt scrolling speed in real time according to the velocity, frequency, and direction of user gestures captured by the imaging unit; and a thermocouple included in the platform for detecting dissipated heat from the handheld unit, which initiates activation of a cooling fan mounted with the platform to dissipate excess heat based on feedback from the thermocouple.
[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a multimodal ergonomic input device.
DETAILED DESCRIPTION OF THE INVENTION
[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0022] The present invention relates to a multimodal ergonomic input device that is capable of providing intuitive user interaction, enhancing comfort during prolonged use, monitoring user behavior and environment in real time, and adapting input functions to improve efficiency, safety, and overall user experience.
[0023] Referring to Figure 1, an isometric view of a multimodal ergonomic input device is illustrated, comprising: a handheld unit 101 having an ergonomic shape, the handheld unit 101 comprising at least one clickable button 102; a platform 103 adapted to interface with the handheld unit 101, the platform 103 having suction units 104; an imaging unit 105 integrated with the handheld unit 101; at least one actuator 106 coupled with the click buttons of the handheld unit 101; a voice command module 107 integrated with the platform 103; a gesture controlled scrolling means 108 arranged on the handheld unit 101; a thermal regulation unit 109 comprising a temperature sensor 109a and a Peltier unit 109b, arranged within the handheld unit 101; a cushion padding 110 fabricated over the handheld unit 101; a ball transfer conveyor 111 integrated with the platform 103; a motorized gear train arrangement 112, forming part of the scrolling means 108, located between the click buttons; a health monitoring unit 113 embedded within the handheld unit 101; a projection unit 114 installed with the handheld unit 101; and a cooling fan 115 mounted with the platform 103.
[0024] The device disclosed herein includes a handheld unit 101 developed with an ergonomic shape that conforms efficiently to a user’s palm, and a platform 103 adapted to interface with the handheld unit 101. Both the handheld unit 101 and the platform 103 incorporate all necessary components to provide an ergonomic input means that can be connected with a computing means.
[0025] The user places the handheld unit 101 onto the platform 103 for seamless interaction and precise input control. The handheld unit 101 includes two clickable buttons 102 to enable user commands and selections over the computing means. The platform 103 is secured to a stable surface by suction units 104 located on its bottom portion, ensuring a stable base.
[0026] The suction units 104 function as a securing means that holds the platform 103 firmly in place on a flat, stable surface. The suction units 104 operate based on the principle of creating a vacuum seal between the suction pad and the surface below. Each suction unit typically comprises a flexible rubber or silicone cup that, when pressed against the surface, expels the air trapped inside. Once the pressure is released, the absence of air between the suction cup and the surface creates a low-pressure zone, resulting in a vacuum. This vacuum generates a strong adhesive force that keeps the platform 103 securely attached, preventing unintended movement during use.
[0027] A push button is installed with the handheld unit 101 and the platform 103 for activating and deactivating the device. The push button is accessed by the user for activating the device. When the user presses the push button, the electrical circuit is completed, which in response turns the device on. The push button is integrated with an actuator and a spring, which are automatically activated when pressed. They work together to move the internal contact, completing the circuit and allowing electrical current to flow, thereby activating the device.
[0028] When the push button is pressed, the button sends a signal (usually a change in voltage or current) to an inbuilt microcontroller associated with the device to either power up or shut down the device. Conversely, releasing the button allows the spring to return to its original position, breaking the circuit and sending the signal to deactivate the device. The microcontroller is pre-programmed to detect this signal and respond accordingly.
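For illustration only, the following minimal Python sketch simulates the power-toggle behaviour described above; read_button_pin and set_device_power are hypothetical stand-ins for the microcontroller’s GPIO and power-control calls, and the debounce interval is an assumed value.

import time

def read_button_pin():
    # Hypothetical GPIO read; returns True while the push button is held down.
    return False

def set_device_power(on):
    # Hypothetical power-control call that activates or deactivates the device.
    print("device", "activated" if on else "deactivated")

def button_loop(debounce_s=0.05, poll_s=0.01):
    powered = False
    last_state = False
    while True:
        state = read_button_pin()
        if state and not last_state:      # rising edge: button just pressed
            time.sleep(debounce_s)        # wait out mechanical contact bounce
            if read_button_pin():         # still pressed after the debounce window
                powered = not powered     # toggle between power-up and shut-down
                set_device_power(powered)
        last_state = state
        time.sleep(poll_s)                # polling interval

In use, button_loop() would run as the firmware’s background polling loop.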
[0029] The handheld unit 101 serves as the primary user-operable input for a computing unit. The user interacts with the computing unit by manipulating the handheld unit 101 (e.g., for cursor movement) and using its clickable button(s) 102.
[0030] Once activated, the microcontroller initiates a communication module, which is linked to the microcontroller to establish a wireless connection with the computing unit (including, but not limited to, a smartphone, tablet, or laptop). Through this connection, the user operates the handheld unit 101 to interact with the computing unit via the integrated user interface, enabling real-time input, control, and feedback.
[0031] The communication module used herein includes, but is not limited to, a Wi-Fi (Wireless Fidelity) module, a Bluetooth module, and a GSM (Global System for Mobile Communication) module. The communication module used herein is preferably a Wi-Fi module, which is a hardware component that enables the microcontroller to connect wirelessly with the computing unit. The Wi-Fi module works by utilizing radio waves to transmit and receive data over short distances. The core functionality relies on the IEEE 802.11 standards, which define the protocols for wireless local area networking (WLAN). Once connected, the module allows the microcontroller to send and receive data through data packets. In an embodiment, the connection may be a wired connection.
[0032] An actuator 106 is coupled with each of the click buttons of the handheld unit 101 to automate clicking operations upon receiving input from a voice command module 107, which is preferably a microphone integrated within the handheld unit 101, and from a gesture recognition module integrated with an imaging unit 105 of the handheld unit 101. For example, if the user says "left click" or moves the pointing finger, the microcontroller actuates the actuator assembled with the left click button; if the user says "right click" or moves the middle finger, it actuates the actuator assembled with the right click button. The microphone contains a small diaphragm connected to a moving coil. When sound waves from the user strike the diaphragm, the coil vibrates accordingly. This movement causes the coil to oscillate within the magnetic field, generating an electrical current. The resulting signal is sent to the microcontroller for processing the user’s voice commands related to controlling clicking actions.
[0033] The actuator 106 used herein is a solenoid actuator that automates clicking operations by converting electrical energy into controlled mechanical movement. When an electrical current is applied to the solenoid coil, it generates a magnetic field that causes a plunger or armature inside the actuator 106 to move linearly. This movement is harnessed to physically press the click button on the handheld unit 101, effectively simulating the user’s manual click. The timing and duration of the actuation are controlled by the microcontroller based on the received voice commands. Once the electrical signal ceases, a spring resets the plunger to its original position, allowing the click button to return to its default state. The plunger, being mechanically assembled with the button, enables the movement of the button. This enables precise, repeatable automated clicking without user intervention, improving efficiency and allowing for automated control features.
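For illustration only, the following minimal Python sketch maps recognized voice or gesture commands to a timed solenoid pulse on the matching click button; COMMAND_MAP, pulse_solenoid, and handle_command are hypothetical names, and the pulse duration is an assumed value.

import time

COMMAND_MAP = {
    "left click": "left", "move pointing finger": "left",
    "right click": "right", "move middle finger": "right",
}

def pulse_solenoid(side, duration_s=0.05):
    # Hypothetical: energize the solenoid coupled to the given button, then release
    # so the return spring resets the plunger and the button returns to default.
    print(f"solenoid [{side}] energized")
    time.sleep(duration_s)
    print(f"solenoid [{side}] released")

def handle_command(command, locked=False):
    side = COMMAND_MAP.get(command.lower())
    if side is None:
        return False          # unrecognized command: ignore
    if locked:
        return False          # clicking arrested by the analysis module
    pulse_solenoid(side)
    return True

handle_command("left click")  # example usage: simulates one automated left click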
[0034] The imaging unit 105 is connected to an analysis module integrated within the handheld unit 101 and is configured to continuously monitor screen activity and user behavior in real time. The imaging unit 105 includes an image capturing module comprising a set of lenses that capture multiple images of the surroundings of the handheld unit 101. The captured images are stored in the memory of the imaging unit 105 in the form of optical data. Additionally, the imaging unit 105 includes a processor embedded with artificial intelligence protocols. The processor performs essential image processing tasks such as noise reduction to improve image clarity, feature extraction to identify relevant characteristics of the user or environment (e.g., shape, color, size), and segmentation to isolate key elements from the background.
[0035] The extracted and processed data is then converted into digital pulses and bits and transmitted to the microcontroller. The microcontroller, in conjunction with the analysis module, processes the received data to detect user gestures, behavioral patterns, or other predefined conditions, including but not limited to inappropriate screen content. The analysis module plays a critical role in interpreting this data using trained AI models to assess user engagement and content relevance. Upon identifying undesirable or inappropriate content, the analysis module initiates a control signal to lock the actuator 106 coupled with the clickable buttons 102 of the handheld unit 101, stopping the clicking operations and thereby limiting user interaction. On detection of inappropriate screen content, the microcontroller also wirelessly transmits an alert to a guardian’s computing device linked to the microcontroller.
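For illustration only, the following minimal Python sketch mirrors the control flow described above; classify_frame, set_actuator_lock, and send_guardian_alert are hypothetical stand-ins for the trained analysis module, the actuator lock signal, and the wireless alert, respectively.

def classify_frame(frame):
    # Hypothetical AI classifier; would return True for inappropriate screen content.
    return False

def set_actuator_lock(locked):
    # Hypothetical lock signal to the click actuators 106.
    print("click actuators", "LOCKED" if locked else "UNLOCKED")

def send_guardian_alert(message):
    # Hypothetical wireless transmission to the guardian's linked computing device.
    print("alert sent to guardian:", message)

def moderate(frames):
    for frame in frames:
        if classify_frame(frame):
            set_actuator_lock(True)   # arrest clicking operations
            send_guardian_alert("Inappropriate screen content detected")
            return True               # user interaction limited
    return False

moderate([object()])                  # example usage with a dummy captured frame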
[0036] The gesture recognition module is utilized to detect the position and movement of the user’s fingers or hand. The gesture recognition module operates by capturing and interpreting the user’s hand or finger movements to facilitate intuitive control of the handheld unit 101. Using the imaging unit’s camera, it continuously records real-time images or video of the user’s gestures within its field of view. These images are processed by the processor running artificial intelligence protocols trained to identify specific gesture patterns such as swipes, taps, or rotations. Key steps include detecting the position and movement of the user’s fingers or hand, extracting features like shape and motion trajectory, and classifying the gestures against a predefined set of commands. Once a gesture is recognized, the module converts it into corresponding control signals, which are transmitted to the microcontroller. The microcontroller then executes the appropriate action, such as clicking, or locking functions, based on the interpreted gesture.
[0037] In an embodiment of the present invention, a motorized ball and socket joint is installed between the imaging unit 105 and the handheld unit 101 to enable the detection of both the screen activity and user behavior in real time. The motorized ball and socket joint includes a motor powered by electrical current from the microcontroller, a ball-shaped element, and a socket. The ball moves freely within the socket. The motor rotates the ball in various directions under the control of the microcontroller, which commands the motor to position the ball precisely. The microcontroller thus actuates the motor to rotate the joint, providing movement to the imaging unit 105 for detecting the screen activity as well as the user behavior in real time.
[0038] A gesture-controlled scrolling means 108 is integrated into the handheld unit 101, operated through user gestures and improved by a deep learning protocol module integrated within the scrolling means 108 that is trained to dynamically adapt scrolling speed in real time according to the velocity, frequency, and direction of user gestures captured by the imaging unit 105. The gesture-controlled scrolling means 108 enables users to navigate through content by interpreting hand or finger movements without physical contact. This scrolling means 108 relies on the imaging unit 105 to capture real-time visual data of user gestures near or on the handheld unit’s surface. The captured images are processed by deep learning protocols trained to recognize specific scrolling gestures, such as swiping up, down, or circular motions. Upon identifying a valid scrolling gesture, the protocols convert the gesture into corresponding digital commands. These commands are then sent to a motorized gear train arrangement 112 embedded within the handheld unit 101, which physically actuates the scrolling arrangement 112 (rotating gears or rollers) to move the cursor or scroll content accordingly; the scrolling speed also depends on the user gestures detected by the imaging unit 105.
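For illustration only, the following minimal Python sketch shows one way a scrolling step could be scaled from gesture velocity, frequency, and direction as described above; the scaling formula, limits, and the name scroll_command are assumptions, not the trained deep learning model itself.

def scroll_command(gesture_velocity, gesture_frequency, direction,
                   base_speed=1.0, max_speed=10.0):
    # Faster or more frequent gestures yield proportionally faster scrolling,
    # capped so the motorized gear train is never driven beyond its rated speed.
    speed = min(max_speed, base_speed * gesture_velocity * (1.0 + 0.5 * gesture_frequency))
    return speed if direction == "down" else -speed   # sign encodes scroll direction

print(scroll_command(gesture_velocity=2.0, gesture_frequency=3.0, direction="up"))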
[0039] A thermal regulation unit 109, comprising a temperature sensor 109a and a Peltier unit 109b, is integrated within the handheld unit 101 to maintain an optimal operating temperature of the cushion padding 110 layered over the handheld unit 101, ensuring user comfort during prolonged use. The temperature sensor 109a is typically composed of a metal or semiconductor material that generates a change in electrical voltage or resistance in response to temperature variations. It works by measuring the voltage across its terminals, and this voltage, which is affected by the resistance of the diode or sensing element, is converted into readable values representing the ambient temperature of the cushion padding 110 surface. The measured temperature is then converted into an electrical signal and transmitted to the microcontroller. The microcontroller processes this data and compares the detected temperature with a predefined threshold stored in a linked database. If the surrounding temperature is low, the microcontroller activates the hot side of the Peltier unit 109b; if the surrounding temperature is high, it activates the cool side.
[0040] The Peltier unit 109b consists of two semiconductor plates, known as Peltier plates, connected electrically in series and thermally in parallel, sandwiched between two ceramic layers. When an electric current is applied to the Peltier unit 109b, one side of the unit absorbs heat (creating a cooling effect), while the opposite side dissipates heat. This thermoelectric effect enables the microcontroller to either cool or warm the cushion padding 110 as needed to maintain the desired temperature range for user comfort.
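For illustration only, the following minimal Python sketch expresses the comparison logic described above, choosing which side of the Peltier unit 109b to drive from the cushion temperature reported by the temperature sensor 109a; the target temperature, deadband, and the name regulate_cushion are assumed values rather than disclosed parameters.

def regulate_cushion(measured_c, target_c=30.0, deadband_c=2.0):
    # Returns which Peltier side to energize, or None when the cushion is in range.
    if measured_c < target_c - deadband_c:
        return "hot side"      # warm the cushion padding 110
    if measured_c > target_c + deadband_c:
        return "cool side"     # cool the cushion padding 110
    return None                # within the comfort band: no drive needed

for reading in (24.0, 30.5, 36.0):      # example usage with sample readings in Celsius
    print(reading, "->", regulate_cushion(reading))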
[0041] The platform 103 is equipped with a ball transfer conveyor 111 that facilitates cursor movement when the handheld unit 101 reaches the physical boundary of the platform 103. This interaction is detected by a touch sensor positioned along the platform’s perimeter. The ball transfer conveyor 111 comprises multiple rotating ball elements embedded within the surface of the platform 103, designed to interface smoothly with the underside of the handheld unit 101, enabling seamless directional movement and control.
[0042] The ball transfer conveyor 111 consists of multiple rotating ball elements embedded into the surface of the platform 103, designed to facilitate smooth, multi-directional movement of the handheld unit 101. Each ball element is partially exposed above the platform 103 surface and rests within a housing that allows it to rotate freely in any direction. When the handheld unit 101 is moved across the platform 103 and comes into contact with the ball elements—particularly near the physical boundary detected by touch sensors—the balls support the underside of the unit, reducing friction and allowing for effortless gliding in response to the user's push or directional intent. As the handheld unit 101 moves over the rotating balls, these elements roll beneath it, translating physical movement into responsive cursor control on the connected computing interface. This conveyor 111 enables seamless and intuitive directional control while maintaining ergonomic stability, making it particularly suitable for precision navigation tasks.
[0043] The touch sensor operates by detecting changes in electrical resistance when pressure is applied to its surface. The sensor consists of two flexible, conductive layers separated by a small gap. When the handheld unit 101 presses against the sensor area along the platform’s boundary, the top conductive layer makes contact with the bottom layer, causing a change in resistance at the point of contact. This change is detected by the sensor’s circuitry and the resulting signal is sent to the microcontroller. The microcontroller processes this input to determine that the handheld unit 101 has reached the platform’s edge, activating the ball transfer conveyor 111 to enable seamless cursor movement beyond the physical boundary.
[0044] The ball transfer conveyor 111 is electronically linked to a rotation sensing unit that includes at least one rotary encoder or Hall-effect sensor, designed to measure the angular displacement and rotational speed of each ball element. This enables accurate tracking of both cursor movement and the position of the handheld unit 101.
[0045] The rotation sensing unit works by detecting and measuring the movement of individual ball elements within the ball transfer conveyor system. When the handheld device moves across the platform, the ball elements rotate in response to the direction and force applied. The rotary encoder or the Hall-effect sensor is placed in close proximity to these balls to capture their motion. The rotary encoder typically converts the rotational motion of the ball into electrical signals by detecting the position of a rotating disk attached to the ball, allowing the system to calculate both the angular displacement (how far the ball has turned) and the rotational speed (how fast it is turning) based on the frequency of the signal pulses. Alternatively, the Hall-effect sensor measures changes in magnetic fields caused by magnets embedded in or near the rotating ball; as the ball spins, the sensor detects fluctuations in the magnetic field, which are then processed to determine rotation angle and speed. By continuously monitoring these values, the microcontroller precisely tracks how the device is moving over the platform and translates that into accurate cursor control or spatial positioning on the screen.
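For illustration only, the following minimal Python sketch converts encoder pulse counts into the angular displacement and rotational speed described above; the pulses-per-revolution figure and the name encoder_to_rotation are assumptions.

def encoder_to_rotation(pulse_count, interval_s, pulses_per_rev=360):
    # Angular displacement in degrees: each pulse corresponds to 360/PPR degrees.
    angle_deg = pulse_count * 360.0 / pulses_per_rev
    # Rotational speed in revolutions per second, derived from pulse frequency.
    speed_rps = (pulse_count / pulses_per_rev) / interval_s if interval_s > 0 else 0.0
    return angle_deg, speed_rps

angle, speed = encoder_to_rotation(pulse_count=180, interval_s=0.5)   # example usage
print(f"displacement: {angle:.1f} deg, speed: {speed:.2f} rev/s")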
[0046] The platform 103 is also equipped with a thermocouple that detects heat dissipated from the handheld unit 101. The thermocouple operates as a temperature sensor by utilizing the thermoelectric effect to detect heat generated within the handheld unit 101. The thermocouple consists of two dissimilar metal wires joined at one end, forming a junction known as the measurement or hot junction. When this junction experiences a temperature change due to heat from the handheld unit 101, it generates a small voltage proportional to the temperature difference between the hot junction and the other ends of the wires, called the reference or cold junction. This voltage signal is transmitted to the microcontroller, which interprets the magnitude of the voltage to determine the precise temperature of the handheld unit’s surface or its immediate environment. Based on the detected temperature, the microcontroller activates a cooling fan 115 mounted on the platform 103 to dissipate excess heat and maintain optimal operating conditions.
[0047] The cooling fan 115 operates by converting electrical energy into mechanical energy through a compact electric motor, typically powered by a direct current (DC) source. When triggered by a control signal from the microcontroller—based on temperature readings from the thermocouple—the motor begins to rotate a set of fan blades enclosed within a protective housing. As the blades spin, they generate a continuous stream of airflow directed toward the heated area, such as the surface of the handheld unit 101 or the surrounding platform 103. This airflow improves convective heat transfer by carrying away warm air and replacing it with cooler ambient air. As a result, the temperature of the device is reduced, preventing overheating and maintaining optimal operational performance. The fan is engineered for efficient, quiet operation to ensure it does not interfere with the user experience during prolonged device use.
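For illustration only, the following minimal Python sketch shows a threshold-with-hysteresis fan control of the kind described above, so the fan does not rapidly cycle near the trigger temperature; the thresholds and the name fan_control are assumptions.

def fan_control(temp_c, fan_on, on_above_c=40.0, off_below_c=35.0):
    # Turn the fan on above the upper threshold, off again only below the lower one.
    if temp_c >= on_above_c:
        return True
    if temp_c <= off_below_c:
        return False
    return fan_on              # inside the hysteresis band: keep the current state

state = False
for t in (32.0, 41.0, 38.0, 34.0):      # example usage with sample thermocouple readings
    state = fan_control(t, state)
    print(t, "->", "fan ON" if state else "fan OFF")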
[0048] A health monitoring unit 113 is embedded within the handheld unit 101 to monitor key physiological parameters such as heart rate, blood oxygen levels, and skin conductance using dedicated biosensors. This unit comprises an ECG sensor, a pulse oximeter sensor, and a galvanic skin response (GSR) sensor, all integrated onto the surface of the handheld unit 101.
[0049] The ECG sensor measures the electrical activity of the heart by detecting the electrical signals generated during each heartbeat, enabling accurate monitoring of heart rate and rhythm. The pulse oximeter sensor works by emitting light waves through the skin and measuring the amount of oxygenated and deoxygenated hemoglobin in the blood, thereby determining blood oxygen saturation levels. Meanwhile, the GSR sensor measures the skin’s conductance or electrical resistance, which varies with sweat gland activity and provides insight into the user’s stress and emotional state. The data from these sensors is continuously transmitted to the microcontroller, which analyzes and converts it into meaningful health metrics. This real-time health information is then displayed or projected via a projection unit 114 installed with the handheld unit 101, allowing users to monitor their vital signs conveniently during device usage.
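For illustration only, the following minimal Python sketch shows how detected ECG R-peak timestamps could be converted into a heart-rate metric of the kind the microcontroller reports; the timestamps are dummy data and heart_rate_bpm is a hypothetical helper, not the disclosed firmware.

def heart_rate_bpm(r_peak_times_s):
    # Average the R-R intervals between successive heartbeats and convert to beats per minute.
    if len(r_peak_times_s) < 2:
        return None
    intervals = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    mean_rr = sum(intervals) / len(intervals)
    return 60.0 / mean_rr if mean_rr > 0 else None

print(heart_rate_bpm([0.00, 0.82, 1.65, 2.47, 3.30]))   # roughly 73 bpm from dummy peaks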
[0050] The projection unit 114 comprises a laser source, a collimator, a collimating lens, and a wedge-shaped lens. The laser source generates a laser light beam that passes through the collimator, which focuses the beam to a precise focal point. This focused laser beam is then received by the collimating lens, which further collimates the light beyond the focal point to ensure a parallel beam. The collimated laser beam is subsequently directed to a wedge-shaped lens, which has an aperture positioned between the focusing lens and the collimating lens. This aperture is sized to allow the laser beam to pass through a series of mirrors within the projection unit 114, ultimately projecting a high-quality laser image onto the designated surface. The projection of the laser beam onto the surface assists the user by providing clear, real-time visual feedback or health analytics during the handheld unit’s operation.
[0051] The present invention works best in the following manner, where the handheld unit 101 is ergonomically shaped to conform to the user’s palm, includes at least one clickable button 102, and is integrated with the platform 103 via the base equipped with the suction units 104 for surface stability. The imaging unit 105, linked to the analysis module, continuously monitors screen activity and user behavior in real time. The analysis module, trained using deep learning protocols, interprets gesture inputs and detects inappropriate content, triggering the actuator 106 to lock the click button and sending alerts to the guardian’s interface. The actuator 106, configured as the solenoid type, responds to both voice commands and gesture recognition, indicating user stress or aggression. The handheld unit 101 incorporates the gesture-controlled scrolling means 108 operated through the motorized gear train, which is actuated via recognized hand or finger gestures. The thermal regulation unit 109, composed of the temperature sensor 109a and the Peltier unit 109b, maintains optimal temperature of the cushion padding 110 for user comfort. The health monitoring unit 113, embedded in the handheld unit 101, comprises the ECG sensor, pulse oximeter sensor, and galvanic skin response sensor for tracking heart rate, blood oxygen levels, and skin conductance, with data displayed via the projection unit 114. The platform 103 integrates the ball transfer conveyor 111 with rotating ball elements and touch sensors for smooth cursor movement and uses the thermocouple to activate the cooling fan 115 to dissipate heat effectively.
[0052] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) A multimodal ergonomic input device, comprising:
i) a handheld unit 101 having an ergonomic shape conforming to a user’s palm configured as a user-operable input means for a computing unit, the handheld unit 101 comprising at least one clickable button 102;
ii) a platform 103 adapted to support the handheld unit 101, the platform 103 having suction units 104 disposed at its bottom portion for adherence with a stable surface;
iii) a thermal regulation unit 109, operatively coupled with the handheld unit to dynamically regulate the temperature of the handheld unit according to ambient temperature;
iv) an imaging unit 105 linked with an analysis module integrated with the handheld unit 101 and configured to monitor screen activity and user behavior in real time;
v) at least one actuator 106 coupled with the click button of the handheld unit 101 to automate clicking operations upon receiving input from at least one of a voice command module 107 and gesture recognition module linked with the imaging unit 105 and lock clicking operation in case the screen activity or user behavior is identified as inappropriate; and
vi) a gesture controlled scrolling means 108 arranged on the handheld unit 101 and actuated based on user gestures and modulated by a deep learning protocol.
2) The device as claimed in claim 1, wherein the analysis module is trained with data corresponding to inappropriate activities and behavior.
3) The device as claimed in claim 1, wherein the thermal regulation unit 109 comprises a temperature sensor 109a and a Peltier unit 109b, arranged within the handheld unit 101 to adaptively maintain the temperature of the cushion padding 110 fabricated over the handheld unit 101 according to the ambient temperature detected by the temperature sensor 109a.
4) The device as claimed in claim 1, wherein the platform 103 is integrated with a ball transfer conveyor 111 that enables cursor movement upon reaching a physical boundary of the platform 103 by the handheld unit 101, as detected via at least one touch sensor installed along the physical boundary of the platform 103.
5) The device as claimed in claim 4, wherein the ball transfer conveyor 111 comprises rotating ball elements embedded in the platform 103 surface and configured to interface with the handheld unit 101.
6) The device as claimed in claim 1, wherein the ball transfer conveyor 111 is electrically connected to a rotation sensing unit comprising at least one rotary encoder or Hall-effect sensor configured to detect angular displacement and rotation speed of individual ball elements, thereby enabling precise tracking of cursor movement or device position.
7) The device as claimed in claim 1, wherein the scrolling means 108 comprises a motorized gear train arrangement 112 located between click buttons and operable through hand or finger gestures recognized by the imaging unit 105.
8) The device as claimed in claim 1, wherein a health monitoring unit 113 is embedded within the handheld unit 101 and configured to monitor at least heart rate, blood oxygen levels, and skin conductance via corresponding biosensors to provide real-time health analytics on a projection unit 114 installed with the handheld unit 101.
9) The device as claimed in claim 8, wherein the gesture controlled scrolling means 108 comprises a deep learning based control module trained to dynamically adapt scrolling speed in real time according to the velocity, frequency, and direction of user gestures captured by the imaging unit 105.
10) The device as claimed in claim 1, wherein the platform 103 includes a thermocouple for detecting dissipated heat from the handheld unit 101 and initiates activation of a cooling fan 115 mounted with the platform 103 to dissipate excess heat based on feedback from the thermocouple.