
Peripheral Pointing Tool Handling Device

Abstract: A peripheral pointing tool handling device comprises a housing 101 provided to accommodate a pointing tool; a camera 102 and an ultrasonic sensor mounted on it to analyze user hand gestures in real time and measure hand-to-device distance; a motorized dual-axis slider 103 on the top surface, integrated with a suction pad 104, which moves the pointing tool horizontally and vertically for precise control; an L-shaped extendable rod 105, attached via a motorized slider 106, holding a panel 107 at its tip through a motorized ball and socket joint 108; a first motorized roller 109 on the panel 107 that manipulates the scroll wheel and simulates clicks; a wire management unit 110 with an L-shaped telescopic bar 110a and a second motorized roller 110b that coils the pointing tool wire, aided by a robotic arm 111 aligning the wire in the grooves of the roller 110b; and a microphone 112 with voice recognition for receiving instructions.


Patent Information

Filing Date: 20 June 2025
Publication Number: 28/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

Marwadi University
Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. G.Narendra
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
2. O.Jayanth
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
3. B.Raga Sudha
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
4. Dr. Sunil Lavadiya
Department of Information and Communication Technology, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description:

FIELD OF THE INVENTION

[0001] The present invention relates to a peripheral pointing tool handling device that is capable of providing precise control, autonomous management, and maintenance of a pointing tool for improving efficiency, responsiveness, and ease of use during computer operations.

BACKGROUND OF THE INVENTION

[0002] Peripheral pointing tools, such as a mouse, are essential for precise navigation and control in computing environments. However, users often face challenges related to physical handling, especially during extended use or in constrained or sterile environments where manual interaction is limited or undesirable. Cable entanglement, limited mobility, and workspace clutter further hinder efficiency and ergonomics. Individuals with physical impairments find traditional pointing tools difficult or impossible to operate without assistance. Moreover, maintaining hygiene and preventing dust accumulation on input devices is critical in sensitive settings like laboratories or hospitals. These challenges highlight the need for an automated handling means that enables touch-free control, improves accessibility, and ensures consistent, hygienic, and efficient operation of peripheral pointing tools.

[0003] Traditional devices such as trackballs, touchpads, voice-controlled interfaces, and gesture-recognition means offer alternatives to traditional mice, aiming to improve user comfort and accessibility. However, each comes with limitations. Trackballs and touchpads often lack the precision of standard mice, while voice-controlled means are inaccurate in noisy environments or unsuitable for complex navigation tasks. Gesture-based controls typically require a learning curve and suffer from recognition errors or limited functionality. Robotic arms used in industrial settings are generally not optimized for compact desktop use and lack integration with standard pointing tools. Moreover, few means address wire management, USB handling, or surface cleanliness, making them impractical for long-term, hands-free, or hygienic use in diverse environments.

[0004] US9703398B2 discloses a pointing device using proximity sensing. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement, relative to the pointing device, of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device is generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device.

[0005] US8526767B2 discloses a state machine gesture recognition algorithm for interpreting streams of coordinates received from a touch sensor. The gesture recognition code can be written in a high-level language such as C and then compiled and embedded in a microcontroller chip or CPU chip as desired. The gesture recognition code can be loaded into the same chip that interprets the touch signals from the touch sensor and generates the time series data, e.g. a microcontroller, or other programmable logic device such as a field programmable gate array.

[0006] Conventionally, many devices are available in the market for assisting in peripheral pointing tool handling, but these are primarily operated manually or address only specific functions. Further, no such devices have been developed that are capable of automatically managing pointing tools without manual intervention. Moreover, these devices lack features such as integrated wire management, automated USB insertion, hygiene maintenance, and automated gesture- or voice-based control, making them unsuitable for hands-free and accessible use in specialized environments.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device that is capable of handling peripheral pointing tools autonomously. This includes integrated gesture and voice control, wire management and USB insertion, ensuring hands-free, accessible, and efficient operation across diverse and constrained environments.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a device that is capable of providing a means that enables accurate, smooth, and responsive movement of a peripheral pointing tool, allowing users to control computing devices more effectively and precisely.

[0010] Another object of the present invention is to develop a device that is capable of allowing autonomous handling and control of a pointing tool for reducing the need for manual adjustments and improving overall ease of use during interaction with digital means.

[0011] Another object of the present invention is to develop a device that is capable of organizing and managing the wire connected to a pointing tool, helping to avoid tangling, clutter, and operational inconvenience during usage.

[0012] Yet another object of the present invention is to develop a device that is capable of maintaining performance by autonomously detecting dust accumulation and removing it from the surface, ensuring long-term reliability and reducing the need for manual cleaning.

[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0014] The present invention relates to a peripheral pointing tool handling device that physically operates and manages a pointing tool through autonomous movement, user interaction sensing, and environmental adaptation, enabling hands-free control, precise positioning, and cable management without manually handling the tool, thereby improving convenience, accessibility, and hygiene.

[0015] According to an embodiment of the present invention, a peripheral pointing tool handling device comprises a housing installed for accommodating a pointing tool; a camera integrated with an ultrasonic sensor mounted on the housing for analyzing hand gestures of the user in real time and measuring the distance between the user’s hand and the housing; a motorized dual-axis slider provided over the top surface and integrated with a suction pad, the slider configured to move the pointing tool placed thereon along horizontal and vertical axes for precise directional control; an L-shaped extendable rod attached to the housing via a motorized slider, the rod having a panel fixed at the tip via a motorized ball and socket joint; a first motorized roller provided with a bottom portion of the panel to manipulate the scroll wheel of the pointing tool and mimic pointing tool button clicks; a wire management unit comprising an L-shaped telescopic bar integrated with a second motorized roller attached to a lateral side of the housing, configured to roll the wire of the pointing tool onto the second roller; a robotic arm attached to the housing configured to align the wire over the grooves carved over the second roller for coiling of the wire over the second roller, the pointing tool mentioned herein being a wired or wireless mouse; and a microphone integrated with the housing for receiving voice commands of the user for operating the pointing tool, wherein a voice recognition module integrated with the microcontroller processes the received voice commands to initiate, control, or halt pointing tool functions.

[0016] According to another embodiment of the present invention, the device further comprises a robotic arm configured to insert the pointing tool’s USB (Universal Serial Bus) connector automatically into a computer or laptop, as per the user’s requirement; a dust sensor integrated into the housing configured to detect dust accumulation on the top surface, whereupon the microcontroller actuates a motorized suction unit provided with the housing to collect dust and channel it into a chamber integrated with the suction unit; the camera utilizes gesture recognition protocols to detect and interpret a plurality of hand gestures for precise pointing tool control; the microcontroller adjusts the suction force acting over the pointing tool dynamically based on real-time user input to ensure smooth and responsive pointing tool movement; the motorized ball and socket joint of the extendable rod allows flexible positioning of the panel for user interaction including scrolling and clicking; the dual-axis slider provides movement precision sufficient to replicate traditional pointing tool movement across the computing device; and a plurality of suction cups is installed underneath the housing to securely adhere the housing to a flat surface.

[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a peripheral pointing tool handling device.

DETAILED DESCRIPTION OF THE INVENTION

[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description that the invention is not limited to the illustrated embodiments but also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0022] The present invention relates to a peripheral pointing tool handling device that is capable of autonomously controlling and manipulating a user's input device. The present device also precisely manages movement, facilitates interaction through gestures and voice, and handles connectivity and maintenance, offering seamless and adaptable control for diverse computing tasks.

[0023] Referring to Figure 1, an isometric view of a peripheral pointing tool handling device is illustrated, comprising a housing 101 installed for accommodating a pointing tool, a camera 102 mounted on the housing 101, a motorized dual-axis slider 103 provided over the top surface and integrated with a suction pad 104, an L-shaped extendable rod 105 attached to the housing 101 via a motorized slider 106, a panel 107 fixed at the tip of the rod 105 via a motorized ball and socket joint 108, a first motorized roller 109 provided with a bottom portion of the panel 107, a wire management unit 110 comprising an L-shaped telescopic bar 110a integrated with a second motorized roller 110b attached to a lateral side of the housing 101, a robotic arm 111 attached to the housing 101, a microphone 112 integrated with the housing 101, a motorized suction unit 113 provided with the housing 101, a chamber 114 integrated with the suction unit 113, and a plurality of suction cups 115 installed underneath the housing 101.

[0024] The device disclosed herein includes a housing 101 positioned on a flat surface to accommodate a pointing tool. The pointing tool mentioned herein is a wired or wireless mouse, positioned by a user. The housing 101 used herein encloses all necessary components of the device.

[0025] A plurality of suction cups 115 is installed on the underside of the housing 101 to securely adhere the device to a flat surface. The suction cups 115 used herein are made of silicone rubber, which readily expels the air inside each cup, creating a vacuum between the cup and the surface. This forms an air-tight seal that resists slipping of the housing 101 and keeps it adhered to the surface. The suction cups 115 are engineered for easy release and reattachment without losing their suction power, allowing for convenient repositioning of the housing 101 when necessary.

[0026] A push button is provided on the housing 101 for activating and deactivating the device. The push button is accessed by the user for activating the device. When the user presses the push button, the electrical circuit is completed, which in turn switches the device on. The push button is integrated with an actuator and a spring, which are automatically activated when pressed. They work together to move the internal contact, completing the circuit and allowing electrical current to flow, thereby activating the device.

[0027] When the push button is pressed, the button sends a signal (usually a change in voltage or current) to an inbuilt microcontroller associated with the device to either power up or shut down the microcontroller. Conversely, releasing the button allows the spring to return to its original position, breaking the circuit and sending the signal to deactivate the device. The microcontroller is pre-fed to detect this signal and respond accordingly. The microcontroller used herein is pre-fed using artificial intelligence and machine learning protocols to coordinate the working of the device.

[0028] A camera 102 is mounted on the housing 101 to capture multiple images of the surrounding environment for real-time analysis of the user’s hand gestures. The camera 102 includes an image capturing module with a set of lenses that acquire images around the pointing tool. These captured images are stored in the camera’s memory as optical data. The camera 102 is also paired with a processor fed with artificial intelligence protocols and gesture recognition protocols. These protocols analyze the captured images by performing essential image processing tasks such as noise reduction to improve image clarity, feature extraction to identify relevant characteristics of the user’s hand (e.g., shape, color, size), and segmentation to isolate the hand from the background. The gesture recognition protocols process this data to detect and interpret a plurality of hand gestures with high precision. The processed data is then converted into digital signals and transmitted to the microcontroller, which uses the interpreted gestures to enable precise control of the pointing tool, allowing intuitive and responsive user interaction.
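By way of illustration only, the final step described above, translating an interpreted gesture into a pointing tool command, might be sketched as a lookup table. The gesture labels and command tuples below are hypothetical assumptions, not taken from the specification:

```python
# Hypothetical gesture-to-command table; labels and commands are
# illustrative assumptions, not part of the specification.
GESTURE_TO_COMMAND = {
    "swipe_left":  ("move", -1, 0),
    "swipe_right": ("move", 1, 0),
    "swipe_up":    ("move", 0, 1),
    "swipe_down":  ("move", 0, -1),
    "pinch":       ("click", 0, 0),
}

def interpret_gesture(label):
    """Translate a recognized gesture label into an (action, dx, dy) command."""
    # Unrecognized gestures produce no movement, keeping control predictable.
    return GESTURE_TO_COMMAND.get(label, ("idle", 0, 0))
```

A table of this shape keeps the recognition stage (camera and protocols) decoupled from the actuation stage (microcontroller and slider).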

[0029] An ultrasonic sensor is embedded within the housing 101 and operates in conjunction with the camera 102 to accurately measure the distance between the user’s hand and the housing 101 for enabling precise gesture recognition and responsive control of the pointing tool. The ultrasonic sensor works by emitting ultrasonic waves and measuring the time taken by these waves to bounce back after hitting the surface of the user’s hand. The ultrasonic sensor includes two main parts: a transmitter and a receiver, for emitting and detecting the waves to measure distance or detect motion. The transmitter sends a short ultrasonic pulse toward the surface of the user’s hand, which propagates through the air at the speed of sound and reflects back as an echo once it hits the hand. The receiver then detects the reflected echo from the surface, and a calculation is performed by the sensor based on the time interval between sending the signal and receiving the echo to determine the distance of the hand from the housing 101. The determined data is sent to the microcontroller in signal form, based on which the microcontroller further processes the signal to interpret gesture proximity and initiate corresponding pointing tool actions.
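The time-of-flight calculation the sensor performs reduces to distance = (speed of sound × echo time) / 2, since the pulse travels to the hand and back. A minimal sketch, assuming the nominal speed of sound in air (about 343 m/s at 20 °C):

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # nominal value in air at roughly 20 deg C

def echo_time_to_distance_cm(echo_time_s):
    """Convert a round-trip ultrasonic echo time into a one-way distance in cm."""
    # The pulse covers the hand-to-sensor gap twice, so halve the path.
    return (SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0) * 100.0
```

For example, a 1 ms round trip corresponds to a hand roughly 17 cm from the housing.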

[0030] A motorized dual-axis slider 103 is mounted on the top surface of the housing 101 and integrated with a suction pad 104. The slider 103 is configured to move the pointing tool placed on it along both horizontal and vertical axes, enabling precise directional control. The suction pad 104 is strategically used for securely holding the pointing tool in place during operation.

[0031] The suction pad 104 operates by creating a partial vacuum between its surface and the base of the pointing tool, generating sufficient negative pressure to maintain a firm grip without causing damage or restricting movement. This allows the pointing tool to be maneuvered accurately along the horizontal and vertical axes without slipping. A suction pump of the suction pad 104 is controlled by the microcontroller, which dynamically adjusts the suction force in real time based on user input, such as gesture commands or voice instructions. This ensures that the grip on the pointing tool remains stable during fast directional changes, yet responsive enough to allow subtle adjustments and smooth tracking.
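The dynamic adjustment described above could be modeled as a simple clamped proportional rule; the gain and force bounds below are illustrative, normalized assumptions (1.0 = full pump output), not values from the specification:

```python
def adjust_suction_force(base_force, pointer_speed, gain=0.2,
                         min_force=0.2, max_force=1.0):
    """Scale grip force with pointer speed, clamped to the pump's range.

    All parameters are hypothetical normalized values for illustration.
    """
    # Faster directional changes demand a firmer grip; clamping ensures
    # the pad never releases entirely nor exceeds the pump's capability.
    return max(min_force, min(max_force, base_force + gain * pointer_speed))
```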

[0032] The motorized dual-axis slider 103 is configured to move the pointing tool placed on it along both horizontal and vertical axes to achieve precise directional control. The dual-axis slider 103 is designed to provide movement in two axes simultaneously, allowing for side-to-side (X-axis) and up-and-down (Y-axis) motion. Structurally, the slider comprises a pair of sliding rails assembled perpendicular to each other, forming an orthogonal configuration. These rails are actuated by electric motors in combination with a precision gear arrangement that translates motor rotation into movement of a central block, to which the suction pad 104 is attached and on which the pointing tool is mounted. When activated, the gear arrangement moves the block in both directions independently or concurrently, enabling smooth and accurate navigation across a digital interface. The microcontroller determines movement directions based on the user's gestures, voice commands, or hand position, and initiates actuation accordingly. The dual-axis slider 103 provides movement precision sufficient to replicate traditional pointing tool movement across the computing device, thereby ensuring a seamless and intuitive user experience.
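As a rough sketch of how the microcontroller might convert a desired pointer displacement into commands for the two motor-driven rails (the steps-per-millimetre figure is an assumption for illustration; real hardware would be calibrated):

```python
def displacement_to_steps(dx_mm, dy_mm, steps_per_mm=80):
    """Convert a desired X/Y slider displacement into per-axis motor steps.

    steps_per_mm is a hypothetical figure for the rail's gearing and
    motor resolution, not a value from the specification.
    """
    # Each rail is driven independently, so the two axes convert separately.
    return round(dx_mm * steps_per_mm), round(dy_mm * steps_per_mm)
```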

[0033] A microphone 112 is integrated with the housing 101 for receiving voice commands from the user to operate the pointing tool. A voice recognition module integrated with the microcontroller processes the received voice commands to initiate, control, or halt pointing tool functions. The microphone 112 includes a small diaphragm connected to a moving coil. When the user’s voice generates sound waves that strike the diaphragm, the coil vibrates in response. This vibration causes the coil to move back and forth within a magnetic field, inducing an electrical current. The generated electrical signals are transmitted to the microcontroller, where they are analyzed and interpreted as specific voice commands. These commands include instructions for directional movement, clicking, scrolling, adjusting suction force, or activating or deactivating specific functions of the pointing tool.

[0034] The voice recognition module functions as a key component for enabling hands-free control of the pointing tool. The module operates by capturing audio input from the user through the microphone 112 and converting the analog sound waves into digital signals. These signals are then processed using signal processing protocols to filter out background noise and isolate the user’s voice. The module employs techniques such as feature extraction—identifying unique vocal characteristics like pitch, tone, and frequency patterns—to match the input against a pre-defined set of voice commands stored in a linked database. Once a match is detected, the corresponding digital instruction is generated and sent to the microcontroller. The microcontroller then executes the appropriate action, such as moving the pointing tool, initiating a click, adjusting suction force, or stopping an ongoing function.
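The matching step, comparing an extracted transcript against the pre-defined command set, can be approximated with fuzzy string matching. A minimal sketch using Python's standard difflib; the command vocabulary and cutoff are illustrative assumptions:

```python
import difflib

# Hypothetical command vocabulary; the specification does not fix the set.
COMMANDS = ["move left", "move right", "scroll up", "scroll down", "click", "stop"]

def match_voice_command(transcript, cutoff=0.6):
    """Return the closest stored command for a rough transcript, or None."""
    # get_close_matches ranks candidates by similarity ratio and discards
    # anything below the cutoff, rejecting unrelated speech.
    hits = difflib.get_close_matches(transcript.lower(), COMMANDS, n=1, cutoff=cutoff)
    return hits[0] if hits else None
```

Rejecting low-similarity input mirrors the module's role of halting rather than guessing when no stored command matches.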

[0035] An L-shaped extendable rod 105 is attached to the housing 101 via a motorized slider 106 and extends or retracts as required. At the tip of the rod 105, a panel 107 is mounted and connected through a motorized ball-and-socket joint 108, enabling flexible multi-directional positioning of the panel 107 to facilitate user interaction. Integrated into the bottom portion of this panel 107 is a first motorized roller 109, which is configured to mechanically interact with the scroll wheel and buttons of the pointing tool. This setup allows the device to mimic scrolling actions and perform button clicks with precision.

[0036] The motorized slider 106 typically consists of a motorized carriage attached to a rail for enabling the controlled linear movement of the L-shaped extendable rod 105. Upon actuation of the motorized slider 106 by the microcontroller, the motor drives the carriage along the rail, facilitating smooth and precise sliding motion of the rod 105. This controlled extension and retraction of the rod 105 allows the attached panel 107 to be accurately positioned in space for interacting with the pointing tool, ensuring coordinated operation with the motorized ball-and-socket joint 108 and the first motorized roller 109 mounted on the panel 107.

[0037] The extension and retraction of the L-shaped extendable rod 105 is powered pneumatically under the control of the microcontroller by employing a pneumatic unit associated with the rod 105. This pneumatic unit includes components such as an air compressor, air cylinders, air valves, and a piston, which work in coordination to facilitate the linear movement of the rod 105. The microcontroller operates the pneumatic unit by actuating the valve to allow the passage of compressed air from the compressor into the cylinder. The compressed air generates pressure against the piston, causing it to extend outward. Since the piston is mechanically connected to the L-shaped extendable rod 105, the applied pressure results in the forward extension of the rod 105. To retract the rod 105, the microcontroller closes the valve, releasing the pressure and allowing the piston to return to its original position, thereby retracting the rod 105. Thus, the microcontroller precisely regulates the extension and retraction of the L-shaped rod 105 in order to accurately position the attached panel 107 for optimal interaction with the pointing tool.

[0038] The motorized ball-and-socket joint 108 includes a motor powered and controlled by the microcontroller, a ball-shaped element, and a corresponding socket that houses the ball. The ball is designed to move freely within the socket, enabling multi-directional articulation. The motor, upon receiving commands from the microcontroller, generates the necessary electrical current to rotate the ball within the socket along various axes. This allows the attached panel 107 to be oriented precisely as required for effective interaction with the pointing tool. The microcontroller dynamically controls the motor to adjust the ball’s position in real time, thereby enabling the panel 107 to tilt, rotate, or align at different angles. This movement ensures that the panel 107 adapts to different operational modes such as scrolling, clicking, or reorienting based on the user input or positional requirements.

[0039] The first motorized roller 109 functions through a motor connected to a cylindrical roller via a rotating shaft. When actuated by the microcontroller, the motor generates rotational motion that is transmitted through the shaft to the roller, causing it to rotate. This controlled rotation enables the roller 109 to perform specific mechanical interactions, such as scrolling or simulating button clicks on the pointing tool, depending on its placement and function.

[0040] A wire management unit 110, comprising an L-shaped telescopic bar 110a integrated with a second motorized roller 110b, is mounted on a lateral side of the housing 101 to wind the wire of the pointing tool onto the roller 110b for efficient and organized cable handling. The extension and retraction of the bar 110a is regulated by the microcontroller in the same manner as the extendable rod 105, by employing the pneumatic unit. This pneumatic unit, comprising an air compressor, air cylinders, valves, and pistons, enables smooth and controlled linear movement of the telescopic bar 110a.

[0041] In coordination with the telescopic bar 110a, a robotic arm 111 attached to the housing 101 is actuated to grip, guide, and align the pointing tool's wire over a plurality of grooves carved on the surface of the second motorized roller 110b. The microcontroller synchronizes the pneumatic actuation of the telescopic bar 110a, the rotation of the roller 110b, and the movement of the robotic arm 111 to ensure precise, tangle-free wire coiling and efficient cable management during both operation and storage. The robotic arm 111 comprises a robotic link and a clamp attached at its end. The robotic link consists of multiple interconnected segments joined together by articulated joints, also referred to as axes. Each joint houses a stepper motor that enables controlled rotation, allowing the arm 111 to execute complex and precise movements. Upon receiving a command from the microcontroller, these stepper motors actuate to drive the movement of the robotic link and position the clamp. The clamp, in turn, grips and guides the wire accurately over the roller's grooves, ensuring uniform and organized coiling of the pointing tool's wire.
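Each articulated joint's stepper motor turns a commanded joint angle into a whole number of steps. A sketch under assumed motor parameters, a common 1.8°-per-step motor with 16× microstepping; the specification does not state these figures:

```python
def joint_angle_to_steps(angle_deg, full_steps_per_rev=200, microsteps=16):
    """Steps a joint's stepper motor must take to rotate by angle_deg.

    The 200 full steps/rev (1.8 deg motor) and 16x microstepping are
    assumptions for illustration only.
    """
    # Fraction of a revolution times total microsteps per revolution.
    return round(angle_deg / 360.0 * full_steps_per_rev * microsteps)
```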

[0042] The second motorized roller 110b operates in the same manner as the first motorized roller 109 described above, utilizing a motor-driven shaft to generate rotational motion for coiling the pointing tool's cable in an organized and controlled manner. The robotic arm 111 is configured to automatically insert the USB (Universal Serial Bus) connector of the pointing tool into a computer or laptop in accordance with the user's command or requirement.

[0043] A dust sensor is integrated into the housing 101 and is configured to detect dust accumulation on the top surface of the device. The dust sensor operates based on an optical sensing method and consists of a photosensor and an infrared light-emitting diode (IR LED). The IR LED emits infrared rays onto the top surface of the device, and the photosensor receives the rays that are reflected back from that surface. The intensity and pattern of the reflected IR rays are analyzed by the microcontroller to determine the presence and level of dust accumulation. Based on the detected dust levels, the microcontroller actuates a motorized suction unit 113 integrated into the housing 101 to collect the dust particles and channel them into a chamber 114 installed with the suction unit 113 for storage and later disposal, thus ensuring the cleanliness and optimal performance of the device.
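The decision the microcontroller makes from the photosensor reading can be reduced to a threshold test. A sketch that assumes dust scatters additional infrared light toward the photosensor, raising the reading above a clean-surface baseline; both the sensing polarity and the margin value are illustrative assumptions:

```python
def dust_detected(reading, clean_baseline, margin=1.2):
    """Return True when the reflected-IR reading indicates dust build-up.

    Assumption: dust particles scatter extra infrared light toward the
    photosensor, so readings well above the clean baseline mean dust.
    The 20% margin is an illustrative guard against sensor noise.
    """
    return reading > clean_baseline * margin
```

On a True result, the microcontroller would actuate the suction unit 113 until readings fall back near the baseline.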

[0044] The suction unit 113 typically consists of a suction pump, conduit, and suction catheter for withdrawing dust and debris. The pump generates negative pressure, creating a vacuum within the unit. The conduit connects the pump to the chamber 114, where the withdrawn dust is collected and stored. The suction catheter is used to reach the desired area on the top surface of the device for withdrawing accumulated dust particles. Upon activation of the suction unit 113 by the microcontroller, the pump creates a pressure differential, enabling the dust and debris to flow through the conduit into the dust collection chamber 114.

[0045] Lastly, a battery (not shown in the figure) is associated with the device to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode, and uses oxidation/reduction chemical reactions to do work on charge and produce a voltage between the anode and cathode, thereby generating the electrical energy used to operate the device.

[0046] The present invention works best in the following manner, where the housing 101 accommodating the pointing tool is integrated with the camera 102 and ultrasonic sensor mounted on the housing 101 for real-time analysis of hand gestures and measurement of the distance between the user’s hand and the housing 101. The multiple suction cups 115 installed underneath the housing 101 securely adhere the device to flat surfaces, while the microcontroller dynamically adjusts the suction force on the pointing tool for smooth and responsive operation. The motorized dual-axis slider 103 positioned on the top surface includes the suction pad 104 configured to move the pointing tool along the horizontal and vertical axes, providing precise directional control that replicates traditional pointing tool movement. Once positioned at the desired location, the L-shaped extendable rod 105 attached via the motorized slider 106 features the panel 107 fixed at its tip through the motorized ball and socket joint 108, allowing flexible positioning for user interaction including scrolling and clicking. The first motorized roller 109 located at the bottom of the panel 107 manipulates the scroll wheel and mimics button clicks of the pointing tool. The wire management unit 110, comprising the L-shaped telescopic bar 110a integrated with the second motorized roller 110b, is mounted on the lateral side of the housing 101 to coil the pointing tool’s wire onto the roller 110b. The robotic arm 111 attached to the housing 101 coordinates with the telescopic bar 110a to align the wire over the grooves on the second roller 110b, enabling organized coiling of the wire. The robotic arm 111 also autonomously inserts the USB connector of the pointing tool into the computer or laptop upon user command. The integrated microphone 112 receives voice commands processed by the voice recognition module in the microcontroller to control device functions. The dust sensor, based on optical sensing, detects dust accumulation on the top surface, triggering the microcontroller to actuate the motorized suction unit 113 that collects dust into the integrated chamber 114.
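The microcontroller behaviour described above (mapping recognized gestures to slider movement, scaling suction force with hand proximity, and triggering the dust-suction unit 113) can be sketched as a simple control loop. All gesture names, thresholds, and return-value fields below are illustrative assumptions for exposition only; the specification does not define any concrete firmware interface.

```python
# Illustrative sketch of one pass of the microcontroller control loop.
# Gesture names, thresholds, and field names are assumptions, not part of
# the specification.

DUST_THRESHOLD = 0.6   # assumed normalized reading of the optical dust sensor
NEAR_HAND_MM = 150     # assumed hand-to-housing distance indicating active use

def suction_force(distance_mm):
    """Dynamically scale suction on the pointing tool: grip firmly when the
    user's hand is near (movement expected), relax the grip when idle."""
    if distance_mm <= NEAR_HAND_MM:
        return 1.0     # full grip for smooth, responsive movement
    return 0.4         # relaxed grip while the user is away

def control_step(gesture, distance_mm, dust_level):
    """Map a recognized gesture to dual-axis slider motion, set the suction
    force, and decide whether to actuate the dust-suction unit 113."""
    moves = {
        "swipe_left":  (-1, 0),
        "swipe_right": (+1, 0),
        "swipe_up":    (0, +1),
        "swipe_down":  (0, -1),
    }
    dx, dy = moves.get(gesture, (0, 0))   # unrecognized gesture: no movement
    return {
        "slider_dx": dx,
        "slider_dy": dy,
        "suction": suction_force(distance_mm),
        "run_dust_suction": dust_level > DUST_THRESHOLD,
    }
```

In this sketch a nearby hand gesture moves the slider and tightens the grip in the same pass, while a high dust reading independently triggers the suction unit, mirroring the event-driven behaviour described above.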

[0047] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:
1) A peripheral pointing tool handling device, comprising:
i) a housing 101 installed for accommodating a pointing tool;
ii) a camera 102 synced with an ultrasonic sensor mounted on the housing 101 for analyzing hand gestures of the user in real-time, and measuring distance between the user’s hand and the housing 101;
iii) a motorized dual-axis slider 103 provided over the top surface and integrated with a suction pad 104, where the slider 103 is configured to move the pointing tool placed thereon along horizontal and vertical axes for precise directional control;
iv) an L-shaped extendable rod 105 attached to the housing 101 via a motorized slider 106, and having a panel 107 fixed at the tip via a motorized ball and socket joint 108 to facilitate movement of the scroll wheel of the pointing tool;
v) a first motorized roller 109 provided at a bottom portion of the panel 107 to manipulate the scroll wheel of the pointing tool and mimic pointing tool button clicks;
vi) a wire management unit 110 comprising an L-shaped telescopic bar 110a integrated with a second motorized roller 110b attached to a lateral side of the housing 101, configured to roll the wire of the pointing tool onto the second roller 110b; and
vii) a robotic arm 111 attached to the housing 101 configured to align the wire over the grooves carved over the second roller 110b for coiling of the wire over the second roller 110b.

2) The device as claimed in claim 1, wherein the pointing tool mentioned herein is a wired or wireless mouse.

3) The device as claimed in claim 1, wherein a microphone 112 is integrated with the housing 101 for receiving voice commands from the user for operating the pointing tool, and wherein a voice recognition module integrated with the microcontroller processes the received voice commands to initiate, control, or halt pointing tool functions.

4) The device as claimed in claim 1, wherein the robotic arm 111 is configured to insert the pointing tool’s USB (Universal Serial Bus) connector automatically into a computer or laptop, as per the user’s requirement.

5) The device as claimed in claim 1, wherein a dust sensor is integrated into the housing 101 to detect dust accumulation on the top surface, and upon detection the microcontroller actuates a motorized suction unit 113 provided with the housing 101 to collect dust and channel it into a chamber 114 paired with the suction unit 113.

6) The device as claimed in claim 1, wherein the camera 102 utilizes gesture recognition protocols to detect and interpret a plurality of hand gestures for precise pointing tool control.

7) The device as claimed in claim 1, wherein the microcontroller adjusts the suction force acting over the pointing tool dynamically based on real-time user input to ensure smooth and responsive pointing tool movement.

8) The device as claimed in claim 1, wherein the motorized ball and socket joint 108 of the extendable rod 105 allows flexible positioning of the panel 107 for user interaction including scrolling and clicking.

9) The device as claimed in claim 1, wherein the dual-axis slider 103 provides movement precision sufficient to replicate traditional pointing tool movement across the computing device.

10) The device as claimed in claim 1, wherein a plurality of suction cups 115 is installed underneath the housing 101 to securely adhere the housing 101 to a flat surface.

Documents

Application Documents

# Name Date
1 202521059428-STATEMENT OF UNDERTAKING (FORM 3) [20-06-2025(online)].pdf 2025-06-20
2 202521059428-REQUEST FOR EXAMINATION (FORM-18) [20-06-2025(online)].pdf 2025-06-20
3 202521059428-REQUEST FOR EARLY PUBLICATION(FORM-9) [20-06-2025(online)].pdf 2025-06-20
4 202521059428-PROOF OF RIGHT [20-06-2025(online)].pdf 2025-06-20
5 202521059428-POWER OF AUTHORITY [20-06-2025(online)].pdf 2025-06-20
6 202521059428-FORM-9 [20-06-2025(online)].pdf 2025-06-20
7 202521059428-FORM FOR SMALL ENTITY(FORM-28) [20-06-2025(online)].pdf 2025-06-20
8 202521059428-FORM 18 [20-06-2025(online)].pdf 2025-06-20
9 202521059428-FORM 1 [20-06-2025(online)].pdf 2025-06-20
10 202521059428-FIGURE OF ABSTRACT [20-06-2025(online)].pdf 2025-06-20
11 202521059428-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-06-2025(online)].pdf 2025-06-20
12 202521059428-EVIDENCE FOR REGISTRATION UNDER SSI [20-06-2025(online)].pdf 2025-06-20
13 202521059428-EDUCATIONAL INSTITUTION(S) [20-06-2025(online)].pdf 2025-06-20
14 202521059428-DRAWINGS [20-06-2025(online)].pdf 2025-06-20
15 202521059428-DECLARATION OF INVENTORSHIP (FORM 5) [20-06-2025(online)].pdf 2025-06-20
16 202521059428-COMPLETE SPECIFICATION [20-06-2025(online)].pdf 2025-06-20
17 202521059428-FORM-26 [25-06-2025(online)].pdf 2025-06-25
18 Abstract.jpg 2025-07-04