Abstract: An adjustable prosthetic arm comprises an upper part 104 comprising an elongated cylindrical telescopic member 101 having a cushioned concave flap 102 configured with an expandable pulley 103 for affixing the member 101 onto a stub of a user; a lower part 105 comprising an elongated cylindrical telescopic structure 107 connected with the upper part 104 by means of a first pivot joint 106; a hand unit 108 attached to the lower part 105 by means of a second pivot joint 109 for providing dexterity to the user, the hand unit 108 comprising a palm portion and five elongated fingers configured with pin joints to simulate the movements of a human hand; a laser sensor for detecting the diameter of the stub and the dimensions of the user; and an imaging unit 110 for determining the type and hardness of an object to be gripped.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to an adjustable prosthetic arm that provides prosthetic arm support and adjusts in accordance with a user's limb stub, ensuring secure attachment; it further assesses the type and hardness of objects the user wishes to grip, applying only the pressure necessary to prevent damage to the object.
BACKGROUND OF THE INVENTION
[0002] The need for prosthetic arms has grown significantly due to increasing incidences of limb loss, whether from accidents, congenital conditions, or medical reasons like disease or amputation. Prosthetic arms provide essential support for individuals who have lost one or both upper limbs, offering them a chance to regain mobility, functionality, and independence. While traditional prosthetics focused mainly on cosmetic restoration, advancements in technology have enabled more functional and customizable prosthetic limbs that mimic the movement and dexterity of a natural arm. These modern prosthetics are crucial for restoring basic tasks, such as eating, dressing, and working, which are often challenging for individuals without a limb. Beyond everyday tasks, prosthetic arms also play a significant role in improving the emotional well-being of users by helping them reintegrate into social and professional settings.
[0003] Prosthetic arms have evolved with various technologies designed to replicate the function of a natural limb. Equipment for prosthetic arms includes myoelectric prostheses, which use electrical signals from the residual muscles to control the movement of the prosthetic. These devices are equipped with sensors that detect muscle contractions, allowing for sophisticated control of hand and arm movements. Additionally, body-powered prosthetics use cables and mechanical systems to enable movement through physical exertion. Advanced prosthetics also feature powered joints, sensors, and even artificial intelligence to improve functionality and adaptability. However, these prosthetic devices come with certain drawbacks. Myoelectric prosthetics, while offering better dexterity and control, can be expensive and require frequent maintenance. The complexity of their components may lead to malfunctions or require recalibration, especially in environments with dirt, moisture, or extreme temperatures. Body-powered prosthetics, on the other hand, are often more durable and cost-effective but may provide less natural movement and require more effort from the user. Both types of prosthetics can be uncomfortable for long-term wear, with issues like skin irritation or pressure sores. Moreover, many prosthetic arms still lack the sensitivity to perform intricate tasks or provide fine motor control, limiting their effectiveness in daily life. Additionally, access to high-quality prosthetics remains a challenge in many regions due to cost and availability.
[0004] US2015257903A1 discloses a system for powering a prosthetic arm. The system includes at least one internal battery located in the prosthetic arm, at least one external battery connected to the prosthetic arm, and a master controller configured to connect either the at least one internal battery or the at least one external battery to a power bus to power the prosthetic arm.
[0005] US2008288088A1 discloses a prosthetic arm apparatus comprising a plurality of segments that provide a user of the prosthetic arm apparatus with substantially the same movement capability and function as a human arm.
[0006] Conventionally, many arms have been developed to provide prosthetic support; however, the arms mentioned in the prior art lack the capability to recognize sign language gestures from another person and convert them into spoken language, which would enable better communication for the user.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop an arm that is capable of providing prosthetic arm support to a user, adjusting to fit the user's limb stub securely, and identifying the type and hardness of objects to be gripped so that the right amount of pressure is applied to avoid damaging them. The developed arm further needs to store data about gripped objects and the user's movements to enhance future gripping actions.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop an arm that is capable of providing prosthetic arm support to a user and adjusts in accordance with the stub of the user, thereby enabling secure affixing to the user.
[0010] Another object of the present invention is to develop an arm that is capable of determining the type and hardness of an object to be gripped in accordance with the user's requirement, so that appropriate pressure is applied while gripping the object, preventing any damage to it.
[0011] Another object of the present invention is to develop an arm that is capable of storing information about objects gripped by the user and the movements of the user while gripping them, in order to improve the gripping actuations of the arm.
[0012] Yet another object of the present invention is to develop an arm that is capable of detecting the sign language of a person communicating with the user, translating the sign language into spoken language, and informing the user accordingly for effective communication.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to an adjustable prosthetic arm that is capable of providing stable support, adjusting to fit the user's limb stub securely, and further identifying object types and hardness, applying precise pressure to grip without causing damage.
[0015] According to an embodiment of the present invention, an adjustable prosthetic arm comprises an upper part comprising an elongated cylindrical telescopic member having a cushioned concave flap configured with an expandable pulley disposed at an upper end of the part for affixing onto a stub of a user; a lower part connected with the upper part by means of a first pivot joint, the lower part comprising an elongated cylindrical telescopic structure having a hand unit attached by means of a second pivot joint for providing dexterity to the user, the hand unit comprising a palm portion and five elongated fingers configured with pin joints to simulate the structure and movements of a human hand; and a laser sensor embedded on the upper part for detecting the diameter of the stub and the dimensions of the user, which triggers a microcontroller to actuate the expandable pulley to expand or retract the flap to secure the upper part against the stub, and to expand or retract the member and the structure as per the dimensions of the user.
[0016] According to another embodiment of the present invention, the proposed arm further comprises an artificial intelligence-based imaging unit installed on the lower part and integrated with a processor for recording and processing images in the vicinity of the lower part, in synchronisation with a tactile sensor embedded in the fingers, to determine the type and hardness of the object to be gripped; a gesture sensor embedded in the upper part to detect the sign language of a person communicating with the user and trigger the microcontroller to actuate a speaker provided on the upper part that converts the sign language into spoken language for the user's reference; and a plurality of pneumatic pins embedded in the fingers and the palm to provide an enhanced grip on objects.
[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of an adjustable prosthetic arm.
DETAILED DESCRIPTION OF THE INVENTION
[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description that the invention is not limited to the illustrated embodiments but also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0022] The present invention relates to an adjustable prosthetic arm that is capable of offering prosthetic arm support and adjusting to a user's limb stub for secure attachment; it further identifies object characteristics such as type and hardness, applies appropriate gripping pressure to avoid harm, and keeps track of gripped objects and the user's movements, enhancing future gripping accuracy.
[0023] Referring to Figure 1, an isometric view of an adjustable prosthetic arm is illustrated, comprising an upper part 104 having an elongated cylindrical telescopic member 101 with a cushioned concave flap 102 configured with an expandable pulley 103; a lower part 105 connected with the upper part 104 by means of a first pivot joint 106; an elongated cylindrical telescopic structure 107 having a hand unit 108 attached with the lower part 105 by means of a second pivot joint 109; an artificial intelligence-based imaging unit 110 installed on the lower part 105; a plurality of pneumatic pins 111 embedded in the fingers and palm of the hand unit 108; and a speaker 112 provided on the upper part 104.
[0024] The proposed invention includes an arm having a telescopic member 101, preferably of cylindrical shape, at an upper part 104 that incorporates various components associated with the arm and is developed to be positioned on a stub of a user. The member 101 is pneumatically powered by a pneumatic arrangement associated with the arm, providing extension/retraction of the member 101. The member 101 is configured with an expandable pulley 103 integrated with a cushioned concave flap 102 arranged within the member 101. The pulley 103 affixes the upper end of the arm to the stub of the user. The arm is made of any material selected from, but not limited to, a metal or an alloy that ensures the rigidity and longevity of the arm.
[0025] The user is required to press a switch button arranged on the arm to activate the arm and its associated processes. The switch button, when pressed by the user, closes an electrical circuit and allows current to flow, powering the associated microcontroller of the arm so that all the linked components can perform their respective functions upon actuation.
[0026] The microcontroller mentioned herein is preferably an Arduino microcontroller, which controls the overall functionality of the components linked to it. Arduino is an open-source electronics prototyping platform.
[0027] The upper part 104 is connected with a lower part 105 of the arm by means of a first pivot joint 106. The lower part 105 comprises an elongated cylindrical telescopic structure 107 which is pneumatically powered by a pneumatic arrangement associated with the arm providing extension/retraction of the structure 107.
[0028] After activation of the arm, the user accesses a user interface installed in a computing unit linked with the microcontroller wirelessly by means of a wireless communication module. The user interface enables the user to provide input regarding affixing of the arm to the user's stub. The communication module includes, but is not limited to, a Wi-Fi (Wireless Fidelity) module, a Bluetooth module, or a GSM (Global System for Mobile Communication) module.
[0029] Upon receiving the user input, the microcontroller generates a command to activate an artificial intelligence-based imaging unit 110 integrated on the lower part 105, which captures multiple images of the user's stub to determine its dimensions. The imaging unit 110 incorporates a processor that is configured with an artificial intelligence protocol. The artificial intelligence protocol operates by following a set of predefined instructions to process data and perform tasks autonomously. Initially, data is collected and input into a database, which the protocol then employs to analyze and interpret the captured images. The processor of the imaging unit 110, via the artificial intelligence protocol, processes the captured images and sends the resulting signal to the microcontroller.
[0030] In sync with the imaging unit 110, a laser sensor is embedded on the upper part 104 for detecting the diameter of the stub and the dimensions of the user. The laser sensor used herein is a measurement recorder working with laser technology that turns the physical measured value into an analog electric signal. The laser sensor is conceived for contactless measurement based on the triangulation principle, in which the measurement is determined by angle calculation: the sensor projects a laser spot on the stub, the reflected light falls incident onto a receiving unit at an angle that depends on the distance, and the received light is converted into signals that are sent to the microcontroller. The microcontroller then processes the received signals to determine the dimensions of the user's stub.
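The triangulation principle described above can be illustrated with a minimal sketch; the function name and parameter values are illustrative only and not part of the specification. For an emitter-to-lens baseline b and receiving-lens focal length f, a reflected spot imaged at offset x on the detector corresponds, by similar triangles, to a range z = f·b/x:

```python
def spot_distance_mm(baseline_mm: float, focal_mm: float, spot_offset_mm: float) -> float:
    """Range to the laser spot from similar triangles: z = f * b / x.

    baseline_mm: separation between the laser emitter and the receiving lens.
    focal_mm: focal length of the receiving lens.
    spot_offset_mm: position of the reflected spot on the detector.
    """
    if spot_offset_mm <= 0:
        raise ValueError("spot offset must be positive")
    return focal_mm * baseline_mm / spot_offset_mm
```

For example, with a 20 mm baseline, a 10 mm focal length, and a 2 mm detector offset, the computed range is 100 mm; as the target moves closer, the spot offset grows and the computed range shrinks accordingly.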
[0031] In accordance with the dimensions of the user's stub, the microcontroller actuates the expandable pulley 103 of the member 101, adjusting the upper end of the member 101 to accommodate the stub. The pulley 103 is made with multiple cylinders, each connected with one section of the flap 102. All cylinders are connected with a rotating lever placed at the center with a gear set; when the lever rotates, the cylinders expand to increase the size of the member 101 for affixing the member 101 onto the stub of the user.
[0032] The structure 107 of the lower part 105 is configured with a hand unit 108 comprising a palm portion and five elongated fingers configured with pin joints. The structure 107 provides dexterity to the user, and the hand unit 108 serves to simulate the structure and movements of a human hand.
[0033] The command for gripping an object is provided by the user from the computing unit. The imaging unit 110 works in sync with a tactile sensor embedded in the fingers to determine the type and hardness of the object to be gripped. Accordingly, the microcontroller actuates an air compressor and air valve associated with the pneumatic arrangement, which consists of an air cylinder, air valve, and piston working in collaboration to aid in the extension and retraction of the member 101 and structure 107.
[0034] The air valve associated with the pneumatic arrangement allows entry/exit of compressed air from the compressor. When the valve opens, compressed air enters the cylinder, increasing the air pressure inside it. The piston is connected to the member 101 and structure 107, and the increase in air pressure extends the piston. For retraction of the piston, air is released from the cylinder back to the air compressor via the valve, thus providing the required extension/retraction of the member 101 and structure 107 for positioning the lower part 105 in proximity to the object. All the pneumatically operated components associated with the arm comprise the same type of pneumatic arrangement.
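The valve logic described above amounts to a simple bang-bang decision per control cycle; the sketch below is a hypothetical illustration (the command strings and tolerance value are placeholders, not part of the specification):

```python
def valve_command(target_mm: float, current_mm: float, tol_mm: float = 0.5) -> str:
    """Bang-bang valve decision for one telescopic piston.

    Admitting compressed air raises cylinder pressure and extends the piston;
    venting air back toward the compressor retracts it.
    """
    if current_mm < target_mm - tol_mm:
        return "OPEN_INLET"    # pressurize the cylinder -> piston extends
    if current_mm > target_mm + tol_mm:
        return "OPEN_EXHAUST"  # vent the cylinder -> piston retracts
    return "HOLD"              # within tolerance: keep both valves closed
```

The same decision rule would apply to every pneumatically operated component of the arm, since they share the same type of pneumatic arrangement.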
[0035] Simultaneously, the microcontroller actuates the first pivot joint 106 and the second pivot joint 109 in order to provide multi-directional movement to the structure 107 and the hand unit 108, respectively, for gripping the object.
[0036] The pivot joint allows the connected parts to rotate relative to each other around a central axis. The first and second pivot joints 106, 109 facilitate movement by providing a controlled range of motion. Each joint consists of a pin or axle that fits into a hole or sleeve, allowing one part to rotate around it, and may include bearings or lubricants to reduce friction and wear. By enabling angular movement, the pivot joints provide the flexibility and adaptability crucial for tasks requiring precision and dexterity, such as gripping or manipulating objects.
[0037] The microcontroller then actuates the pin joints of the fingers to rotate the fingers and grip the object. Each pin joint comprises a ring and a cylindrical portion linked with each other to provide rotational movement to the fingers. The ring is powered by a motor that is activated by the microcontroller to rotate the ring and move the cylindrical portion, causing the fingers to tilt. The motor is typically controlled by an electronic control unit that regulates its speed and direction. The joint includes a hinge mechanism that enables rotation of the shaft, resulting in the rotational motion of the fingers that produces the gripping movement.
[0038] In accordance with the gripping of the object, the tactile sensor monitors the hardness of the object by measuring the force of contact between the sensor and the object. The sensor is typically a small, flat component that is placed against the object and then pressed down. As the force of contact increases, the sensor measures the amount of pressure being applied and sends a signal to the microcontroller, which interprets the signal and determines the hardness of the object.
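One way the microcontroller could interpret the tactile signal is to treat the ratio of contact force to indentation as a crude contact stiffness and bin it into hardness classes. The sketch below is purely illustrative; the thresholds are uncalibrated placeholders, not figures from the specification:

```python
def estimate_hardness(contact_force_n: float, indentation_mm: float) -> str:
    """Classify object hardness from a crude contact stiffness (force / indentation).

    Threshold values are illustrative placeholders, not calibrated figures.
    """
    if indentation_mm <= 0:
        raise ValueError("indentation must be positive")
    stiffness = contact_force_n / indentation_mm  # N/mm
    if stiffness < 1.0:
        return "soft"
    if stiffness < 10.0:
        return "medium"
    return "hard"
```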
[0039] The fingers of the hand unit 108 are equipped with a pressure sensor that monitors the pressure applied while gripping the object. The pressure sensor comprises a sensing element, known as a diaphragm, that experiences the force exerted by the hand unit 108 on the object. This force deflects the diaphragm; the deflection is measured and converted into an electrical signal which is sent to the microcontroller to prevent damage to the object.
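The damage-prevention behavior can be sketched as a simple proportional correction applied each control cycle, clamped so the grip never changes abruptly. This is a hypothetical control sketch (gain and step limit are assumed values, not part of the specification):

```python
def grip_pressure_step(measured_kpa: float, target_kpa: float,
                       gain: float = 0.1, max_step_kpa: float = 1.0) -> float:
    """Proportional grip correction, clamped per control cycle.

    A positive return value tightens the grip; a negative one relaxes it,
    keeping the applied pressure near the damage-free target.
    """
    error = target_kpa - measured_kpa
    step = gain * error
    return max(-max_step_kpa, min(max_step_kpa, step))
```

For instance, if the measured pressure overshoots the target, the returned step is negative and the microcontroller relaxes the fingers, protecting a fragile object.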
[0040] The fingers of the hand unit 108 are configured with multiple pneumatic pins 111 which are pneumatically powered. The extension/retraction of the pins is provided by a pneumatic arrangement associated with the arm, whose working is similar to that of the member 101 and structure 107 as mentioned above.
[0041] The upper part 104 is embedded with a gesture sensor to detect the sign language of a person communicating with the user. The gesture sensor detects and interprets sign language by capturing hand movements, gestures, and positions through sensors such as infrared or capacitive touch technology. The sensor is typically connected to a machine learning protocol that maps these gestures into corresponding words or phrases. As the other person communicates using sign language, the sensor tracks the motions and translates them into digital data, which is then interpreted by the microcontroller to output text or speech. The sensor's accuracy depends on its ability to recognize specific hand shapes, movements, and the context of the signs, enabling real-time communication with the user.
[0042] In accordance with the detected sign language, the microcontroller converts the sign language into spoken language for the user's reference and informs the user via a speaker 112 mounted on the upper part 104.
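The final mapping stage, from recognized gesture labels to a phrase for the speaker, can be sketched as a lookup table. The gesture labels and phrases below are hypothetical placeholders; a real vocabulary would be learned by the machine learning protocol described above:

```python
# Hypothetical gesture-label -> phrase table; real vocabularies are learned.
GESTURE_PHRASES = {
    "flat_palm_wave": "hello",
    "thumb_up": "yes",
    "index_middle_cross": "friend",
}

def gestures_to_speech(labels: list) -> str:
    """Map a sequence of recognized gesture labels to a phrase string
    that the microcontroller can pass to the speaker for playback."""
    return " ".join(GESTURE_PHRASES.get(label, "[unrecognized]") for label in labels)
```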
[0043] The speaker 112 takes the input signal from the microcontroller, processes and amplifies it through a series of components in a specific order within the speaker 112, and then outputs the signal as an audio notification informing the user of the other person's sign language.
[0044] The routine pattern of objects gripped by the user is monitored by the imaging unit 110 and stored in a database associated with the arm and linked with the microcontroller. The user's movements while gripping the objects are also stored in the database for further usage in improving the gripping actuations of the arm.
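One minimal way such a database could improve future gripping is to log each grip and suggest a starting pressure from past grips of the same object type. The class and field names below are illustrative assumptions, not part of the specification:

```python
from statistics import mean

class GripHistory:
    """Stores past grips and suggests a starting pressure for known objects."""

    def __init__(self) -> None:
        self._records = []  # list of (object_type, pressure_kpa) tuples

    def log(self, object_type: str, pressure_kpa: float) -> None:
        self._records.append((object_type, pressure_kpa))

    def suggested_pressure(self, object_type: str, default_kpa: float = 10.0) -> float:
        # Average of past successful grips for this object type, if any.
        past = [p for t, p in self._records if t == object_type]
        return mean(past) if past else default_kpa
```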
[0045] A battery (not shown in the figure) is associated with the arm to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, a cathode and an anode, and uses oxidation/reduction chemical reactions to do work on charge, producing a voltage between the anode and cathode and thus the electrical energy used to do work in the arm.
[0046] The present invention works best in the following manner, where the upper part 104 consisting of the telescopic cylindrical member 101 as disclosed in the present invention is fitted with the cushioned concave flap 102 and expandable pulley 103 at its end, allowing it to be securely attached to the user's stub. The lower part 105 is pivotally connected to the upper part 104 and contains the telescopic structure 107 that enhances flexibility, along with the hand unit 108 designed to simulate the human hand. The hand unit 108 includes the palm portion and five fingers with pin joints, enabling realistic movement. The laser sensor on the upper part 104 detects the diameter of the stub and the user’s dimensions, triggering the microcontroller to adjust the expandable pulley 103 and flap 102 to securely attach the arm. The imaging unit 110 and tactile sensors in the fingers work together to identify objects, assess their hardness, and allow the fingers to grip them effectively. The microcontroller actuates the pin joints in the fingers, while the pressure sensor ensures the grip is gentle enough to avoid damage to the object. Additionally, the arm features the gesture sensor for detecting sign language, the pneumatic arrangement for better grip, and the database to optimize gripping actions based on past interactions. The wireless module enables remote control via the computing unit, providing enhanced functionality for the user.
[0047] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) An adjustable prosthetic arm, comprising:
i) an upper part 104 comprising an elongated cylindrical telescopic member 101 having a cushioned concave flap 102 configured with an expandable pulley 103 disposed at an upper end of said part for affixing onto a stub of a user;
ii) a lower part 105 connected with said upper part 104 by means of a first pivot joint 106, wherein said lower part 105 comprises an elongated cylindrical telescopic structure 107 having a hand unit 108 attached with said lower part 105 by means of a second pivot joint 109 for providing dexterity to said user;
iii) said hand unit 108 comprises a palm portion and five elongated fingers configured with pin joints to simulate the structure and movements of a human hand;
iv) a laser sensor embedded on said upper part 104 for detecting a diameter of said stub and dimensions of said user, to trigger a microcontroller to actuate said expandable pulley 103 to expand or retract said flap 102 for securing said upper part 104 against said stub, and said member 101 and said structure 107 to expand or retract as per the dimensions of said user; and
v) an artificial intelligence-based imaging unit 110, installed on said lower part 105 and integrated with a processor for recording and processing images in a vicinity of said lower part 105, in synchronisation with a tactile sensor embedded in said fingers, to determine a type of object and hardness of said object to be gripped, to trigger said microcontroller to actuate said pin joints to rotate said fingers to grip said object, wherein a pressure sensor embedded in said fingers regulates a gripping pressure applied by said fingers to prevent damage to said object.
2) The arm as claimed in claim 1, wherein a gesture sensor embedded in said upper part 104 to detect sign language of a person communicating with said user triggers said microcontroller to actuate a speaker 112 provided on said upper part 104 to convert said sign language into spoken language for a reference of said user.
3) The arm as claimed in claim 1, wherein a plurality of pneumatic pins 111 are embedded in said fingers and said palm to provide an enhanced grip onto objects.
4) The arm as claimed in claim 1, wherein a database, linked with said microcontroller, stores objects gripped by said user and movements of said user for gripping said objects to improve gripping actuations of said arm.
5) The arm as claimed in claim 1, wherein a wireless communication module, provided within said upper part 104 and linked with said microcontroller, enables said user to operate said arm by connecting via a computing unit.
| # | Name | Date |
|---|---|---|
| 1 | 202421094377-STATEMENT OF UNDERTAKING (FORM 3) [01-12-2024(online)].pdf | 2024-12-01 |
| 2 | 202421094377-REQUEST FOR EXAMINATION (FORM-18) [01-12-2024(online)].pdf | 2024-12-01 |
| 3 | 202421094377-REQUEST FOR EARLY PUBLICATION(FORM-9) [01-12-2024(online)].pdf | 2024-12-01 |
| 4 | 202421094377-POWER OF AUTHORITY [01-12-2024(online)].pdf | 2024-12-01 |
| 5 | 202421094377-FORM-9 [01-12-2024(online)].pdf | 2024-12-01 |
| 6 | 202421094377-FORM FOR SMALL ENTITY(FORM-28) [01-12-2024(online)].pdf | 2024-12-01 |
| 7 | 202421094377-FORM 18 [01-12-2024(online)].pdf | 2024-12-01 |
| 8 | 202421094377-FORM 1 [01-12-2024(online)].pdf | 2024-12-01 |
| 9 | 202421094377-FIGURE OF ABSTRACT [01-12-2024(online)].pdf | 2024-12-01 |
| 10 | 202421094377-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [01-12-2024(online)].pdf | 2024-12-01 |
| 11 | 202421094377-EVIDENCE FOR REGISTRATION UNDER SSI [01-12-2024(online)].pdf | 2024-12-01 |
| 12 | 202421094377-EDUCATIONAL INSTITUTION(S) [01-12-2024(online)].pdf | 2024-12-01 |
| 13 | 202421094377-DRAWINGS [01-12-2024(online)].pdf | 2024-12-01 |
| 14 | 202421094377-DECLARATION OF INVENTORSHIP (FORM 5) [01-12-2024(online)].pdf | 2024-12-01 |
| 15 | 202421094377-COMPLETE SPECIFICATION [01-12-2024(online)].pdf | 2024-12-01 |
| 16 | Abstract.jpg | 2024-12-26 |
| 17 | 202421094377-FORM-26 [03-06-2025(online)].pdf | 2025-06-03 |