Abstract: The present disclosure relates to a system for controlling an exoskeleton. The system includes an eyewear operably coupled with the exoskeleton. The eyewear includes an imaging device and a sensing device operatively coupled with the imaging device. The imaging device and the sensing device are configured to monitor one or more eye parameters of a user wearing the eyewear and correspondingly generate a set of first signals. One or more processing units are operatively coupled with the eyewear. The one or more processing units include a processor associated with a memory and are configured to receive the set of first signals from the eyewear, extract the one or more eye parameters from the received set of first signals, match the extracted one or more eye parameters with pre-defined parameters stored in a dataset, and accordingly generate a set of control signals representing the movement of one or more limbs of the exoskeleton.
[0001] The present disclosure relates to the field of exoskeletons, and more particularly to controlling an exoskeleton using eye parameters.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Exoskeletons are structures of rigid links mounted on a body that restore, rehabilitate, or enhance the human motor function. Effective use of exoskeletons for restoration or enhancement of motor function has potentially widespread applications in areas such as rehabilitation robotics, injury prevention, performance enhancement, and in helping humans with disabilities or compromised neuromuscular function. However, conventional exoskeleton systems are at an early stage of development, with some progress having been made in the field of rehabilitation.
[0004] Conventional methods of operating exoskeletons rely on brain-reading techniques to detect the signals sent out by the brain for desired movements. However, no sensor or instrument is currently available that can reliably identify the desired movement of a person: while the type of brain wave (alpha, beta, etc.) can be read, extracting information about the desired task from that data is a difficult and cumbersome process, as substantial computation is required and, even then, the results are not very accurate. In particular, the prediction of a desired action from EEG is not accurate enough for an exoskeleton suit designed to support the user. Other techniques exist, such as electrodes embedded inside the human skin to obtain more precise data, but these are harmful, as they can damage brain tissue and cause temporary or permanent memory loss. Moreover, the methods described above involve very complex computations and thus increase the cost of the exoskeleton.
[0005] There is, therefore, a requirement to have a technique of controlling the movement of different limbs of the exoskeleton at a reduced cost and complexity.
OBJECTS OF THE PRESENT DISCLOSURE
[0006] It is an object of the present disclosure to provide a system for controlling the exoskeleton using eye parameters.
[0007] It is an object of the present disclosure to provide a system for controlling the exoskeleton at reduced cost.
[0008] It is an object of the present disclosure to provide a system for controlling the exoskeleton at reduced complexity.
SUMMARY
[0009] The present disclosure relates to the field of exoskeletons, and more particularly to controlling an exoskeleton using eye parameters.
[0010] An aspect of the present disclosure relates to a system for controlling an exoskeleton. The system includes an eyewear operably coupled with the exoskeleton. The eyewear includes an imaging device and a sensing device operatively coupled with the imaging device. The imaging device and the sensing device are configured to monitor one or more eye parameters of a user wearing the eyewear and correspondingly generate a set of first signals. One or more processing units are operatively coupled with the eyewear. The one or more processing units include a processor associated with a memory and are configured to receive the set of first signals from the eyewear, extract the one or more eye parameters from the received set of first signals, match the extracted one or more eye parameters with pre-defined parameters stored in a dataset, and accordingly generate a set of control signals representing the movement of one or more limbs of the exoskeleton.
[0011] In an aspect, the eye parameters may include any or a combination of eyelid blink rate, gaze time, and eyeball movement rate.
[0012] In an aspect, the exoskeleton may include one or more actuating devices operatively coupled with the one or more processing units and configured to move, based on the set of control signals, the one or more limbs of the exoskeleton.
[0013] In an aspect, the one or more actuating devices may include any or a combination of an electromagnetic actuator, a magnetic actuator, and a pneumatic actuator.
[0014] In an aspect, the imaging device may include a camera.
[0015] In an aspect, the sensing device may include an infrared (IR) sensor.
[0016] In an aspect, the system may include a communication unit configured to communicatively couple the eyewear, the one or more processing units, and the exoskeleton.
[0017] In an aspect, the communication unit may include any or a combination of a micro USB connector, a Bluetooth module, a WiFi module, and a GSM module.
[0018] In another aspect, the present disclosure relates to a method of controlling an exoskeleton using the eyes. The method includes detecting, by an imaging device and a sensing device, one or more eye parameters of a user; generating, by the imaging device and the sensing device, a set of first signals based on the detected one or more eye parameters; receiving, by one or more processing units operatively coupled with the imaging device and the sensing device, the set of first signals; extracting, by the one or more processing units, the one or more eye parameters from the set of first signals; matching, by the one or more processing units, the extracted one or more eye parameters with pre-defined parameters stored in a dataset; generating, by the one or more processing units, a set of control signals representing the movement of one or more limbs of the exoskeleton; and moving, by one or more actuating devices operatively coupled with the one or more processing units, the one or more limbs of the exoskeleton based on the received set of control signals.
[0019] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF DRAWINGS
[0020] The accompanying drawings are included to provide a further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0021] FIG. 1 illustrates an exemplary block diagram of a proposed system for controlling the exoskeleton, in accordance with an embodiment of the present disclosure.
[0022] FIG. 2 illustrates an exemplary system for controlling the exoskeleton, in accordance with an embodiment of the present disclosure.
[0023] FIG. 3 illustrates an exemplary method for controlling the exoskeleton, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0024] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0025] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0026] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0027] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. These exemplary embodiments are provided only for illustrative purposes and so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. The invention disclosed may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Various modifications will be readily apparent to persons skilled in the art. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Moreover, all statements herein reciting embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Also, the terminology and phraseology used is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For purpose of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention.
[0028] The use of any and all examples, or exemplary language (e.g., “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non – claimed element essential to the practice of the invention.
[0029] The present disclosure relates to the field of exoskeletons, and more particularly to controlling an exoskeleton using eye parameters.
[0030] An embodiment of the present disclosure elaborates upon a system for controlling an exoskeleton. The system includes an eyewear operably coupled with the exoskeleton. The eyewear includes an imaging device and a sensing device operatively coupled with the imaging device. The imaging device and the sensing device are configured to monitor one or more eye parameters of a user wearing the eyewear and correspondingly generate a set of first signals. One or more processing units are operatively coupled with the eyewear. The one or more processing units include a processor associated with a memory and are configured to receive the set of first signals from the eyewear, extract the one or more eye parameters from the received set of first signals, match the extracted one or more eye parameters with pre-defined parameters stored in a dataset, and accordingly generate a set of control signals representing the movement of one or more limbs of the exoskeleton.
[0031] In an embodiment, the eye parameters can include any or a combination of eyelid blink rate, gaze time, and eyeball movement rate.
[0032] In an embodiment, the exoskeleton can include one or more actuating devices operatively coupled with the one or more processing units and configured to move, based on the set of control signals, the one or more limbs of the exoskeleton.
[0033] In an embodiment, the one or more actuating devices can include any or a combination of an electromagnetic actuator, a magnetic actuator, and a pneumatic actuator.
[0034] In an embodiment, the imaging device can include a camera.
[0035] In an embodiment, the sensing device can include an infrared (IR) sensor.
[0036] In an embodiment, the system can include a communication unit configured to communicatively couple the eyewear, the one or more processing units, and the exoskeleton.
[0037] In an embodiment, the communication unit can include any or a combination of a micro USB connector, a Bluetooth module, a WiFi module, and a GSM module.
[0038] In another embodiment, the present disclosure relates to a method of controlling an exoskeleton using the eyes. The method includes detecting, by an imaging device and a sensing device, one or more eye parameters of a user; generating, by the imaging device and the sensing device, a set of first signals based on the detected one or more eye parameters; receiving, by one or more processing units operatively coupled with the imaging device and the sensing device, the set of first signals; extracting, by the one or more processing units, the one or more eye parameters from the set of first signals; matching, by the one or more processing units, the extracted one or more eye parameters with pre-defined parameters stored in a dataset; generating, by the one or more processing units, a set of control signals representing the movement of one or more limbs of the exoskeleton; and moving, by one or more actuating devices operatively coupled with the one or more processing units, the one or more limbs of the exoskeleton based on the received set of control signals.
[0039] FIG. 1 illustrates an exemplary block diagram 100 of a proposed system for controlling the exoskeleton, in accordance with an embodiment of the present disclosure.
[0040] As illustrated, the proposed system 100 for controlling the exoskeleton can include an eyewear 102. The eyewear 102 can be, but is not limited to, a spectacle, and an image capturing device (e.g., a camera) can be mounted on the spectacle. The eyewear 102 can be operably coupled with the exoskeleton. The eyewear 102 can include an imaging device (not shown) and a sensing device (not shown) operatively coupled with the imaging device. The imaging device and the sensing device can be configured to monitor one or more eye parameters of a user wearing the eyewear and correspondingly generate a set of first signals. The imaging device can be, but is not limited to, a camera, and the sensing device can be, but is not limited to, an infrared (IR) sensor. One or more processing units 104 can be operatively coupled with the eyewear 102. The one or more processing units 104 can include one or more processors associated with a memory and can be configured to receive the set of first signals from the eyewear 102 and extract the one or more eye parameters from the received set of first signals. The one or more processing units 104 can be configured to match the extracted one or more eye parameters with pre-defined parameters stored in a dataset, and accordingly generate a set of control signals representing the movement of one or more limbs 108 of the exoskeleton.
[0041] In an embodiment, the eye parameters can include, but are not limited to, eyelid blink rate, gaze time, and eyeball movement rate. The exoskeleton can include one or more actuating devices 106 operatively coupled with the one or more processing units 104 and configured to move, based on the set of control signals, the one or more limbs 108 of the exoskeleton. The one or more actuating devices 106 can include, but are not limited to, an electromagnetic actuator, a magnetic actuator, and a pneumatic actuator. The system can include a communication unit configured to communicatively couple the eyewear 102, the one or more processing units 104, and the exoskeleton. The communication unit can include any or a combination of a micro USB connector, a Bluetooth module, a WiFi module, and a GSM module.
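The relationship described above between monitored eye parameters and generated control signals can be illustrated with a minimal sketch. This is not the disclosed implementation: the class, function, and signal names, as well as all threshold values, are hypothetical assumptions introduced purely for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class EyeParameters:
    blink_rate: float             # eyelid blinks per minute
    gaze_time_s: float            # gaze dwell time in seconds
    eyeball_movement_rate: float  # eyeball movements per second

# Pre-defined parameters stored in a dataset: each entry pairs a parameter
# pattern (expressed as a predicate) with a limb-movement control signal.
# The thresholds below are illustrative assumptions only.
DATASET: List[Tuple[Callable[[EyeParameters], bool], str]] = [
    (lambda p: p.blink_rate > 40 and p.gaze_time_s < 1.0, "move_right_arm"),
    (lambda p: p.gaze_time_s >= 2.0, "move_left_leg"),
    (lambda p: p.eyeball_movement_rate > 3.0, "move_right_leg"),
]

def generate_control_signal(params: EyeParameters) -> Optional[str]:
    """Match the extracted eye parameters against the dataset and return
    the associated control signal, or None if no pattern matches."""
    for matches, command in DATASET:
        if matches(params):
            return command
    return None  # no pre-defined pattern matched; limbs hold position
```

In this sketch the actuating devices would receive the returned string and drive the corresponding limb; an unmatched parameter set produces no control signal, leaving the exoskeleton stationary.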
[0042] FIG. 2 illustrates an exemplary module diagram 200 of the system for controlling the exoskeleton, in accordance with an embodiment of the present disclosure.
[0043] As illustrated, the module diagram 200 of the system may comprise one or more processor(s) 202. The one or more processor(s) 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 204 of the system. The memory 204 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 204 may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0044] The system may also comprise an interface(s) 206. The interface(s) 206 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may facilitate communication of the system and may also provide a communication pathway for one or more components of the system. Examples of such components include, but are not limited to, processing engine(s) 208 and data 210.
[0045] The processing engine(s) 208 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 208. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 208 may be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 208 may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 208. In such examples, the system may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to system and the processing resource. In other examples, the processing engine(s) 208 may be implemented by electronic circuitry.
[0046] The data 210 may comprise data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 208 or the system. With reference to FIG. 1, the present disclosure relates to a system for controlling an exoskeleton using one or more eye parameters. Pre-defined eye parameters and the respective limb operations can be stored in the dataset, which can be used by the system to detect the desired operation of the limbs by tracking the eye parameters. For example, two consecutive blinks of the left eyelid within an interval of two seconds can be associated with movement of the right arm or right leg. Similarly, a number of eye parameters can be associated with movements of one or more limbs 108 of the exoskeleton (this collection of associations is also referred to as the dataset, herein) and can be stored in the memory. The system can include an eyewear 102 that can be worn by a user. The eyewear 102 can be used to track the eye parameters of the user and generate a set of first signals representing the eye parameters (also referred to as the one or more eye parameters, herein). The set of first signals can be sent to the one or more processing units 104 having one or more processors 202 associated with the memory 204. The set of first signals can be received by the one or more processing units 104 using a receiving engine 212.
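The worked example above (two consecutive left-eyelid blinks within two seconds mapping to a right-arm movement) can be sketched as follows. The event format, function name, and command string are hypothetical assumptions for illustration, not part of the disclosure.

```python
from typing import List, Optional, Tuple

def detect_limb_command(
    blink_events: List[Tuple[float, str]], max_interval_s: float = 2.0
) -> Optional[str]:
    """blink_events: (timestamp_s, eyelid) tuples, with eyelid in
    {"left", "right"}. Returns the limb command associated with the
    detected blink pattern, or None if no pattern is found."""
    left_times = [t for t, eyelid in blink_events if eyelid == "left"]
    # Two consecutive left-eyelid blinks within the interval trigger the
    # pre-defined right-arm movement stored in the dataset.
    for earlier, later in zip(left_times, left_times[1:]):
        if later - earlier <= max_interval_s:
            return "move_right_arm"
    return None

events = [(0.0, "left"), (1.2, "left"), (5.0, "right")]
command = detect_limb_command(events)  # two left blinks 1.2 s apart
```

A richer dataset would map many such timing patterns to different limbs; this sketch shows only the single association described in the paragraph above.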
[0047] In an embodiment, the processors 202 can extract the eye parameters from the set of first signals using an extraction engine 214, and the extracted eye parameters can then be matched, using a matching engine 216, with the dataset to identify the limb movement associated with the extracted eye parameters. Based on the matching results, a set of control signals is generated representing the desired or associated limb movement. The set of control signals is sent to the one or more actuating devices 106 for moving the associated limbs of the exoskeleton.
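The engine chain of FIG. 2 (receiving engine 212, extraction engine 214, matching engine 216) can be expressed as a simple function pipeline. This is a sketch under assumptions: the signal dictionary format and the string keys are invented stand-ins, and each engine's internals are reduced to a single step.

```python
from typing import Dict, Optional

def receiving_engine(raw_signals: Dict) -> Dict:
    """Engine 212: accept the set of first signals from the eyewear."""
    return raw_signals

def extraction_engine(first_signals: Dict) -> str:
    """Engine 214: extract the eye parameters from the first signals."""
    return first_signals["eye_parameters"]

def matching_engine(params: str, dataset: Dict[str, str]) -> Optional[str]:
    """Engine 216: match the extracted parameters against the dataset and
    return the associated limb-movement control signal, if any."""
    return dataset.get(params)

# Hypothetical dataset entry and first signal:
dataset = {"double_blink_left": "right_arm"}
first_signals = {"eye_parameters": "double_blink_left"}
control = matching_engine(extraction_engine(receiving_engine(first_signals)), dataset)
```

Separating the three stages this way mirrors the modular engine structure of FIG. 2, so any one engine could be reimplemented (for example, in hardware) without changing the others.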
[0048] FIG. 3 illustrates an exemplary method 300 for controlling the Exoskeleton, in accordance with an embodiment of the present disclosure.
[0049] As illustrated, in step 302, one or more eye parameters of a user wearing the eyewear 102 can be detected by an imaging device and a sensing device present in the eyewear 102. In step 304, a set of first signals based on the detected one or more eye parameters can be generated by the imaging device and the sensing device. In step 306, the set of first signals is received by the one or more processing units operatively coupled with the imaging device and the sensing device. In step 308, the one or more eye parameters are extracted from the set of first signals by the one or more processing units. In step 310, the extracted one or more eye parameters are matched, by the one or more processing units, with the pre-defined parameters stored in a dataset. In step 312, a set of control signals representing the movement of one or more limbs of the exoskeleton is generated by the one or more processing units. In step 314, the one or more limbs of the exoskeleton are moved, based on the received set of control signals, by the one or more actuating devices operatively coupled with the one or more processing units.
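Method 300 can be sketched end to end as three cooperating components. The classes below are hypothetical stand-ins for the eyewear (imaging and sensing devices), the processing unit, and the actuating devices; the canned signal values and the dataset key format are assumptions made for the sake of a runnable example.

```python
class Eyewear:
    """Imaging device and IR sensing device (steps 302-304)."""
    def detect_and_generate(self):
        # A real implementation would sample the camera and IR sensor;
        # here a canned first signal carries the detected eye parameters.
        return {"blink_count": 2, "interval_s": 1.5}

class ProcessingUnit:
    """Receives, extracts, matches, and generates control signals (steps 306-312)."""
    def __init__(self, dataset):
        self.dataset = dataset  # pre-defined parameters -> limb movement

    def process(self, first_signals):
        # Extract a (blink count, within-interval?) key and match it.
        key = (first_signals["blink_count"], first_signals["interval_s"] <= 2.0)
        return self.dataset.get(key)  # matched control signal, or None

class Actuator:
    """Moves the associated exoskeleton limb (step 314)."""
    def move(self, control_signal):
        return f"moving {control_signal}" if control_signal else "holding"

# Two blinks within two seconds are associated with the right arm.
dataset = {(2, True): "right_arm"}
eyewear, unit, actuator = Eyewear(), ProcessingUnit(dataset), Actuator()
result = actuator.move(unit.process(eyewear.detect_and_generate()))
```

Each class corresponds to one block of FIG. 3's flow, so the sketch also shows where steps 302-314 would live in a concrete implementation.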
[0050] In an embodiment, the proposed exoskeleton can help a differently abled person move his/her skeleton using only his/her eyes, unlike conventional exoskeleton systems that use complex brain signals instead. An eye movement (chosen based on the user's ease) and the corresponding required action of a particular limb can be pre-stored in a database. To use a particular limb, the user can perform a specific eye movement, such as contracting the pupil. The eye movement can be captured by the imaging device mounted on an eyewear worn by the user. The imaging device can send a corresponding signal (also referred to as the set of first signals, herein) to the processing unit. The associated action of a particular limb (such as gripping a glass, sitting, or moving an arm) can be detected, and the exoskeleton can be signaled to perform the associated action. In this way, the proposed system can simplify the functioning of the exoskeleton.
[0051] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “includes” and “including” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C, …, and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practised with modification within the spirit and scope of the appended claims.
[0052] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE INVENTION
[0053] The proposed invention provides a system for controlling the exoskeleton using eye parameters.
[0054] The proposed invention provides a system for controlling the exoskeleton at reduced cost.
[0055] The proposed invention provides a system for controlling the exoskeleton at reduced complexity.
Claims:
1. A system for controlling an exoskeleton, the system comprising:
an eyewear operably coupled with the exoskeleton, the eyewear comprising:
an imaging device,
a sensing device operatively coupled with the imaging device, wherein the imaging device and the sensing device are configured to monitor one or more eye parameters of a user wearing the eyewear, and correspondingly generate a set of first signals, and
one or more processing unit, operatively coupled with the eyewear, comprising a processor associated with a memory, and the one or more processing unit configured to:
receive the set of first signals from the eyewear;
extract the one or more eye parameters from the received set of first signals, wherein the one or more processing unit is configured to match the extracted one or more eye parameters with pre-defined parameters stored in a dataset, and accordingly generate a set of control signals representing the movement of one or more limbs of the exoskeleton.
2. The system as claimed in claim 1, wherein the eye parameters comprises any or combination of eyelid blink rate, gaze time, and eyeball movement rate.
3. The system as claimed in claim 1, wherein the exoskeleton comprises one or more actuating devices operatively coupled with the one or more processing unit, and configured to move, based on the set of control signals, the one or more limbs of the exoskeleton.
4. The system as claimed in claim 3, wherein the one or more actuating device comprises any or combination of electromagnetic actuator, magnetic actuator, and pneumatic actuator.
5. The system as claimed in claim 1, wherein the imaging device comprises a camera.
6. The system as claimed in claim 1, wherein the sensing device comprises an infrared (IR) sensor.
7. The system as claimed in claim 1, wherein the system comprises a communication unit configured to communicatively couple the eyewear, one or more processing unit, and the exoskeleton.
8. The system as claimed in claim 7, wherein the communication unit comprises any or combination of micro USB connector, Bluetooth module, WiFi module, and GSM module.
9. A method of controlling an exoskeleton using the eyes, the method comprising:
detecting, by an imaging device, and a sensing device, one or more eye parameters of a user;
generating, by the imaging device and the sensing device, a set of first signals based on the detected one or more eye parameters;
receiving, by one or more processing unit operatively coupled with the imaging device and the sensing device, the set of first signals;
extracting, by the one or more processing unit, the one or more eye parameters from the set of first signals;
matching, by the one or more processing device, the extracted one or more parameters with pre-defined parameters stored in a dataset;
generating, by the one or more processing device, a set of control signals representing the movement of one or more limbs of the exoskeleton; and
moving, by one or more actuating devices operatively coupled with the one or more processing units, the one or more limbs of the exoskeleton based on the received set of control signals.