Abstract: Disclosed is a pupil-guided conveyance system for a user with quadriplegia, comprising a primary traction cylinder operationally connected with an initial coupling element, featuring a centreless indented rim enclosure with a contact layer. This setup is crucial for navigating a conveyance apparatus, which the user controls via a sophisticated central actuating unit. This unit utilizes a translatable rod to adjust the primary traction cylinder's position, allowing for precise movement control. Integral to the system's functionality is a central processing unit (CPU) that processes inputs from both a camera module and an ultrasonic sensor. This processing capability enables the system to offer real-time navigation and obstacle avoidance based on the user's eye movements. Additionally, the system is equipped with a display for visual feedback, a motor driver for movement control, and a power supply unit to ensure all components receive necessary power. Auditory feedback for various notifications is provided through a buzzer. One of the system's most innovative features is its ability to interpret the user's eye movements and blinks as commands, allowing for start and stop control of the conveyance apparatus. This feature, combined with differential drive steering and movement responsive to eye pupil movement, significantly enhances mobility for users with severe physical limitations. The system represents a significant advancement in assistive technology, offering users with quadriplegia a new level of independence in their daily lives.
Description
Field of the Invention
The present disclosure generally relates to mobility assistance technologies and particularly to a pupil-guided conveyance system for users with quadriplegia.
Background
The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
Quadriplegia, also known as Tetraplegia, is characterised by the paralysis of all four limbs and the torso, typically resulting from spinal cord injury or other neurological disorders. Said severe disability profoundly impacts the independence and quality of life of the affected individuals, as the partial or total loss of use and sensation in the limbs and torso is experienced. The causes of quadriplegia vary, encompassing birth defects, sports injuries, military training accidents, and other incidents that damage the brain or spinal cord.
For individuals living with quadriplegia, mobility becomes a significantly complex challenge, necessitating solutions to navigate their environment. Traditional wheelchairs, while important for mobility, often require manual operation or external assistance, which may not always be feasible for those with quadriplegia. The operational muscles that remain under voluntary control, notably the eye muscles, present an opportunity for technological research. The ability to move the eyes freely can be harnessed to control a wheelchair, thereby providing a method of interaction that leverages the intact function of eye movement.
However, the current technologies for enabling individuals with quadriplegia to navigate their environment exhibit several limitations. Traditional wheelchair control mechanisms that rely on manual operation or limited physical inputs are not suitable for individuals with severe mobility restrictions. Although several alternative control systems have been developed, including voice recognition and head movement detection, said systems often lack precision, are prone to errors, and may not function effectively in all environmental conditions.
Voice recognition systems, for example, can be affected by background noise, speech impairments, and the requirement for clear articulation, which might not be possible for all users. Head movement detection systems require a certain degree of neck control, which may not be present in individuals with high-level quadriplegia. Furthermore, such systems can be tiring to use over extended periods and may not provide the level of intuitive control that is desired.
The integration of eye-tracking technology into mobility solutions for individuals with quadriplegia has been explored as an alternative. Said technology capitalises on the voluntary control over eye movement that most individuals with quadriplegia retain. Despite this, existing eye-tracking systems have their own set of challenges. Calibration difficulties, the need for a direct line of sight to the tracking interface, and the inability to operate in bright sunlight or other challenging lighting conditions limit the practicality and effectiveness of said systems. Additionally, the cognitive load required to operate such interfaces can be significant, leading to fatigue and reducing the overall usability of the system for daily activities.
Prior art systems fail to offer precise, intuitive control mechanisms that do not rely heavily on physical capabilities or environmental conditions, and thus cannot meaningfully improve the quality of life of individuals living with quadriplegia by providing them with greater autonomy and the ability to navigate their environment more effectively. In light of the above discussion, there exists an urgent need for a pupil-guided conveyance system that overcomes the drawbacks associated with conventional systems and techniques for enhancing mobility and independence for individuals with quadriplegia.
Summary
The following presents a simplified summary of various aspects of this disclosure in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements nor delineate the scope of such aspects. Its purpose is to present some concepts of this disclosure in a simplified form as a prelude to the more detailed description that is presented later.
The following paragraphs provide additional support for the claims of the subject application.
The disclosure pertains to a pupil-guided conveyance system for a user with quadriplegia. Said conveyance system comprises a primary traction cylinder operationally connected with an initial coupling element, featuring a centreless indented rim enclosure with a contact layer. A principal actuator, arranged with the primary traction cylinder, is configured to confer rotational motion upon said primary traction cylinder.
Additionally, a central actuating unit, attached to a conveyance apparatus, includes a translatable rod that modifies the locus of the initial coupling element in relation to the conveyance apparatus. Said arrangement enables the primary traction cylinder to transition between a non-contact position and a contact position with the leading circle of the conveyance apparatus at an interaction juncture. A seating zone is situated aft of said interaction juncture, where the translatable rod is joined with the initial coupling element.
Further, the translatable rod comprises an actuation means configured to effectuate the elongation and withdrawal of said translatable rod. The adjustment in the spatial orientation of the initial coupling element facilitated by said translatable rod enables the positioning of the primary traction cylinder in both non-contact and contact positions relative to the leading circle of the conveyance apparatus.
Moreover, the system incorporates a central processing unit that processes image inputs from a camera module and data from an ultrasonic sensor to control the motors of the conveyance apparatus through a motor driver. Said central processing unit facilitates real-time navigation and obstacle avoidance based on the eye movement of the user. A display connected to the central processing unit provides visual feedback to the user, including system status, navigation paths, and error messages.
Additionally, a motor driver receiving control signals from the central processing unit supplies power to the motors for controlling the movements of the conveyance apparatus. A power supply unit provides electrical power to the central processing unit, the camera module, the display, and the motor driver. A buzzer, interfaced with the central processing unit, emits auditory feedback for obstacle proximity, system errors, or notifications to the user.
Motors configured to control wheels of the conveyance apparatus enable differential drive steering and a range of movements responsive to the eye pupil movement detected by the camera module. The central processing unit is further configured to issue start and stop commands to the conveyance apparatus through an eye blinking logic based on the analysis of captured real-time images to detect and track eye pupil movement of the user.
Brief Description of the Drawings
The features and advantages of the present disclosure would be more clearly understood from the following description taken in conjunction with the accompanying drawings in which:
FIG. 1 illustrates a pupil-guided conveyance system for a user with quadriplegia, in accordance with the embodiments of the present disclosure.
FIG. 2 illustrates a block diagram of functional components operatively coupled with the pupil-guided conveyance system, in accordance with the embodiments of the present disclosure.
FIG. 3 illustrates a flowchart of the processes involved in the pupil-guided conveyance system, in accordance with the embodiments of the present disclosure.
Detailed Description
In the following detailed description of the invention, reference is made to the accompanying drawings that form a part hereof, and in which is shown, by way of illustration, specific embodiments in which the invention may be practiced. In the drawings, like numerals describe substantially similar components throughout the several views. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural, logical, and electrical changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims and equivalents thereof.
The use of the terms “a” and “an” and “the” and “at least one” and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The use of the term “at least one” followed by a list of one or more items (for example, “at least one of A and B”) is to be construed to mean one item selected from the listed items (A or B) or any combination of two or more of the listed items (A and B), unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including,” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
Pursuant to the "Detailed Description" section herein, whenever an element is explicitly associated with a specific numeral for the first time, such association shall be deemed consistent and applicable throughout the entirety of the "Detailed Description" section, unless otherwise expressly stated or contradicted by the context.
The present disclosure unveils a pupil-guided conveyance system 100 for a user with quadriplegia. FIG. 1 provides a pictorial illustration showcasing an architectural paradigm of the system 100, which can comprise functional elements including, yet not limited to, a primary traction cylinder 102, an initial coupling element 104, an indented rim enclosure 106, a principal actuator 108, a central actuating unit 110, a translatable rod 110a, a conveyance apparatus 112, and a seating zone 114. A person ordinarily skilled in the art would appreciate that said elements or components of the system 100 are functionally or operationally coupled to/with each other, in accordance with the embodiments of the present disclosure.
In an embodiment, the primary traction cylinder 102, as referenced in the system 100, relates to a component arranged for imparting movement to the conveyance system 100. Said primary traction cylinder 102, operationally connected with an initial coupling element 104, incorporates a centreless indented rim enclosure 106 equipped with a contact layer. The structure of the primary traction cylinder 102, specifically the centreless indented rim enclosure 106, contributes significantly to the reduction of friction during operation. Said reduction in friction enables smoother movement of the conveyance system 100, thereby enhancing the control over the system 100 by the user. Additionally, the contact layer provides a durable interface with the system 100, maintaining longevity and reliability in the operation.
In an embodiment, the principal actuator 108, arranged with the primary traction cylinder 102, plays a pivotal role in the system 100 by conferring rotational motion upon the primary traction cylinder 102. The ability of the principal actuator 108 to precisely control the rotational motion of the primary traction cylinder 102 directly impacts the responsiveness of the conveyance system 100 to user input. Said precision in control allows a user with quadriplegia to navigate the conveyance system 100 efficiently, offering an unprecedented level of autonomy in mobility. The technological aspects embodied in the principal actuator 108 underscore its importance in facilitating seamless interaction between the user and the conveyance system 100.
In an embodiment, the central actuating unit 110, which is an attachment to a conveyance apparatus 112, includes a translatable rod 110a that modifies the locus of the initial coupling element 104 in relation to the conveyance apparatus 112 (may relate to, yet not limited to, an integral combination of functional elements such as wheels, sprocket, axle, shaft, chains, a control handle, and the like). The central actuating unit 110, through the action of the translatable rod 110a, enables the primary traction cylinder 102 to alternate between a non-contact position and a contact position of engagement with the leading circle of the conveyance apparatus 112 at an interaction juncture. Said mechanism of engagement and disengagement with the conveyance apparatus 112 allows for precise control over the movement of the system 100, enhancing manoeuvrability and safety for the user. The technical effect of the structure of said central actuating unit 110 is important for achieving the desired level of precision in movement control, directly contributing to the efficacy of the system 100 in providing mobility assistance to a user with quadriplegia.
In an embodiment, the seating zone 114, situated aft of the interaction juncture of the conveyance apparatus 112, provides a secure and comfortable position for the user within the system 100. The placement of the seating zone 114 maintains the stability of the user during the operation of the conveyance system 100 and also optimizes the user's ability to interact with the pupil-guidance interface. Said optimization is important for enabling a user with quadriplegia to effectively control the conveyance system 100 using minimal and intuitive inputs. The comfort and support offered by the seating zone 114 play a significant role in enhancing the overall user experience, making the conveyance system 100 a practical solution for improving mobility.
Referring to one or more preceding embodiments, each component of the pupil-guided conveyance system 100 is meticulously arranged to work in harmony, providing a user with quadriplegia a level of independence and mobility. The emphasis of the system 100 on reducing friction, enhancing control precision, and maintaining user comfort and stability demonstrates an approach to structuring mobility aids. The conveyance system 100 not only represents a technological advancement in the field of assistive devices but also embodies a commitment to improving the quality of life for individuals with mobility impairments.
In an embodiment, the translatable rod 110a, as referenced in the system 100, further comprises an actuation means. Said actuation means is configured to effectuate the elongation and withdrawal of the translatable rod 110a. The incorporation of the actuation means into the translatable rod 110a enhances the ability of the system 100 to adjust the distance and orientation between the initial coupling element 104 and the conveyance apparatus 112 dynamically. Said dynamic adjustment capability is significant for facilitating precise control over the positioning and movement of the system 100, so that the conveyance system 100 can adapt to varying operational requirements. The technical effect of such an actuation means includes improved manoeuvrability and responsiveness of the conveyance system 100, directly contributing to a safer and more efficient navigation experience for a user with quadriplegia.
In an embodiment, the translatable rod 110a also adjusts the spatial orientation of the initial coupling element 104 with respect to the conveyance apparatus 112. Said adjustment of the spatial orientation facilitates the positioning of the primary traction cylinder 102 in the non-contact position and the contact position relative to the leading circle of the conveyance apparatus 112. Said capability to adjust the spatial orientation and positioning of the primary traction cylinder 102 enables the system 100 to engage and disengage with the conveyance apparatus 112 as needed, allowing for seamless transitions between movement and stationary states. The precise control over the engagement mechanism enhances the overall efficiency and safety of the conveyance system 100, providing the user with a reliable mode of mobility.
In an embodiment, the system 100 further comprises a central processing unit. Said central processing unit is tasked with processing image inputs from a camera module and data from an ultrasonic sensor to control the motors of the conveyance apparatus 112 through a motor driver. Said configuration allows for real-time navigation and obstacle avoidance based on the eye movement of the user. The ability of the central processing unit to integrate and analyse data from multiple sources in real time is important for ensuring that the conveyance system 100 can navigate complex environments safely and efficiently. The resultant technical effect includes enhanced situational awareness and adaptive response capabilities, significantly improving the autonomy and mobility of the user.
In an embodiment, a display connected to the central processing unit is included within the system 100. Said display is configured to provide visual feedback to the user, including system status, navigation paths, and error messages. The availability of real-time visual feedback on the display allows the user to monitor the performance of said system 100 and navigate more effectively. Said feedback feature is instrumental in enhancing the user's confidence in, and control over, the conveyance system 100, facilitating a more intuitive and interactive mobility solution.
In an embodiment, the system 100 further comprises a motor driver receiving control signals from the central processing unit. Said motor driver is configured to supply power to the motors for controlling the movements of the conveyance apparatus 112. The ability of the motor driver to translate control signals into precise motor actions is pivotal for the accurate execution of user commands, directly affecting the manoeuvrability and responsiveness of the system 100. Said arrangement ensures that the conveyance system 100 can perform a wide range of movements smoothly and reliably, based on the intentions of the user as communicated through eye movements.
In an embodiment, a power supply unit is included to provide electrical power to the central processing unit, the camera module, the display, and the motor driver. The integration of a dedicated power supply unit ensures that all important components of the conveyance system 100 receive a stable and sufficient power supply, which is significant for maintaining continuous and reliable operation. The provision of electrical power to said components enables the system 100 to function effectively, even in extended usage scenarios, thereby enhancing the usability and dependability of the system 100 for the user with quadriplegia.
In an embodiment, the system 100 further comprises a buzzer interfaced with the central processing unit. Said buzzer is configured to emit auditory feedback for obstacle proximity, system errors, or notifications to the user. The inclusion of auditory feedback mechanisms serves as a significant complement to visual feedback, providing an additional layer of user interaction and alertness. Said auditory feedback is particularly beneficial in situations where visual attention may be divided, ensuring that the user remains informed of the states and environmental factors of said system 100, thereby enhancing safety and user experience.
In an embodiment, the motors configured to control wheels of the conveyance apparatus 112 enable differential drive steering and a range of movements responsive to the eye pupil movement detected by the camera module. The configuration of said motors facilitates precise and adaptable movement control, allowing the conveyance system 100 to execute complex manoeuvres based on nuanced user inputs. Said manoeuvring capability significantly extends the mobility options available to a user with quadriplegia, offering an unprecedented level of control and freedom in navigation.
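By way of non-limiting illustration only, the mapping of a detected gaze direction onto differential-drive motor commands may be sketched as follows; the function name and speed values are hypothetical, and the glance-to-motor convention follows the description given later in this disclosure:

```python
def differential_drive_command(gaze_direction, base_speed=1.0):
    """Map a detected gaze direction to (left_motor, right_motor) speeds.

    Follows the convention described in this disclosure: a leftward glance
    drives the left-side motors, a rightward glance drives the right-side
    motors, and a centred gaze drives both motors forward simultaneously.
    Speed values are illustrative only.
    """
    if gaze_direction == "center":
        return (base_speed, base_speed)   # both motors: move forward
    if gaze_direction == "left":
        return (base_speed, 0.0)          # left-side motors only
    if gaze_direction == "right":
        return (0.0, base_speed)          # right-side motors only
    return (0.0, 0.0)                     # unrecognised input: halt
```

Such a mapping would typically be the final stage before the motor driver, which converts the abstract speed values into drive currents for the wheels.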
In an embodiment, the central processing unit is further configured to issue start and stop commands to the conveyance apparatus 112 through an eye blinking logic based on the analysis of captured real-time images to detect and track eye pupil movement of the user. Said issuance of start and stop commands allows for a highly intuitive control mechanism, enabling the user to easily initiate or halt the movements of the conveyance system 100 through simple eye blinks. The implementation of eye blinking logic for command execution underscores the commitment of said system 100 to providing a user-friendly, accessible mobility solution for individuals with quadriplegia, enhancing the user's autonomy and interaction with the environment.
Referring to the preceding embodiment, the system 100 can relate to, yet is not restricted to, a "Pupil-Tracking Wheelchair Interface for Quadriplegics," or Eye Monitored Wheelchair, which represents a significant advancement in assistive technology, structured to provide quadriplegic individuals with a greater level of independence and mobility. Said system 100 utilizes a combination of eye-tracking image processing, real-time obstacle detection, and control mechanisms powered by Raspberry Pi technology to enable users to navigate their environment autonomously.
FIG. 2 illustrates a block diagram of functional components operatively coupled with the conveyance system 100. Central to said system 100 is the central processing unit (may relate to a Raspberry Pi 3 Model B), serving as the controller of the operation. The central processing unit processes inputs from the camera (can be a part of the camera module) and the ultrasonic sensor to direct the movement of the wheelchair (may relate to the conveyance apparatus 112) via motor drivers. The camera plays a crucial role, capturing the user's eye movements to determine navigation commands, while a connected display offers visual feedback about the status of the system 100 and the navigation paths. A power supply ensures that said components receive the necessary electrical power, and the motor driver acts as a bridge, translating the commands from the Raspberry Pi into action by powering the wheelchair motors. Said motors, capable of differential drive steering, allow for a broad range of movements, further enhanced by an ultrasonic sensor that aids in navigation by detecting obstacles.
Moreover, auditory feedback is provided through a buzzer for alerts related to obstacle proximity, system errors, or other notifications, ensuring that the user is always aware of the system status and the surrounding environment.
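A simplified, non-limiting sketch of one iteration of such a control loop is given below; the clearance threshold and the dictionary-based command format are hypothetical illustrations rather than claimed features:

```python
def control_step(gaze_direction, obstacle_distance_cm, min_clearance_cm=30.0):
    """One iteration of an eye-guided control loop.

    If the ultrasonic reading falls below a minimum clearance, the motors
    are halted and the buzzer is sounded; otherwise the detected gaze
    direction is forwarded as the motor command. The 30 cm threshold is
    an illustrative value only.
    """
    if obstacle_distance_cm < min_clearance_cm:
        return {"motor_command": "stop", "buzzer": True}
    return {"motor_command": gaze_direction, "buzzer": False}
```

In a deployed system, this step would run repeatedly, with the gaze direction supplied by the camera pipeline and the distance supplied by the ultrasonic sensor.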
Referring to the preceding embodiment, the operation of said system 100 is underpinned by computer vision technology, employing a series of stages including face and eye detection, colour conversion, edge detection, and Hough transform to accurately track the eye pupil's movement. Initially, the system 100 captures images of the face of the user, using algorithms like Haar Cascade to identify the face and eyes. Subsequent image processing operations aim to pinpoint the eye pupil and its centre, an important step for interpreting the intended direction of movement.
FIG. 3 illustrates a flowchart of the processes involved in the conveyance system 100. Said process begins with the camera module capturing the facial features of the user. Said data is then processed through a series of algorithms to detect and track the movement of the eyes of the user.
Referring to the preceding embodiment, the Haar cascade algorithm is applied to the input from the camera module for face and eye detection. Said algorithm is particularly effective at identifying facial features within images due to its ability to quickly process visual data.
Following the initial detection, Canny Edge Detection is employed for edge detection. Said algorithm detects a wide range of edges in images, which is significant for accurately outlining the eyes. After detecting the edges, the Hough Circle Transform is applied to delineate the circular shapes of the pupils. Said delineation is an important step in identifying and localizing the pupils within the eye regions.
Referring to the preceding embodiment, the Hough Transform, a separate step from the Hough Circle Transform, is used for feature detection. The Hough Transform extends the edge detection by identifying the general shapes and positions of various features within the eye region. Finally, the system 100 localizes the centre of the eye pupil, which is an important step for accurate eye tracking. The pupil centre localization is vital for determining the direction of the gaze of the user.
Together, said processes of face and eye detection, edge detection, feature detection, and eye pupil centre localization converge to accurately track eye movement. The resulting eye tracking data can be used for various applications, including the pupil-guided conveyance system 100 described previously, which relies on eye movement for navigation control.
Referring to the preceding embodiment, the system 100 operates by isolating the eye region from an image to detect the presence of a circular pattern, indicative of the eye pupil. Upon successful identification of the eyeball, a corner detection method is applied within the isolated eye region to ascertain the extremities of the eye. The average of the identified corner points is computed to establish the centre of the eye. The system 100 then measures the distance between the centre of the eye and the centre of the pupil, with varying distances correlating to different eye movements. A minimal distance indicates that the eye has moved left, while a maximal distance suggests a rightward movement. If the distance remains unchanged, the system interprets that the eye is centred.
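A non-limiting sketch of said distance-based interpretation is given below; the calibrated rest distance and the tolerance band standing in for the "unchanged" condition are hypothetical values:

```python
def classify_gaze(distance, rest_distance, tolerance=3.0):
    """Interpret the eye-centre-to-pupil-centre distance as a gaze direction.

    Per the disclosure: a minimal distance (below the calibrated rest
    value) is read as a leftward movement, a maximal distance as a
    rightward movement, and an effectively unchanged distance as a
    centred eye. The tolerance band is illustrative.
    """
    if distance < rest_distance - tolerance:
        return "left"
    if distance > rest_distance + tolerance:
        return "right"
    return "center"
```

The rest distance would in practice be established during an initial calibration phase while the user looks straight ahead.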
Referring to the preceding embodiment, the wheelchair mobility is controlled through said detection of eye movement. A leftward glance activates the wheelchair motors (may relate to one or more motors located on a left side of said conveyance apparatus 112), and a rightward glance engages the motors (located on a right side of said conveyance apparatus 112). When the eye is centred, said motors operate simultaneously to drive the wheelchair forward. Additionally, the system 100 employs an eye blinking logic to manage the start and stop functions of the wheelchair. The system halts completely when the eye is closed for three seconds and reactivates upon another three-second closure.
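The three-second eye-blinking logic may be illustrated, in a non-limiting manner, as a small state machine; the class name, the initially running state, and the one-toggle-per-closure latch are hypothetical implementation choices:

```python
class BlinkToggle:
    """Toggle between running and halted when the eye stays closed
    for `hold_seconds` (three seconds per the disclosure)."""

    def __init__(self, hold_seconds=3.0):
        self.hold_seconds = hold_seconds
        self.running = True          # assumed initial state
        self._closed_since = None    # timestamp when the eye first closed
        self._latched = False        # ensures one toggle per closure

    def update(self, eye_open, timestamp):
        """Feed one eye-state sample; return whether the system is running."""
        if eye_open:
            self._closed_since = None
            self._latched = False
        else:
            if self._closed_since is None:
                self._closed_since = timestamp
            elif (not self._latched and
                  timestamp - self._closed_since >= self.hold_seconds):
                self.running = not self.running
                self._latched = True
        return self.running
```

Each camera frame would supply one open/closed sample to `update`, so blinks shorter than the hold time leave the running state unchanged.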
Referring to the preceding embodiment, the image capturing and processing are handled by a high-pixel-rate webcam and processed on a Raspbian system. In a state of non-operation, the system 100 assumes the eye is open. The system 100 is activated once power is supplied and responds to command values accordingly.
Referring to the preceding embodiment, the system 100 enhances mobility for individuals with quadriplegia by enabling them to steer a wheelchair with eye movements. The system 100 promotes independence by reducing the need for caregiver assistance. Ultrasonic sensors are integrated to facilitate safe navigation. The interface of the system 100 is structured to be user-friendly for ease of use. The system 100 quickly translates eye movements into wheelchair movement commands, ensuring timely and effective navigation. By enabling independent movement, said system 100 improves social inclusion and interaction for the user. The customizable features may also be adapted to meet individual user needs, offering optimal control and comfort.
Based on said analysis, the system 100 generates commands to control the wheelchair motors, adapting in real time to the user's eye movements. Safety is a key concern, with ultrasonic sensors actively detecting any obstacles to prevent collisions. Continuous monitoring of eye movements ensures that the direction of the wheelchair (may relate to, yet not limited to, said conveyance apparatus 112) can be adjusted as needed, while the user receives feedback through both visual and auditory means.
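Assuming, by way of non-limiting example, a time-of-flight ultrasonic sensor of the HC-SR04 type (a sensor model not specified in this disclosure), the measured echo round-trip time converts to an obstacle distance as follows:

```python
def echo_to_distance_cm(echo_seconds, speed_of_sound_cm_per_s=34300.0):
    """Convert an ultrasonic echo round-trip time to a one-way distance.

    The pulse travels to the obstacle and back, so the one-way distance
    is half the round-trip path. 34300 cm/s approximates the speed of
    sound in air at about 20 degrees Celsius.
    """
    return echo_seconds * speed_of_sound_cm_per_s / 2.0
```

For example, a 2 ms echo corresponds to an obstacle roughly 34 cm away, a distance the central processing unit can compare against a safety threshold before issuing motor commands.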
Referring to the preceding embodiment, the eye-monitored conveyance system 100 is structured to be intuitive, allowing for a quick and efficient translation of the intentions of the user into mobility. Said conveyance system 100 not only enhances the mobility of quadriplegic individuals but also significantly contributes to their independence, safety, and ability to participate more fully in social activities. Customizability ensures that the system 100 can be adapted to meet the unique needs and preferences of individual users, further enhancing its utility and impact.
Example embodiments herein have been described above with reference to block diagrams and flowchart illustrations of methods and apparatuses. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by various means including hardware, software, firmware, and a combination thereof. For example, in one embodiment, each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations can be implemented by computer program instructions. These computer program instructions may be loaded onto a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
Throughout the present disclosure, the term ‘processing means’ or ‘microprocessor’ or ‘processor’ or ‘processors’ includes, but is not limited to, a general purpose processor (such as, for example, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a microprocessor implementing other types of instruction sets, or a microprocessor implementing a combination of types of instruction sets) or a specialized processor (such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), or a network processor).
The term “non-transitory storage device” or “storage” or “memory,” as used herein relates to a random-access memory, read only memory and variants thereof, in which a computer can store data or software for any duration.
Operations in accordance with a variety of aspects of the disclosure, as described above, need not be performed in the precise order described. Rather, various steps can be handled in reverse order, simultaneously, or not at all.
While several implementations have been described and illustrated herein, a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein may be utilized, and each of such variations and/or modifications is deemed to be within the scope of the implementations described herein. More generally, all parameters, dimensions, materials, and configurations described herein are meant to be exemplary, and the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific implementations described herein. It is, therefore, to be understood that the foregoing implementations are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, implementations may be practiced otherwise than as specifically described and claimed. Implementations of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
Claims
I/We claim:
A pupil-guided conveyance system 100 for a user with quadriplegia, the system 100 comprising:
a primary traction cylinder 102 operationally connected with an initial coupling element 104, wherein the primary traction cylinder 102 incorporates a centreless indented rim enclosure 106 with a contact layer;
a principal actuator 108 arranged with the primary traction cylinder 102, wherein the principal actuator 108 is configured to confer rotational motion upon the primary traction cylinder 102;
a central actuating unit 110 attached to a conveyance apparatus 112, wherein the central actuating unit 110 comprises:
a translatable rod 110a that modifies the locus of the initial coupling element 104 in relation to the conveyance apparatus 112, wherein the primary traction cylinder 102 is arranged to:
be in a non-contact position with a leading circle of the conveyance apparatus 112;
be in a contact position of engagement with the leading circle of the conveyance apparatus 112 at an interaction juncture; and
a seating zone 114 situated aft of the interaction juncture of the conveyance apparatus 112, wherein the translatable rod 110a is joined with the initial coupling element 104.
The pupil-guided conveyance system 100 as claimed in claim 1, wherein the translatable rod 110a further comprises an actuation means configured to effectuate the elongation and withdrawal of said translatable rod 110a.
The pupil-guided conveyance system 100 as claimed in claim 1, wherein the translatable rod 110a adjusts the spatial orientation of the initial coupling element 104 with respect to the conveyance apparatus 112, wherein said adjustment of the spatial orientation facilitates the positioning of the primary traction cylinder 102 in the non-contact position and the contact position relative to the leading circle of the conveyance apparatus 112.
The pupil-guided conveyance system 100 as claimed in claim 1, further comprises a central processing unit to:
process image inputs from a camera module; and
process data from an ultrasonic sensor to control the motors of said conveyance apparatus 112 through a motor driver, wherein said central processing unit facilitates real-time navigation and obstacle avoidance based on the eye movement of said user.
The pupil-guided conveyance system 100 as claimed in claim 1, further comprises a display connected to the central processing unit, wherein said display is configured to provide visual feedback to the user including system status, navigation paths, and error messages.
The pupil-guided conveyance system 100 as claimed in claim 1, further comprises said motor driver receiving control signals from the central processing unit, wherein said motor driver is configured to supply power to the motors for controlling the movements of said conveyance apparatus 112.
The pupil-guided conveyance system 100 as claimed in claim 1, further comprises a power supply unit to provide electrical power to the central processing unit, the camera module, the display, and the motor driver.
The pupil-guided conveyance system 100 as claimed in claim 1, further comprises a buzzer interfaced with the central processing unit, wherein said buzzer is configured to emit auditory feedback for obstacle proximity, system errors, or notifications to the user.
The pupil-guided conveyance system 100 as claimed in claim 1, wherein said motors are configured to control wheels of the conveyance apparatus 112, providing differential drive steering and a range of movements responsive to the eye pupil movement detected by the camera module.
The pupil-guided conveyance system 100 as claimed in claim 1, wherein the central processing unit is further configured to issue start and stop commands to the conveyance apparatus 112 through eye-blinking logic based on analysis of the captured real-time images to detect and track the eye pupil movement of the user.
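The eye-blinking start/stop behaviour recited above can be illustrated with a simple state machine. This is an assumed sketch, not the claimed implementation: the frame rate (30 fps) and the long-blink threshold (~0.5 s) are placeholder values, and a per-frame eye-open flag is assumed to come from the camera-image analysis.

```python
# Illustrative sketch only: a deliberate long blink toggles the drive
# between started and stopped, while short natural blinks are ignored.
# LONG_BLINK_FRAMES assumes ~0.5 s at an assumed 30 fps camera.

LONG_BLINK_FRAMES = 15

class BlinkToggle:
    def __init__(self):
        self.driving = False        # current drive state
        self._closed_frames = 0     # consecutive eye-closed frames

    def update(self, eye_open):
        """Feed one frame's eye state; return the current drive state."""
        if not eye_open:
            self._closed_frames += 1
        else:
            # Eye reopened: toggle only if the blink lasted long enough.
            if self._closed_frames >= LONG_BLINK_FRAMES:
                self.driving = not self.driving
            self._closed_frames = 0
        return self.driving
```

Requiring a sustained closure before toggling is one way to distinguish an intentional command blink from involuntary blinking.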
PUPIL-TRACKING WHEELCHAIR INTERFACE FOR QUADRIPLEGICS
| # | Name | Date |
|---|---|---|
| 1 | 202421033110-OTHERS [26-04-2024(online)].pdf | 2024-04-26 |
| 2 | 202421033110-FORM FOR SMALL ENTITY(FORM-28) [26-04-2024(online)].pdf | 2024-04-26 |
| 3 | 202421033110-FORM 1 [26-04-2024(online)].pdf | 2024-04-26 |
| 4 | 202421033110-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [26-04-2024(online)].pdf | 2024-04-26 |
| 5 | 202421033110-EDUCATIONAL INSTITUTION(S) [26-04-2024(online)].pdf | 2024-04-26 |
| 6 | 202421033110-DRAWINGS [26-04-2024(online)].pdf | 2024-04-26 |
| 7 | 202421033110-DECLARATION OF INVENTORSHIP (FORM 5) [26-04-2024(online)].pdf | 2024-04-26 |
| 8 | 202421033110-COMPLETE SPECIFICATION [26-04-2024(online)].pdf | 2024-04-26 |
| 9 | 202421033110-FORM-9 [07-05-2024(online)].pdf | 2024-05-07 |
| 10 | 202421033110-FORM 18 [08-05-2024(online)].pdf | 2024-05-08 |
| 11 | 202421033110-FORM-26 [12-05-2024(online)].pdf | 2024-05-12 |
| 12 | 202421033110-FORM 3 [13-06-2024(online)].pdf | 2024-06-13 |
| 13 | 202421033110-RELEVANT DOCUMENTS [09-10-2024(online)].pdf | 2024-10-09 |
| 14 | 202421033110-POA [09-10-2024(online)].pdf | 2024-10-09 |
| 15 | 202421033110-FORM 13 [09-10-2024(online)].pdf | 2024-10-09 |