
Wheelchair Mobility System For Handicapped Humans

Abstract: The present disclosure discloses an automatically controllable smart wheel chair device comprising a frame forming a body of the device. It includes a voice capturing unit configured to capture a set of acoustic signals generated by a user; an image capturing device configured to capture a set of images of eyes of the user; and a hand gesturing capturing glove configured to monitor hand gestures of the user. It also includes a processing unit configured to generate a set of digital signatures, and determine any or a combination of a voice command of the user, an eye movement of the user, and a hand gesture of the user. Further, it generates a set of control signals that correspond to one or more driving instructions associated with the wheel chair device. Further, it also includes a plurality of electrical motors configured with wheels of the device to correspondingly operate the device.


Patent Information

Application #
Filing Date
19 November 2021
Publication Number
21/2023
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
info@khuranaandkhurana.com
Parent Application

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. LILHORE, Umesh Kumar
Associate Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
2. SIMAIYA, Sarita
Assistant Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
3. AHUJA, Sachin
Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
4. KHURANA, Meenu
Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
5. KAUR, Amandeep
Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
6. SANDHU, Jasminder
Assistant Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
7. MANHAR, Advin
Assistant Professor, Amity University Manth (Kharora), State Highway 9, Raipur - Baloda Bazar Rd, Raipur, Chhattisgarh - 493225, India.
8. AHMED, Mohammed Bakhtawar
Assistant Professor, Amity University Manth (Kharora), State Highway 9, Raipur - Baloda Bazar Rd, Raipur, Chhattisgarh - 493225, India.
9. HARNAL, Shilpi
Assistant Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
10. SNEHI, Jyoti
Assistant Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.

Specification

The present disclosure relates to the field of artificial intelligence. In
particular, the present disclosure provides a wheelchair mobility system for handicapped humans.
BACKGROUND
[0002] According to the World Health Organization (WHO) disability report, a
few outstanding figures of disability around the world are:
a. 466 million people have disabling deafness and hearing loss. This
represents 6% of the world's population.
b. About 200 million people have an intellectual disability (IQ below 75). This
represents 2.6% of the world's population.
[0003] Life of a disabled person is not easy. Every step they take may be met
with a number of challenges. This is significantly worse for people who are
physically disabled and living alone. As a result, they cannot move around
freely. Furthermore, public spaces are rarely designed for people with disabilities.
There are no ramps, and hallways are too narrow for them to pass through.
[0004] Inventions that have been made in the area of intelligent wheelchair
mobility for the disabled are not useful for people suffering from disabling deafness and hearing loss, or for people having a low IQ.
[0005] There is, therefore, a need to provide an efficient, optimum, and
cost-effective device that can obviate the above-mentioned limitations.
OBJECTS OF THE PRESENT DISCLOSURE
[0006] A general object of the present disclosure is to obviate the above-
mentioned problems and aid disabled people.
[0007] Another object of the present disclosure is to provide an intelligent
wheelchair mobility device for the disabled.
[0008] Another object of the present disclosure is to provide an intelligent
wheelchair mobility device for the disabled that can be operated via voice command.

[0009] Another object of the present disclosure is to provide an intelligent
wheelchair mobility device for the disabled that can be operated via eye movement.
[0010] Another object of the present disclosure is to provide an intelligent
wheelchair mobility device for the disabled that can be operated via hand gesture.
[0011] Another object of the present disclosure is to provide an intelligent
wheelchair mobility device for the disabled that can autonomously detect obstacles.
[0012] Another object of the present disclosure is to provide an intelligent
wheelchair mobility device for the disabled that can autonomously detect obstacles and stop the wheelchair.
[0013] Another object of the present disclosure is to provide an intelligent
wheelchair mobility device that can monitor health parameters of the user.
SUMMARY
[0014] Various aspects of the present disclosure relate to the field of
artificial intelligence. In particular, the present disclosure provides a wheelchair mobility system for handicapped humans.
[0015] According to an aspect, the present disclosure pertains to an
automatically controllable smart wheel chair device, the device comprising: a frame forming a body of the device; a voice capturing unit mounted on the frame, the voice capturing unit configured to capture a set of acoustic signals generated by a user, and correspondingly generate a first set of signals; an image capturing device mounted on the frame, the image capturing device configured to capture a set of images of eyes of the user, and correspondingly generate a second set of signals; a hand gesturing capturing glove comprising a set of sensors, the glove adapted to be worn by the user, and configured to monitor hand gestures of the user and correspondingly generate a third set of signals; a processing unit operatively coupled to the voice capturing unit, the image capturing unit, and the hand gesturing capturing glove, the processing unit comprising a processor operatively coupled to a memory storing instructions executable by the processor, and configured to: receive the first set of signals, the second set of signals, and the third set of signals,

and correspondingly generate a set of digital signatures; compare the generated
digital signatures with a set of predetermined digital signatures stored in a database,
to determine any or a combination of a voice command of the user, an eye
movement of the user, and hand gesture of the user; and generate a set of control
signals based on any of the determined voice command, the eye movement, and the
hand gesture, the set of control signals corresponds to one or more driving
instructions associated with wheel chair device; and a plurality of electrical motors
configured with at least one wheel of the device, and operatively coupled to the
processing unit, the plurality of electric motors configured to receive the generated
set of control signals, and correspondingly operate the device based on the one or
more driving instructions.
[0016] In an aspect, the processing unit comprises a Natural Language
Processing unit, and an image processing unit configured to process the first set of
signals, and the second set of signals, respectively.
[0017] In an aspect, the device comprises a communication unit operatively
coupled to the processing unit and configured to: communicate the device with a
cloud storage; and communicate the device with one or more mobile computing
devices associated with the user.
[0018] In an aspect, the device comprises one or more health sensors
operatively coupled to the processing unit, and configured to detect one or more
health parameters comprising temperature, heartbeat, and oxygen saturation levels
of the user.
[0019] In an aspect, the one or more health sensors comprise a temperature
sensor, a pulse sensor, and an oxygen saturation sensor.
[0020] In an aspect, the one or more driving instructions comprise Move
Forward, Move Back, Stop, Turn Right, and Turn Left.
[0021] In another aspect, the device comprises a plurality of IR sensors
mounted on the wheel chair, and configured to detect one or more obstacles in a
path of the device, and correspondingly transmit a fourth set of signals, the
processing unit configured to generate a set of alert signals upon detection of at
least one obstacle in the path of the device, and stop the device.

[0022] In an aspect, the device comprises a joy stick mounted on the frame,
and operatively coupled to the processing unit to allow a user to operate the device.
[0023] In an aspect, the image capturing unit may be selected from a
camera, and an AI enabled camera, and wherein the voice capturing unit comprises a microphone.
[0024] Various objects, features, aspects and advantages of the inventive
subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The accompanying drawings are included to provide a further
understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0026] FIG. 1 illustrates an exemplary block diagram of the proposed
system in order to explain its working, in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION
[0027] The following is a detailed description of embodiments of the
disclosure depicted in the accompanying drawings. The embodiments are in such details as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosures as defined by the appended claims.
[0028] Embodiments explained herein relate to the field of artificial
intelligence. In particular, the present disclosure provides a wheelchair mobility system for handicapped humans.

[0029] FIG. 1 illustrates an exemplary block diagram of the proposed
system in order to explain its working, in accordance with an exemplary embodiment of the present disclosure.
[0030] According to an embodiment, an automatically controllable smart
wheel chair device 100 that can include a frame forming a body of the device 100.
In an embodiment, the device 100 can include a voice capturing unit mounted on
the frame, the voice capturing unit configured to capture a set of acoustic signals
generated by a user, and correspondingly generate a first set of signals.
[0031] In another embodiment, the device 100 can include a hand gesturing
capturing glove adapted to be worn by the user. In an embodiment, the glove can include a set of sensors, which can be configured to monitor hand gestures of the user, and correspondingly generate a third set of signals.
[0032] In an embodiment, the device 100 can include an image capturing
device mounted on the frame, where the image capturing device can be configured to capture a set of images of eyes of the user, and correspondingly generate a second set of signals. In another embodiment, the image capturing unit can be selected from a camera, and an AI-enabled camera, and the voice capturing unit can include a microphone.
[0033] In one embodiment, the device 100 can include a processing unit,
which can be operatively coupled to the voice capturing unit, the image capturing unit, and the hand gesturing capturing glove, and configured to receive the first set of signals, the second set of signals, and the third set of signals from the voice capturing unit, the image capturing unit, and the hand gesturing capturing glove, respectively. In an exemplary embodiment, the processing unit can include a processor operatively coupled to a memory 108 storing instructions executable by the processor. In another exemplary embodiment, a microcontroller 106 can be used as the processing unit.
[0034] In one embodiment, the processing unit can generate a set of digital
signatures taking into consideration the first set of signals, the second set of signals, and the third set of signals. In another embodiment, the processing unit can compare the generated digital signatures with a set of predetermined digital signatures stored

in a database, to determine any or a combination of a voice command 102 of the
user, an eye movement of the user, and hand gesture of the user.
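The comparison step in [0034] can be sketched as a nearest-match lookup against the stored signatures. The sketch below is illustrative only: it assumes signatures are simple feature vectors compared by cosine similarity against a threshold, none of which is disclosed in the specification.

```python
# Hypothetical sketch of the signature-matching step in [0034].
# Signatures are modeled as feature vectors; all names are illustrative.

def match_signature(generated, stored_signatures, threshold=0.9):
    """Return the command whose stored signature is most similar to the
    generated one, or None if no similarity reaches the threshold."""
    def similarity(a, b):
        # Cosine similarity between two equal-length feature vectors.
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = sum(x * x for x in a) ** 0.5
        norm_b = sum(x * x for x in b) ** 0.5
        return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

    best_command, best_score = None, 0.0
    for command, signature in stored_signatures.items():
        score = similarity(generated, signature)
        if score > best_score:
            best_command, best_score = command, score
    return best_command if best_score >= threshold else None
```

A signature below the threshold yields no command, so ambiguous input produces no movement rather than a wrong one.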
[0035] In another embodiment, the processing unit can generate a set of
control signals based on any of the determined voice command 102, the eye
movement, and the hand gesture, the set of control signals corresponds to one or
more driving instructions associated with wheel chair device 100. In one
embodiment, the one or more driving instructions can include Move Forward,
Move Back, Stop, Turn Right, and Turn Left.
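The translation from driving instructions to motor control signals described in [0035] and [0036] can be illustrated as a lookup table for a differential drive. The (left, right) speed values below are assumptions for illustration, not values from the specification.

```python
# Illustrative mapping from the driving instructions of [0035] to
# differential-drive motor commands (left speed, right speed).
DRIVE_TABLE = {
    "Move Forward": (1.0, 1.0),
    "Move Back":    (-1.0, -1.0),
    "Stop":         (0.0, 0.0),
    "Turn Right":   (1.0, -1.0),   # left wheel forward, right wheel back
    "Turn Left":    (-1.0, 1.0),   # right wheel forward, left wheel back
}

def control_signals(instruction):
    """Translate a driving instruction into (left_motor, right_motor) speeds."""
    if instruction not in DRIVE_TABLE:
        raise ValueError(f"Unknown driving instruction: {instruction}")
    return DRIVE_TABLE[instruction]
```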
[0036] In yet another embodiment, the device 100 can include a plurality of
electrical motors configured with at least one wheel of the device 100, and
operatively coupled to the processing unit. Further, the plurality of electric motors
can be configured to receive the generated set of control signals, and
correspondingly can operate the device 100 based on the one or more driving
instructions.
[0037] In one embodiment, the processing unit can include a Natural
Language Processing (NLP) unit 104 (also referred to as NLP system 104, herein)
that can be configured to process the first set of signals. In another embodiment,
an image processing unit can be configured to process the second set of signals.
[0038] In one embodiment, the device 100 can include a communication
unit operatively coupled to the processing unit, where the communication unit can
be configured to communicate the device 100 with a cloud storage. In another
embodiment, the communication unit can be configured to communicate the device
100 with one or more mobile computing devices associated with the user.
[0039] In an embodiment, the device 100 can include one or more health
sensors operatively coupled to the processing unit, and configured to detect one or
more health parameters, such as, but not limited to, temperature, heartbeat, and
oxygen saturation levels of the user. In an exemplary embodiment, the one or more
health sensors can include temperature sensor, pulse sensor, and oxygen saturation
sensors.
[0040] In an embodiment, the device 100 can include a plurality of IR
sensors 132 mounted on the wheel chair, and can be configured to detect one or

more obstacles in a path of the device 100, and correspondingly transmit a fourth set of signals. Further, the processing unit can be configured to generate a set of alert signals upon detection of at least one obstacle in the path of wheelchair 124, and stop the wheelchair 124.
[0041] In an embodiment, the device 100 can include a joy stick mounted
on the frame, and can be operatively coupled to the processing unit to allow a user to operate the device 100.
[0042] In an exemplary embodiment, the NLP system 104 can be activated
by the voice command 102 like Move Left, Move Right, Move forward, Move backward and stop. In an implementation, the voice command 102 can be converted into digital output by the NLP system 104. The digital output from the NLP system 104 can be fed into the processing unit. Further, the processing unit can communicate with a memory device which has mapped the digital output and the associated movement of the wheelchair 124, and correspondingly the processing unit can generate a first output.
[0043] In an embodiment, the first output from the processing unit can be
used by a control unit 110, where said output generated by the control unit 110 can be used to drive multiple motor drivers. Furthermore, the motor drivers can coordinate the performance of electric motors.
[0044] In an exemplary embodiment, the motor drivers might be used for
starting and stopping the electric motors, selecting forward or reverse rotation of
electric motors, selecting or regulating speed of electric motors, regulating or
limiting the torque of electric motors. In an implementation, a transmission device,
such as a Reeves drive, may connect the electric motors to the wheels of the wheel chair.
[0045] In one embodiment, the device 100 can include a left motor 116 that
can be driven by the control unit 110 through a motor driver 114, which may move
left wheel(s) 122 of the device 100. In another embodiment, the device 100 can
include a right motor 120 that can be driven by the control unit 110 through a motor
driver 118, which may move right wheel(s) 126 of the device 100.
[0046] In an embodiment, the device 100 can be trained to capture and
recognise hand gesture and eye movement 128 of the user through an AI system

130. In an exemplary embodiment, the AI-enabled camera can be configured to
capture movement of the eye balls of the user. In an exemplary embodiment, the AI
system, associated with said camera, can be initially trained on movement of the eye
balls of one or more users. Further, the AI system can use pattern recognition upon
real-time input of the movement of the eye balls, as captured by the AI camera,
in order to make an appropriate decision regarding the direction of movement of the
wheelchair 124, and can correspondingly trigger a set of actuation signals.
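The decision step above can be illustrated with a deliberately simplified gaze classifier. A real AI system would rely on trained pattern recognition; the normalized pupil coordinates, thresholds, and direction mapping below are assumptions for illustration only.

```python
# Hypothetical sketch of the gaze-to-direction decision in [0046].
# Pupil position is assumed normalized to [0, 1] x [0, 1] within the eye
# image, with (0.5, 0.5) being centered; thresholds are illustrative.

def classify_gaze(pupil_x, pupil_y, margin=0.15):
    """Map a normalized pupil position to a wheelchair direction command."""
    if pupil_x < 0.5 - margin:
        return "Turn Left"
    if pupil_x > 0.5 + margin:
        return "Turn Right"
    if pupil_y < 0.5 - margin:
        return "Move Forward"   # looking up -> go forward
    return "Stop"               # centered or looking down -> halt
```

Defaulting to "Stop" for a centered gaze is a deliberately conservative choice in this sketch, so the chair never moves on ambiguous input.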
[0047] Further, the set of actuation signals, from the AI system 130, can be
used by the control unit 110, which can correspondingly drive multiple motor drivers. Furthermore, the motor drivers can coordinate the performance of electric motors. In an exemplary implementation, the motor drivers can be used for any or a combination of starting and stopping the motor, selecting forward or reverse rotation, selecting or regulating speed, and regulating or limiting the torque. In another exemplary implementation, the transmission device can connect the electric motors to the wheels of the wheel chair.
[0048] In an implementation, the device 100 can also include brake(s) 112,
which can be controlled from top through any of the voice-command, hands, and movement of the eye balls.
[0049] In another exemplary embodiment, the hand gesturing capturing
glove with sensors may capture the hand gesture, and the corresponding data can be
interpreted by the AI system 130 that can be configured to generate a digital output.
Further, the digital output of the AI system 130 can be used by the control unit 110,
and correspondingly the control unit 110 can drive multiple motor drivers. The
motor drivers may coordinate the performance of electric motors. The motor drivers
might be used for starting and stopping the motor, selecting forward or reverse
rotation, selecting or regulating speed, regulating or limiting the torque.
[0050] Further, the Reeves drive can be configured to connect the electric
motors to the wheels of the wheel chair. The IR sensor 132 in the hand gloves can also be configured to keep track of the duration of inactivity of the hands. In an exemplary embodiment, if the inactivity of the hands exceeds a predefined limit, the IR sensor 132 can activate a processor which may further communicate with
the memory 108 to fetch a predefined emergency contact for the user. Moreover,
through the communicating device, which may include, but is not limited to, GSM and
CDMA technologies, the processor may inform the emergency contact.
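The inactivity tracking and emergency notification described in [0050] can be sketched as a watchdog timer. The class and parameter names are hypothetical, and the alert callback stands in for the GSM/CDMA notification to the emergency contact.

```python
# Hypothetical sketch of the hand-inactivity watchdog in [0050].
import time

class InactivityWatchdog:
    def __init__(self, limit_seconds, alert_fn, clock=time.monotonic):
        self.limit = limit_seconds
        self.alert_fn = alert_fn          # stands in for the GSM/CDMA alert
        self.clock = clock
        self.last_activity = clock()

    def report_activity(self):
        """Called whenever the glove's IR sensor detects hand movement."""
        self.last_activity = self.clock()

    def check(self):
        """Fire the alert and return True if inactivity exceeds the limit."""
        if self.clock() - self.last_activity > self.limit:
            self.alert_fn()
            return True
        return False
```

Injecting the clock as a parameter keeps the sketch testable without waiting in real time.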
[0051] In yet another exemplary embodiment, the sensors of the wheel chair
can be configured to detect body parameters. The sensors may be able to record
parameters, such as, body temperature, heart rate, Oxygen level, and the likes.
These parameters can be communicated to the processing unit which compares the
data from the sensors with the data in memory 108. If the difference between the
data from the memory 108 and the data from the sensors exceeds a predefined limit,
the processing unit may communicate with a predefined emergency contact number
stored in the memory 108 using a communicating device that may use GSM or
CDMA technologies. The parameters may also be stored in the cloud using the
communicating device that may use GSM or CDMA technologies.
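The comparison of sensed parameters against stored values in [0051] can be sketched as a simple threshold check. The baseline values and deviation limits below are illustrative assumptions, not values from the specification.

```python
# Hypothetical sketch of the health-parameter deviation check in [0051].
# Baselines and limits are illustrative placeholders.
BASELINES = {"temperature": 37.0, "heart_rate": 75.0, "oxygen_level": 97.0}
LIMITS    = {"temperature": 1.5,  "heart_rate": 25.0, "oxygen_level": 5.0}

def check_vitals(readings):
    """Return the parameters whose deviation from baseline exceeds its limit."""
    out_of_range = []
    for name, value in readings.items():
        if abs(value - BASELINES[name]) > LIMITS[name]:
            out_of_range.append(name)
    return out_of_range
```

A non-empty result would trigger the emergency-contact notification described above.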
[0052] Further, in yet another exemplary embodiment, the proximity
sensors installed at the wheel chair can be configured to detect a distance between
the wheelchair 124 and obstacles. When the proximity sensors detect obstacles
within a predefined range, it may send a signal to the processing unit. The
processing unit may further communicate with the memory 108 and generate an
output signal. The output signal from the processing unit can be transmitted to the
control unit 110, which can correspondingly drive multiple motor drivers. The
motor drivers may coordinate the performance of electric motors. The motor drivers
might be used for starting and stopping the motor, selecting forward or reverse
rotation, selecting or regulating speed, regulating or limiting the torque.
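The obstacle-stop behaviour in [0052] reduces to a range check over the proximity sensor readings. A minimal sketch, assuming distances in metres and an illustrative 0.5 m stop range (the specification does not give a numeric range):

```python
# Hypothetical sketch of the obstacle-stop decision in [0052].
def obstacle_response(distances_m, stop_range_m=0.5):
    """Return "Stop" if any sensed obstacle is within range, else None."""
    if any(d <= stop_range_m for d in distances_m):
        return "Stop"
    return None
```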
[0053] While the foregoing describes various embodiments of the
invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE PRESENT INVENTION
[0054] The present invention provides a system that allows physically
challenged people to use voice command, hand gesture, and/or eye movement to
control movement of the wheelchair.
[0055] The present invention provides a system that prevents accidents by
intelligently assessing the presence of obstacles in the path of a wheelchair and
stopping the wheelchair in case of an obstacle.
[0056] The present invention provides a system that keeps a record of the
health parameters of the user and stores it on the cloud.
[0057] The present invention provides a system that monitors deviation of
health parameters from predefined values in real time and alerts health care
professionals and/or a designated contact in case the deviation exceeds a predefined
level.

We Claim:

1. An automatically controllable smart wheel chair device, the device
comprising:
a frame forming a body of the device;
a voice capturing unit mounted on the frame, the voice capturing unit
configured to capture a set of acoustic signals generated by a user, and
correspondingly generate a first set of signals;
an image capturing device mounted on the frame, the image
capturing device configured to capture a set of images of eyes of the user,
and correspondingly generate a second set of signals;
a hand gesturing capturing glove comprising a set of sensors, the
glove adapted to be worn by the user, and configured to monitor hand
gestures of the user and correspondingly generate a third set of signals;
a processing unit operatively coupled to the voice capturing unit, the
image capturing unit, and the hand gesturing capturing glove, the processing
unit comprising a processor operatively coupled to a memory storing
instructions executable by the processor, and configured to:
receive the first set of signals, the second set of signals, and the third set of signals, and correspondingly generate a set of digital signatures;
compare the generated digital signatures with a set of predetermined digital signatures stored in a database, to determine any or a combination of a voice command of the user, an eye movement of the user, and hand gesture of the user; and
generate a set of control signals based on any of the determined voice command, the eye movement, and the hand gesture, the set of control signals corresponds to one or more driving instructions associated with wheel chair device; and a plurality of electrical motors configured with at least one wheel of
the device, and operatively coupled to the processing unit, the plurality of

electric motors configured to receive the generated set of control signals, and correspondingly operate the device based on the one or more driving instructions.
2. The device as claimed in claim 1, wherein the processing unit comprises a Natural Language Processing unit, and an image processing unit configured to process the first set of signals, and the second set of signals, respectively.
3. The device as claimed in claim 1, wherein the device comprises a communication unit operatively coupled to the processing unit and configured to:
communicate the device with a cloud storage; and
communicate the device with one or more mobile computing
devices associated with the user.
4. The device as claimed in claim 1, wherein the device comprises one or more health sensors operatively coupled to the processing unit, and configured to detect one or more health parameters comprising temperature, heartbeat, and oxygen saturation levels of the user.
5. The device as claimed in claim 4, wherein the one or more health sensors comprise a temperature sensor, a pulse sensor, and an oxygen saturation sensor.
6. The device as claimed in claim 1, wherein the one or more driving instructions comprise Move Forward, Move Back, Stop, Turn Right, and Turn Left.
7. The device as claimed in claim 1, wherein the device comprises a plurality of IR sensors mounted on the wheel chair, and configured to detect one or more obstacles in a path of the device, and correspondingly transmit a fourth set of signals, the processing unit configured to generate a set of alert signals upon detection of at least one obstacle in the path of the device, and stop the device.
8. The device as claimed in claim 1, wherein the device comprises a joy stick mounted on the frame, and operatively coupled to the processing unit to allow a user to operate the device.

9. The device as claimed in claim 1, wherein the image capturing unit is selected from a camera, and an AI enabled camera, and wherein the voice capturing unit comprises a microphone.

Documents

Application Documents

# Name Date
1 202111053353-STATEMENT OF UNDERTAKING (FORM 3) [19-11-2021(online)].pdf 2021-11-19
2 202111053353-POWER OF AUTHORITY [19-11-2021(online)].pdf 2021-11-19
3 202111053353-FORM FOR STARTUP [19-11-2021(online)].pdf 2021-11-19
4 202111053353-FORM FOR SMALL ENTITY(FORM-28) [19-11-2021(online)].pdf 2021-11-19
5 202111053353-FORM 1 [19-11-2021(online)].pdf 2021-11-19
6 202111053353-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [19-11-2021(online)].pdf 2021-11-19
7 202111053353-EVIDENCE FOR REGISTRATION UNDER SSI [19-11-2021(online)].pdf 2021-11-19
8 202111053353-DRAWINGS [19-11-2021(online)].pdf 2021-11-19
9 202111053353-DECLARATION OF INVENTORSHIP (FORM 5) [19-11-2021(online)].pdf 2021-11-19
10 202111053353-COMPLETE SPECIFICATION [19-11-2021(online)].pdf 2021-11-19
11 202111053353-Proof of Right [03-12-2021(online)].pdf 2021-12-03
12 202111053353-FORM 18 [18-08-2023(online)].pdf 2023-08-18
13 202111053353-FER.pdf 2025-03-10
14 202111053353-FORM 3 [09-06-2025(online)].pdf 2025-06-09
15 202111053353-FORM-5 [08-09-2025(online)].pdf 2025-09-08
16 202111053353-FER_SER_REPLY [08-09-2025(online)].pdf 2025-09-08
17 202111053353-DRAWING [08-09-2025(online)].pdf 2025-09-08
18 202111053353-CORRESPONDENCE [08-09-2025(online)].pdf 2025-09-08
19 202111053353-CLAIMS [08-09-2025(online)].pdf 2025-09-08
20 202111053353-ABSTRACT [08-09-2025(online)].pdf 2025-09-08

Search Strategy

1 202111053353E_11-03-2024.pdf