
Voice Command Activated Power Wheelchair

Abstract: Most conventional wheelchairs use a joystick as the user input mode. The drawback of joystick control is that it is not suitable for a physically disabled person who cannot control movement, especially with the hands. We present a voice command activated power wheelchair whose motion can be controlled by the user's voice. It can recognize the user's voice and move to the end destination chosen by the user. Physically challenged people otherwise need to issue every command, either manually or through the joystick, until the wheelchair reaches the destination. This project is intended to help a person with a disability move without external guidance by adding more automation to the wheelchair. We contributed by enhancing the automation of the system so that it can learn all of the locations in a given building and then take its occupant to a given place in response to a verbal command. Just by saying "go to my room," the wheelchair user avoids the need to control every twist and turn of the route and can simply sit back and relax as the chair moves from one place to another based on a map stored in its memory. The modules used are the microcontroller hardware (P89V664), an IR transmitter and receiver, a servo motor interface, an ultrasonic obstacle detector and a voice recognition kit. Thus the wheelchair is user-friendly and cost effective.


Patent Information

Application #
3527/CHE/2010
Filing Date
23 November 2010
Publication Number
51/2010
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Parent Application

Applicants

THE PRINCIPAL
ST.JOSEPH'S COLLEGE OF ENGINEERING AND TECHNOLOGY, A.S. NAGAR, ELUPATTI, THANJAVUR - 613 403.

Inventors

1. MRS. DEVI
HEAD OF THE DEPARTMENT ECE, ST.JOSEPH'S COLLEGE OF ENGINEERING AND TECHNOLOGY, A.S. NAGAR, ELUPATTI, THANJAVUR - 613 403.
2. MR. M. KARTHIKEYAN
STUDENT, DEPARTMENT ECE, ST. JOSEPH'S COLLEGE OF ENGINEERING AND TECHNOLOGY, A.S. NAGAR, ELUPATTI, THANJAVUR - 613 403.

Specification

Field of the Invention

The present invention relates to a system for controlling the operation of power-driven equipment, particularly motorized wheelchairs, by voice commands.

Background of the Invention

Most conventional electric powered wheelchairs use a joystick as the user input mode of control to maneuver the wheelchair. The drawback of joystick control is that it is not suitable for a physically disabled person who cannot control their movements, especially of the hands. The proposed voice-activated powered wheelchair, supplemented by joystick control, allows a physically disabled person to maneuver the wheelchair easily without the need to use the hands.

Objective of the Invention

A voice command activated power wheelchair requires safe operation with high speech recognition accuracy, because accidents may occur due to misrecognition. The system must guarantee the safety of wheelchair users under two additional conditions: it must move only in response to the disabled person's own voice command, and it must reject non-voice input.

Summary of the Invention

Developing an autonomous wheelchair that can learn all of the locations in a given building and then take its occupant to a given place in response to a single verbal command is the objective of the project. By doing so the following problems are avoided: the wheelchair user does not need to strain or use his hands to move the wheelchair manually; the person does not need anyone's help to move around the area; the destination can be reached with a single voice command; the wheelchair user avoids the need to control every twist and turn of the route and can simply sit back and relax as the chair moves from one place to another based on a map stored in its memory.

Step 1:
Regulating the power supply to the individual devices from a common source, interfacing the motor through the driver, and executing code for movement.

Step 2:
Adding sensors to identify the location and detect the presence of obstacles. At this stage the command is given through a keypad, and the wheelchair is given the intelligence to take turns on its own along its path.

Step 3:
Replacing the keypad with the voice recognition module, which serves as the input.

A development board carrying the P89V664 IC has to be interfaced to the driver circuit. A circuit has to be designed and implemented on a PCB to generate 5 V to 12 V to power the motor controller. Position sensor circuits are to be designed and implemented on a PCB. A keypad has to be interfaced, and commands are passed through the keypad. The microcontroller plays the major role in the execution. The inputs to the controller are: a) building layout, b) current location, c) user command, d) bi-directional serial communication using the FT232R.

All these inputs are given to the microcontroller, which is programmed to produce the desired output. A driver circuit was implemented to interface the microcontroller to the DC motor.
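As an illustration of the driver interface described above, the following is a minimal sketch in SDCC-style C for the 8051 core, assuming a dual H-bridge driver with direction inputs wired to port P1; the pin assignments and function names are assumptions made for illustration, not the actual board wiring.

#include <8051.h>                 /* SDCC header defining P1_0 ... P1_3      */

#define LEFT_FWD   P1_0           /* assumed: left motor forward input       */
#define LEFT_REV   P1_1           /* assumed: left motor reverse input       */
#define RIGHT_FWD  P1_2           /* assumed: right motor forward input      */
#define RIGHT_REV  P1_3           /* assumed: right motor reverse input      */

static void drive_stop(void)
{
    LEFT_FWD = 0; LEFT_REV = 0;
    RIGHT_FWD = 0; RIGHT_REV = 0;
}

static void drive_forward(void)   /* both wheels forward                     */
{
    drive_stop();
    LEFT_FWD = 1;
    RIGHT_FWD = 1;
}

static void turn_left(void)       /* pivot: only the right wheel moves       */
{
    drive_stop();
    RIGHT_FWD = 1;
}

static void turn_right(void)      /* pivot: only the left wheel moves        */
{
    drive_stop();
    LEFT_FWD = 1;
}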

BUILDING LAYOUT

The building was mapped manually. Giving the intelligence of the building layout to the controller memory is the first stage. The entire area is divided into imaginary squares with the same dimensions as the wheelchair. Every room has a default position where the wheelchair rests. To move from one room to another, the wheelchair first navigates to the exit of that room. The next step is to go to the destination location. The shortest path through the segments is mapped into the controller memory to reach the destination.
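A minimal sketch of how such a layout could be held in controller memory, in C for the same toolchain; the sizes, the room structure, and the idea of storing each shortest path as a fixed move sequence are illustrative assumptions, since the actual map format is not disclosed here.

#define NUM_ROOMS     25          /* figure 3 numbers the rooms 1 to 25       */
#define MAX_PATH_LEN  16          /* assumed upper bound on path length       */

enum move { MOVE_STOP, MOVE_FWD, MOVE_LEFT, MOVE_RIGHT };

struct room {
    unsigned char default_segment;   /* segment where the chair normally rests */
    unsigned char exit_segment;      /* segment at the doorway of the room     */
};

/* Example entries only; the real coordinates depend on the building. */
static const struct room rooms[NUM_ROOMS] = {
    { 1, 2 },                        /* room 1: rests in segment 1, exit at 2  */
    { 5, 6 },                        /* room 2                                 */
    /* ... remaining rooms ... */
};

/* Shortest path from a room's exit to a destination room, stored as a fixed
 * sequence of moves terminated by MOVE_STOP. One such row per (source,
 * destination) pair would be kept in the controller's code memory. */
static const unsigned char path_to_room3[MAX_PATH_LEN] = {
    MOVE_FWD, MOVE_FWD, MOVE_RIGHT, MOVE_FWD, MOVE_STOP
};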

USER COMMAND

The user command is given as voice input to the voice recognition module. The module is capable of recognizing 16 unique commands, and a definite digital output is generated for each command. Voice commands are picked up by the microphone and serve as input to the controller. The signals undergo a certain amount of processing: filtration of noise, amplification, and conversion into the right type of signal.

VARIOUS PROCESSES FOR RECOGNITION
The chip has two operational modes: manual mode and CPU mode. The CPU mode is designed to allow the chip to work under a host computer. This circuit operates in the manual mode. The manual mode allows one to build a standalone speech recognition board that does not require a host computer and may be integrated into other devices to utilize speech control.

VOICE RECOGNITION MODULE
The voice recognition system is a completely assembled system, in the sense that the words the circuit has to recognize can be trained into it. It has an 8-bit data output which can be interfaced with the microcontroller. The heart of the circuit is the HM2007 speech recognition IC. The IC can recognize 20 words, each up to 1.92 seconds in length. The keypad, microphone and digital display are used to communicate with and program the HM2007 chip. The system is capable of achieving 90% accuracy for word recognition.
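A minimal sketch of reading that 8-bit output from the microcontroller side; the port used (P0), the presence of a data-ready strobe on P3.2, and the treatment of the displayed error codes as BCD-coded bytes on the bus are assumptions about the wiring, not details taken from the board itself.

#include <8051.h>

#define ERR_TOO_LONG   0x55       /* error codes shown on the board display,  */
#define ERR_TOO_SHORT  0x66       /* assumed here to appear BCD-coded on the  */
#define ERR_NO_MATCH   0x77       /* data bus                                 */

/* Returns the recognised word number, or one of the error codes above. */
unsigned char read_voice_word(void)
{
    while (P3_2 == 0)             /* wait for the assumed data-ready strobe   */
        ;
    return P0;                    /* 8-bit word / error code from the module  */
}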

TRAINING WORDS FOR RECOGNITION
Press "1" on the keypad (the display will show "01" and the LED will turn off), then press the TRAIN key (the LED will turn on) to place the circuit in training mode for word one. Say the target word clearly into the onboard microphone (near the LED). The circuit signals acceptance of the voice input by blinking the LED off and then on. The word (or utterance) is now identified as word "01". If the LED did not flash, start over by pressing "1" and then the TRAIN key. You may continue training new words in the circuit: press "2" and then TRAIN to train the second word, and so on. The circuit will accept and recognize up to 20 words (numbers 1 through 20). It is not necessary to train all word spaces; if you only require 10 target words, that is all you need to train.

TESTING RECOGNITION
Repeat a trained word into the microphone. The number of the word should be displayed on the digital display. For instance, if the word "directory" was trained as word number 20, saying the word "directory" into the microphone will cause the number 20 to be displayed.

ERROR CODES
The chip provides the following error codes: 55 = word too long, 66 = word too short, 77 = no match.

CLEARING MEMORY
To erase all words in memory, press "99" and then "CLR". The numbers will quickly scroll by on the digital display as the memory is erased.

CHANGING & ERASING WORDS Trained words can easily be changed by overwriting the original word. Simply retrain the word space by pressing "6" then the TRAIN key and saying the word into the microphone. If one needs to erase the word without replacing it with another word press the word number then press the CLR key, Word six is now erased.

Recognition Style
In addition to the speaker dependent/independent classification, speech recognition also contends with the style of speech it can recognize. There are three styles of speech: isolated, connected and continuous.

Isolated: Words are spoken separately, in isolation. This is the most common speech recognition system available today. The user must pause between each word or command spoken.

Connected: This is a halfway point between isolated-word and continuous speech recognition. It permits users to speak multiple words. The HM2007 can be set up to identify words or phrases up to 1.92 seconds in length. This reduces the word recognition dictionary to 20 entries.

Interfacing external circuits through the data bus
This sample project shows how a circuit can be interfaced through the data bus of the speech recognition circuit. It shows messages and error codes on an LCD, and it also operates four relays according to the data from the speech circuit.
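A minimal sketch of that sample project in C, assuming the four relays sit on P2.0 to P2.3 and that an lcd_print() routine exists elsewhere in the firmware; both the pin choices and the helper name are assumptions made for illustration.

#include <8051.h>

extern void lcd_print(const char *msg);      /* assumed LCD driver routine     */
extern unsigned char read_voice_word(void);  /* as sketched above              */

static void handle_word(unsigned char w)
{
    switch (w) {
    case 1:    P2_0 = !P2_0; lcd_print("RELAY 1");   break;  /* toggle relay 1 */
    case 2:    P2_1 = !P2_1; lcd_print("RELAY 2");   break;
    case 3:    P2_2 = !P2_2; lcd_print("RELAY 3");   break;
    case 4:    P2_3 = !P2_3; lcd_print("RELAY 4");   break;
    case 0x55: lcd_print("WORD TOO LONG");           break;  /* HM2007 errors  */
    case 0x66: lcd_print("WORD TOO SHORT");          break;
    case 0x77: lcd_print("NO MATCH");                break;
    default:   lcd_print("UNUSED WORD");             break;
    }
}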

SOFTWARE TOOLS
SDCC: SDCC (Small Device C Compiler) is a free and open source, re-targetable, optimizing ANSI C compiler. It is a cross compiler.

FLASH MAGIC: From the Flash Magic website, Flash Magic is a PC tool for programming flash-based microcontrollers from NXP using a serial or Ethernet protocol while in the target hardware.

PROCESS OF FILE COMPILATION AND EXECUTION
I. Save the file in Notepad with the .c extension.
II. Go to the command prompt and navigate to the target file location.
III. Pre-compile the individual files using the command sdcc -c.
IV. Compile the main file together with the pre-compiled files.
V. This generates an executable file with the .ihx extension.
VI. Load the executable file into the controller through Flash Magic.

The prototype of an autonomous wheelchair that can traverse its own path through the given layout was designed and constructed. The wheelchair moves in response to a single voice command, which serves as the input. We contributed by enhancing the automation of the system so that it can learn all of the locations in a given building and then take its occupant to a given place in response to a verbal command. The wheelchair user avoids the need to control every twist and turn of the route and can simply sit back and relax as the chair moves from one place to another based on a map stored in its memory.

Brief Description of the Drawing

Figure 1. Block diagram of the voice command activated power wheelchair.
The figure contains the following particulars: (1) microcontroller (128), (2) driver (111), (3) DC motor (126), (4) robotic wheelchair (136), (5) destination location (125), (6) serial communication (130), (7) user command (121), (8) current location (122), (9) building layout (112), (10) obstacle detector (123), (11) LCD display (124).
Figure 2. Position of sensors.
The intelligence about the current location is given to the wheelchair by means of the directions north, south, east and west. Another indication of the current location is the output from the position sensors. Three position sensors are placed on the right side, on the left side and at the back. These sensors give an output of 0 or 1 depending on the presence of a wall: 1 if a wall is present and 0 if not.
The figure contains the following particulars: (1) left-side position sensor (101), (2) right-side position sensor (102), (3) back-side position sensor (103).
Figure 3. Current location and detection.
Here the wheelchair is in segment no. 1, facing north. As it moves from one segment to another, the direction and position sensor outputs keep changing. Thus the wheelchair traces the path and keeps moving. The numbers represent rooms 1 to 25.
Figure 4. Direction detection logic table.
The user command is given as voice input to the voice recognition module. The module is capable of recognizing 16 unique commands, and a definite digital output is generated for each command. Voice commands are picked up by the microphone and serve as input to the controller. The signals undergo a certain amount of processing: filtration of noise and amplification. In the table, the segment number represents the room location; the direction is north, east, west or south; RS = right sensor, LS = left sensor, BS = back sensor. As the wheelchair moves from one segment to another, the direction and position sensor outputs keep changing. Thus the wheelchair traces the path and keeps moving.
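A minimal sketch of the detection idea behind this table: the current heading plus the wall pattern seen by the three position sensors is packed into one value that can be compared against the expected pattern for each segment. The sensor pin assignments on P1 are assumptions made for illustration.

#include <8051.h>

enum heading { NORTH, EAST, SOUTH, WEST };

#define RS  P1_4                  /* right-side sensor: 1 = wall present      */
#define LS  P1_5                  /* left-side sensor                         */
#define BS  P1_6                  /* back sensor                              */

/* Pack the heading and the three wall bits into one byte so it can be
 * matched against the expected entry for the current segment. */
unsigned char position_signature(enum heading h)
{
    return (unsigned char)(((unsigned char)h << 3) | (RS << 2) | (LS << 1) | BS);
}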
Figure 5. Block diagram of the microcontroller board.
The figure contains the following particulars: (1) microcontroller (300), (2) port P0/P4 20-pin header (301), (3) port P1/P3 20-pin header (302), (4) EXTIO 4-pin header (304), (5) EEPROM (305), (6) SPI bus (306), (7) buzzer (307), (8) PWM (308), (9) keypad (309), (10) push button (310), (11) INT (311), (12) LEDs (312), (13) LCD (313), (14) USB to serial (314), (15) USB connector (315), (16) SIO/I2C 10-pin header (316), (17) RTC (317).
Figure 6. Front view of the board.
The figure contains the following particulars: (a) EXTIO Phoenix terminal (360), (b) PWR Phoenix terminal (361), (c) LCD (362), (d) buzzer (363), (e) power jack (364), (f) USB connector (365), (g) EXTPWR/USBPWR jumper (367), (h) interrupt key (368), (i) keypad (369), (j) USBSIOS switch (370), (k) INTR/BUZZ switch (371), (l) reset (372), (m) port P1/P3 20-pin header (373), (n) port P0/P4 20-pin header (374), (o) SIO/I2C 10-pin header (375).
Figure 7. Obstacle detection.
The figure contains the following particulars: (a) robot (354), (b) object (355), (c) ultrasonic wave (356).
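A minimal sketch of the obstacle check, assuming the ultrasonic detector provides a single digital output that goes high when an object is within range and that it is wired to P3.3; the real module's interface may differ.

#include <8051.h>

#define OBSTACLE_PIN  P3_3        /* assumed digital output of the detector   */

/* Sample the detector a few times so a single noisy reading does not
 * stop the wheelchair unnecessarily. */
unsigned char obstacle_detected(void)
{
    unsigned char i, hits = 0;

    for (i = 0; i < 3; i++)
        if (OBSTACLE_PIN)
            hits++;

    return (hits >= 2);
}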
Figure 8. Functional block diagram of voice recognition.
The figure contains the following particulars: (a) mic (324), (b) A/D (325), (c) speech processor (326), (d) I/O controller (327), (e) mic amplifier (328), (f) inputs (329), (g) outputs (330).
Figure 9. Flow chart for the voice activated intelligent wheelchair.
The method for the voice activated intelligent wheelchair:
Step 1: Include the library files for the 8051 controller.
Step 2: Declare the ports assigned.
Step 3: Allocate the variables used according to their bit size.
Step 4: Initialize the destination points and segment points.
Step 5: Start the main part.
Step 6: Read the direction and segment points of the wheelchair.
Step 7: Display ENTER COMMAND.
Step 8: Get the command from the user and display it.
Step 9: Go into the switch case.
Step 10: Read the current segment point and compare it with the destination point.
Step 11: While there is no obstacle, compare the current point with the destination point.
Step 12: If the target is within the row, move within the row.
Step 13: If the target is out of range, move to the next row.
Step 14: Update the direction and segment points.
Step 15: Go to Step 11.
Step 16: If an obstacle is detected, display OBSTACLE DETECTED.
Step 17: Wait until it is cleared; if the allowed time lapses, take an alternative path.
Step 18: If the obstacle is cleared within the allowed time, display OBSTACLE CLEARED.
Step 19: Continue the loop until the current segment equals the destination segment, else go to Step 1.
Step 20: Display TARGET REACHED.
Step 21: Wait for a user command and go to Step 6.
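The flow chart translates naturally into a control loop; the following is a minimal sketch of Steps 6 to 21 in C, where read_voice_word(), obstacle_detected(), step_towards() and lcd_print() stand for routines that are only described in prose above, so their names and exact behaviour are assumptions.

#include <8051.h>

extern unsigned char read_voice_word(void);                    /* Step 8      */
extern unsigned char obstacle_detected(void);                  /* Step 16     */
extern void lcd_print(const char *msg);
extern unsigned char step_towards(unsigned char from,          /* Steps 12-14 */
                                  unsigned char to);

void navigate(void)
{
    unsigned char current = 1;            /* current segment (Step 6)          */
    unsigned char dest;

    for (;;) {
        lcd_print("ENTER COMMAND");       /* Step 7                            */
        dest = read_voice_word();         /* Step 8: word number names a room  */

        while (current != dest) {         /* Steps 10-11, 19                   */
            if (obstacle_detected()) {
                lcd_print("OBSTACLE DETECTED");   /* Step 16                   */
                continue;                 /* Step 17: wait until it clears     */
            }
            current = step_towards(current, dest); /* Steps 12-15              */
        }

        lcd_print("TARGET REACHED");      /* Step 20                           */
    }                                     /* Step 21: wait for next command    */
}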

Claims:

We claim:

1. A wheelchair that can travel its own path through the given layout.

2. Wherein the above said wheelchair moves with a single voice command, taking the voice command as an input.

3. Wherein the system and automation can understand the locations of the given building through voice commands.

4. Wherein the map of the location and building can be stored and can be activated through a voice command.

5. Wherein the system follows the map in its memory with reference to the command received.

Documents

Application Documents

# Name Date
1 3527-che-2010 claims 23-11-2010.pdf 2010-11-23
2 3527-CHE-2010-AbandonedLetter.pdf 2018-01-19
3 3527-CHE-2010-FER.pdf 2017-07-11
4 3527-che-2010 power of attorney 23-11-2010.pdf 2010-11-23
5 3527-che-2010 form-9 23-11-2010.pdf 2010-11-23
6 3527-CHE-2010 CORRESPONDENCE OTHERS 23-11-2010.pdf 2010-11-23
7 3527-che-2010 form-2 23-11-2010.pdf 2010-11-23
8 3527-che-2010 description(complete) 23-11-2010.pdf 2010-11-23
9 3527-che-2010 drawings 23-11-2010.pdf 2010-11-23
10 3527-che-2010 form-18 23-11-2010.pdf 2010-11-23
11 3527-che-2010 abstract 23-11-2010.pdf 2010-11-23
12 3527-che-2010 form-1 23-11-2010.pdf 2010-11-23

Search Strategy

1 reeeeeeeeeeeee_15-05-2017.pdf