
An Artificial Intelligence Based Smart Cap With Voice Assistant For Blind Person Mobility

Abstract: People who are completely blind or have reduced eyesight confront numerous challenges while navigating, particularly in terms of mobility. Several systems have been developed to assist visually impaired people and improve their quality of life. This invention helps the blind navigate independently using a smart cap. It enables visually impaired people to experience the world through voice assistance: their surroundings are described to them as audio output that identifies detected objects, allowing them to move about freely. The invention aims to develop a smart cap for the blind that guides them from source to destination. The cap carries a GPS sensor to determine the wearer's exact location and a map that routes the wearer from point A to point B, so that blind individuals can direct their own course. Any obstructions, water, or dark places in the path are detected by an ultrasonic sensor, and a vibration sensor produces vibrations when an obstacle is found in the path. All of the gathered information about the surrounding environment is relayed to the user through a voice assistant. Through the Alexa software module, blind persons can also query the voice assistant to learn about their location and surroundings. The system is programmed in Python. Through vocal interaction with Alexa, the blind can reach their goal without relying on others. Through the use of the smart cap, this invention will improve the mobility of blind people and give them the confidence to move around freely.


Patent Information

Application #
202241013502
Filing Date
12 March 2022
Publication Number
11/2022
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Email
sksaruna@yahoo.co.in
Parent Application

Applicants

1. Dr. Aruna S K
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
2. Dr. Rakoth Kandan Sambandam
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
3. Dr. S. Thaiyalnayaki
Associate Professor, Department of Computer Science and Engineering, School of Computing, Bharath Institute of Higher Education and Research (Deemed to be University), Chennai, Tamilnadu - 600126
4. Dr. Rekha V
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
5. Dr. Balamurugan M
Associate Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
6. Dr. Naveen J
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
7. Mr. Mithun B N
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074

Inventors

1. Dr. Aruna S K
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
2. Dr. Rakoth Kandan Sambandam
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
3. Dr. S. Thaiyalnayaki
Associate Professor, Department of Computer Science and Engineering, School of Computing, Bharath Institute of Higher Education and Research (Deemed to be University), Chennai, Tamilnadu - 600126
4. Dr. Rekha V
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
5. Dr. Balamurugan M
Associate Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
6. Dr. Naveen J
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074
7. Mr. Mithun B N
Assistant Professor, Department of Computer Science and Engineering, School of Engineering and Technology, CHRIST (Deemed to be University), Kengeri Campus, Mysore Road, Kanminike, Bengaluru-560074

Specification

Claims: WE CLAIM

1. A smart cap in which a GPS sensor, a water sensor, an ultrasonic sensor, a microcontroller, a map, and an Alexa module are all embedded.
2. According to Claim 1, the GPS sensor is employed to determine the exact location of the wearer, in this case a blind individual.
3. According to Claim 1, the microcontroller is utilized to process the input received from the sensors.
4. According to Claim 1, a map is included to guide the wearer from source to destination.
5. According to Claim 1, the water sensor is integrated into the smart cap to detect water in front of the blind individual.
6. According to Claim 1, the ultrasonic sensor is utilized to identify any barriers, water, or dark regions in the path of the visually impaired individual.
7. According to Claim 1, Alexa is integrated into the smart cap so that blind individuals can use it as a voice assistant.

Description: FIELD OF THE INVENTION

The current invention relates to a blind navigation aid device that uses motion detection, image recognition, and voice recognition. It allows the blind and visually impaired to navigate freely by allowing them to experience their surroundings.

BACKGROUND OF THE INVENTION

US8606316B2 - A blind aid device with features that allow a blind person to activate the device; capture one or more images of the blind person's surrounding environment; recognize moving objects in the recorded images; determine a finite set of spatial relationships between the moving objects; analyse the images in the blind aid device to classify the spatial relationships of the moving objects against predefined moving-object data; convert selected spatial-relationship information from the analysed images into audible information; relay the selected audible information to the blind person; and notify the blind person of occurrences that the blind person has predetermined as actionable.

US6298010B1 - The invention is a device for blind and visually impaired people that includes at least one contactless distance-measurement system, which delivers correction variables based on the distance between the device and an item identified by the distance-measurement system. At least one indicating device is additionally included, which is driven by the distance-measurement system's correction variable and provides an indication based on it. In the indicating device, at least one tactile indicator is shifted continuously or quasi-continuously along a tactile path as a monotone function of the distance recorded by the distance-measurement system. The tactile indicator might include an adjustable element, for example; by touching the position of the displaceable element, the user can feel the measured distance.

US20150254943A1- A cane, a boot, a headband, and a central control device are all part of the multisensor system for the blind. One or more sensors may be included in the cane, boot, or headband to detect barriers in the user's path. Each sensor can be set to detect obstructions at a level that corresponds to the height at which it is worn by the user. The sensors may send signals to the central control unit if an impediment is detected. The signals may be processed by the central control device, which then activates an appropriate aural device to notify the user of the obstruction's presence and/or distance.

CN104039293B - An electronic walking aid for real-time navigation of visually impaired and blind people, designed to provide a small physical interface that links real-time navigation to real-time conditions and hardware without the use of a digital camera. Importantly, the audio messages tied to real-time conditions are kept in the walking aid's flash memory for use in assisting visually impaired and blind people. The electronic walking aid combines wireless and cable technologies to give an affordable, sturdy, stable, and simple-to-use scheme for visually impaired and blind persons.

Qualcomm constructed Smart Cap, a helper for the visually challenged that uses photos from a webcam to narrate a description of a scene. Smart Cap seeks to give users this missing experience by employing Microsoft Cognitive Services' state-of-the-art deep learning techniques for image categorization and labelling. The Smart Cap intends to provide the vision-impaired with a narrative of the environment. This narrative is created by turning the scenes in front of the person into text that summarizes the scene's key elements; examples of such text are "a group of people playing baseball", "a yellow truck parked next to the car", and "a bowl of salad on a table". The system's original prototype would play one line along with some keywords as audio to the user, but later versions would include a complete description as a feature.

PRIOR ART SEARCH

US20180134217A1- Vehicle vision system with blind zone displays and alert system: 2016-05-06
US9037400B2- Virtual walking stick for the visually impaired: 2014-12-25
US5806017A- Electronic auto routing navigation system for visually impaired persons: 1998-09-08
US6671226B1 -Ultrasonic path guidance for visually impaired: 2003-12-30.
US7039522B2- System for guiding visually impaired pedestrian using auditory cues: 2006-05-02
US20060129308A1- Management and navigation system for the blind: 2006-06-15
US20060293839A1 -System, method and apparatus for providing navigational assistance: 2006-12-28
US7267281B2 -Location, orientation, product and color identification system for the blind or visually impaired: 2007-09-11
US20120268563A1 -Augmented auditory perception for the visually impaired: 2012-10-25.
US20110092249A1- Portable blind aid device: 2011-04-21.
US8606316B2- Portable blind aid device: 2011-04-21
US4712003A- Blind person guide device: 1987-12-08
US5487669A -Mobility aid for blind persons: 1996-01-30.
US6298010B1 -Orientation aid for the blind and the visually disabled: 2001-10-02.
US6774788B1- Navigation device for use by the visually impaired: 2004-08-10
JP6267697B2-Blind riveting device and method: 2015-09-24.
US9692939B2-Device, system, and method of blind deblurring and blind super-resolution utilizing internal patch recurrence: 2014-12-04.
CN112767250B- Video blind super-resolution reconstruction method and system based on self-supervision learning: 2021-10-15.
CN105496740B-A kind of intelligent blind-guiding device and the blind-guiding stick for being provided with the device: 2018-02-02.
CN105167967B-A kind of blind-guiding method and system: 2018-04-03.
US20150254943A1-Multisensor system for the blind: 2015-11-10.
KR100385890B1-The cap which has a barrier sensor for the blind: 2003-06-02.
US9909893B2-Intelligent blind guiding device: 2018-03-06.
CN104039293B-A kind of visually impaired people and the electronic walk helper of blind person: 2017-06-20.
US8727180B2-Smart cap system: 2013-08-08

REFERENCES:

1. Duraisamy et al., "Talking Smartcap for Visually Impaired Person", International Journal of Innovative Research in Science, Engineering and Technology, Volume 9, Issue 8, August 2020.
2. Shruti Dambhare et al., "Smart Cap for Blind: Obstacle Detection, Artificial Vision and Real-time Assistance via GPS".
3. Yogesh Rohilla, "Ultrasonic Sensor Based Smart Cap as Electronic Travel Aid for Blind People", 2020 Third International Conference on Smart Systems and Inventive Technology (ICSSIT), October 2020.
4. Pasika S. Ranaweera et al., "Electronic Travel Aid System for Visually Impaired People", 2017 5th International Conference on Information and Communication Technology, May 2017.
5. R. R. A. Bourne et al., "Magnitude, temporal trends, and projections of the global prevalence of blindness and distance and near vision impairment: a systematic review and meta-analysis", The Lancet Global Health, vol. 5, no. 9, pp. e888-e897, September 2017.
6. M. M. Apple, "Kinesic training for blind persons: A vital means of communication", Journal of Visual Impairment & Blindness, vol. 66, no. 7, pp. 201-208, 1972.
7. S. Cornell Kärnekull, A. Arshamian, M. E. Nilsson, and M. Larsson, "The effect of blindness on long-term episodic memory for odors and sounds", Frontiers in Psychology, vol. 9, p. 1003, 2018.
8. A. Nishajith, "Smart Cap - Wearable Visual Guidance System for Blind", 2018 International Conference on Inventive Research in Computing Applications (ICIRCA), 11-12 July 2018.
9. Hanen Jabnoun, Faouzi Benzarti, and Hamid Amiri, "Object Detection and Identification for Blind People in Video Scene", 2015 15th International Conference on Intelligent Systems Design and Applications (ISDA).
10. B. Durette, N. Louveton, D. Alleysson, and J. Hérault, "Visuo-auditory sensory substitution for mobility assistance: testing The VIBE", Workshop on Computer Vision Applications for the Visually Impaired, 2008.
11. A. F. R. Hernández et al., "Computer Solutions on Sensory Substitution for Sensory Disabled People", Proceedings of the 8th WSEAS International Conference on Computational Intelligence, pp. 134-138, 2009.
12. S. Kammoun et al., "Navigation and space perception assistance for the visually impaired: The NAVIG project", IRBM (Numéro spécial ANR TECSAN), vol. 33, no. 2, pp. 182-189, 2012.
13. F. G. Brian Katz et al., "NAVIG: Guidance system for the visually impaired using virtual augmented reality", Technology and Disability, pp. 163-178, 2012.

OBJECTIVES OF THE INVENTION
• To improve the navigation and orientation of visually impaired people.
• To provide safe mobility for blind people.
• To detect and locate objects in order to give those people a sense of the external environment.
• To provide information about the surroundings through the Alexa module.
• To detect any obstacle with the help of ultrasonic sensors.
• To provide the correct location of an obstacle by using the GPS.

SUMMARY OF THE INVENTION

Vision is one of the most important human senses, and it plays an important role in how people perceive their surroundings. Some technical solutions have recently been developed to help blind individuals navigate freely; many of these systems rely on embedded technology to establish the blind person's location and orientation. This concept uses an Alexa module as a voice assistant for visually impaired people. They can utilize the voice assistant to look around and learn what is going on, and they can inquire about the surrounding area through this Alexa module. The majority of visually impaired people have trouble navigating freely in a space. The solution is to design an assistive wearable headgear for the blind or visually impaired. The proposed solution is a wearable 'Smart Cap' that aids persons with vision problems in communicating with others and navigating safely. This approach is especially helpful for those who live alone. The purpose of the idea is to construct a headpiece for blind individuals that will guide them from source to destination.
People who are completely blind or have reduced eyesight confront numerous challenges while navigating, particularly in terms of mobility. Several systems have been developed to assist visually impaired people and improve their quality of life. This invention helps the blind navigate independently using a smart cap. It enables visually impaired people to experience the world through voice assistance: the blind and visually impaired can navigate freely because their surroundings are described to them as audio output identifying detected objects. The invention aims to develop a smart cap for the blind that guides them from source to destination. The cap carries a GPS sensor to determine the wearer's exact location and a map that takes the wearer from point A to point B, allowing blind individuals to direct their own course; any obstructions, water, or dark places in the path are detected by an ultrasonic sensor. All of the gathered information about the surrounding environment is relayed to the user through a voice assistant. Through the Alexa software module, blind persons can also query the voice assistant to learn about their location and surroundings. The system is programmed in Python. Through vocal interaction with Alexa, the blind can achieve their goal without relying on others. Through the use of the smart cap, this invention will improve the mobility of blind people and give them the confidence to move around freely.

BRIEF DESCRIPTION OF THE INVENTION

Blind people have a difficult time navigating the world. It becomes difficult, and in some cases dangerous, to walk from one place to another. Although walking canes and seeing-eye dogs can help them avoid specific hazards, these aids do not address the larger problem of navigation and situational awareness. Reading signage and written materials presents additional challenges. Blind people, particularly those who live in cities, rely heavily on their hearing abilities to understand what is going on around them.

A Smart Cap
This invention creates a smart cap featuring a GPS sensor, an ultrasonic sensor, a water sensor, a microcontroller, an Alexa module, and a map. The microcontroller analyses the sensor inputs. When the blind wearer encounters any impediment on the way to the destination, it sends an alert to Alexa.
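
The specification does not include the firmware itself; a minimal sketch of such a control loop in Python (the language named in the specification) might look as follows, where the sensor-reading and announcement functions are hypothetical stand-ins for the actual drivers and the Alexa output channel, and the alert distance is an assumed value:

import time

OBSTACLE_THRESHOLD_CM = 100   # assumed alert distance; the specification gives no figure

def read_ultrasonic_cm():
    """Hypothetical stand-in for the ultrasonic-sensor driver; returns distance in cm."""
    return 250.0

def read_water_sensor():
    """Hypothetical stand-in for the water-sensor driver; True when water is detected."""
    return False

def announce(message):
    """Hypothetical stand-in for the Alexa / voice-assistant output channel."""
    print("VOICE:", message)

def control_loop():
    while True:
        distance = read_ultrasonic_cm()
        if distance < OBSTACLE_THRESHOLD_CM:
            announce("Obstacle ahead, about %d centimetres away." % distance)
        if read_water_sensor():
            announce("Be safe, water is on your path.")
        time.sleep(0.2)   # poll the sensors a few times per second

if __name__ == "__main__":
    control_loop()
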
Ultrasonic sensor

The ultrasonic sensor sends out a high-frequency sound pulse and then measures the time it takes for the echo to return. It detects barriers in the environment of a visually impaired individual; obstacle detection and warning improve the mobility and safety of visually impaired people, especially in unfamiliar environments. The distance between the visually impaired person and a barrier is calculated from the echo time. If the computed distance falls below the specified safe range, Alexa issues an alert.
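
The measurement routine itself is not given in the specification; as an illustration only, an echo-time distance calculation for an HC-SR04-style sensor on a Raspberry Pi-class board (an assumed hardware choice, with illustrative pin numbers and threshold) could be written as:

import time
import RPi.GPIO as GPIO

TRIG_PIN, ECHO_PIN = 23, 24         # hypothetical wiring
SPEED_OF_SOUND_CM_PER_S = 34300     # roughly 343 m/s in air

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG_PIN, GPIO.OUT)
GPIO.setup(ECHO_PIN, GPIO.IN)

def measure_distance_cm():
    # Send a 10-microsecond trigger pulse.
    GPIO.output(TRIG_PIN, True)
    time.sleep(0.00001)
    GPIO.output(TRIG_PIN, False)

    # Time the echo pulse: its width is the round-trip travel time of the sound.
    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO_PIN) == 0:
        pulse_start = time.time()
    while GPIO.input(ECHO_PIN) == 1:
        pulse_end = time.time()

    round_trip = pulse_end - pulse_start
    return round_trip * SPEED_OF_SOUND_CM_PER_S / 2   # one-way distance in cm

if measure_distance_cm() < 100:     # assumed safe range of one metre
    print("Obstacle within range - raise a voice alert via Alexa")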

Water Sensor:
The water sensor is used for detecting small water puddles. When the module comes into contact with water, the Alexa device connected to the microcontroller produces a voice message such as "Be safe, water is on your path" to the user. It is mainly used here for sensing rainfall, water level, and even liquid leakage while the blind person is moving about outdoors.
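
As a sketch only, a digital water sensor read on a GPIO pin could trigger the spoken warning, with the pyttsx3 text-to-speech engine standing in for the Alexa voice output (the wiring and the library choice are assumptions, not taken from the specification):

import RPi.GPIO as GPIO
import pyttsx3

WATER_PIN = 17                  # hypothetical wiring
GPIO.setmode(GPIO.BCM)
GPIO.setup(WATER_PIN, GPIO.IN)

engine = pyttsx3.init()         # local text-to-speech as a stand-in for Alexa
if GPIO.input(WATER_PIN):       # the sensor drives the pin high when it is wet
    engine.say("Be safe, water is on your path.")
    engine.runAndWait()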

GPS Sensor
GPS can provide turn-by-turn directions to the nearest coffee shop, shoe store, or burger restaurant, increasing the confidence and independence of visually impaired people when out and about. The GPS receiver gets time-stamped signals from numerous satellites and calculates the wearer's position from the differences between these times. In most cases, a standalone device relies solely on GPS to determine location; to boost accuracy, a smartphone GPS receiver can also use the known positions of cell towers and constantly updated maps of Wi-Fi hotspots.
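
A minimal position-fix sketch, assuming a serial NMEA GPS module and the pynmea2 parser (the port name and baud rate are illustrative, not specified in the document):

import serial
import pynmea2

# Read a handful of NMEA sentences and report the first position fix.
with serial.Serial("/dev/ttyAMA0", 9600, timeout=1) as port:   # hypothetical port
    for _ in range(50):
        line = port.readline().decode("ascii", errors="ignore").strip()
        if line.startswith("$GPGGA"):       # GGA sentences carry the fix
            fix = pynmea2.parse(line)
            print("Wearer position:", fix.latitude, fix.longitude)
            break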

Alexa Module
In this idea, the Alexa module is used as a voice assistant. It is not only about giving users verbal assistance: through a feature called "Show and Tell", Amazon's AI can also use its vision to help blind and low-vision customers identify products using an Echo Show.
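
The specification does not show the Alexa integration in code; a minimal sketch of a custom-skill intent handler using the ASK SDK for Python, which speaks back a surroundings description when the wearer asks about the environment, could look like this (the intent name and the reply text are assumptions):

from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

class SurroundingsIntentHandler(AbstractRequestHandler):
    """Handles a hypothetical 'DescribeSurroundingsIntent' from the wearer."""

    def can_handle(self, handler_input):
        return is_intent_name("DescribeSurroundingsIntent")(handler_input)

    def handle(self, handler_input):
        # In the real cap this text would come from the sensor and vision pipeline.
        speech = "There is a clear path ahead for about three metres."
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(SurroundingsIntentHandler())
lambda_handler = sb.lambda_handler()   # entry point when the skill is deployed as an AWS Lambda function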

Documents

Application Documents

# Name Date
1 202241013502-COMPLETE SPECIFICATION [12-03-2022(online)].pdf 2022-03-12
2 202241013502-REQUEST FOR EARLY PUBLICATION(FORM-9) [12-03-2022(online)].pdf 2022-03-12
3 202241013502-DRAWINGS [12-03-2022(online)].pdf 2022-03-12
4 202241013502-FORM-9 [12-03-2022(online)].pdf 2022-03-12
5 202241013502-FORM 1 [12-03-2022(online)].pdf 2022-03-12