Abstract: Disclosed is a navigational aid device for visually impaired individuals comprising: a camera configured for real-time capture and analysis of surrounding imagery using object detection algorithms to identify and classify obstacles, including but not limited to, people, animals, and objects; an audio feedback mechanism to provide auditory signals indicating the presence and location of detected obstacles relative to the user’s path; a vibration alert system integrated with ultrasonic sensors for notifying the user of low-height obstacles; an audio guidance system with customizable volume and language settings, offering directional cues based on detected obstacles and user orientation; and a connectivity module for linking with a GPS-enabled mobile application, allowing real-time location tracking and access to device settings, software updates, and user manuals.
Description
Field of the Invention
The present disclosure relates to assistive technology for individuals with visual impairments, and more particularly to a navigational aid device for visually impaired individuals.
Background
The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
Navigational aids for visually impaired individuals are critical for enhancing their safety and independence. Traditional methods such as the use of canes, guide dogs, and tactile paving have provided basic means for navigating environments. Canes are widely used for immediate obstacle detection and ground texture understanding but offer limited information about the environment. Guide dogs offer companionship and can navigate around obstacles, yet they require extensive training and are not a viable solution for all individuals. Tactile paving provides guidance and warning of hazards in public spaces, but its installation is limited to certain areas, leaving many environments challenging to navigate.
In recent years, technology-based solutions have sought to address these limitations through the development of electronic travel aids (ETAs). These devices use various forms of technology, such as ultrasonic sensors, GPS, and computer vision, to detect obstacles and provide spatial information to the user through audio feedback or haptic signals. Ultrasonic ETAs emit sound waves to detect obstacles and convey distance information via vibrations or sounds, allowing users to avoid obstacles. GPS-based ETAs offer navigation assistance outdoors by providing directions to the user through audio feedback. Computer vision-based systems utilize cameras to identify and inform about obstacles and environmental features, offering a more comprehensive understanding of the surroundings.
Despite advancements, each of these solutions has its drawbacks. Ultrasonic sensors may not accurately detect drop-offs or overhanging obstacles, potentially leading to navigation errors. GPS systems are less effective indoors or in densely built-up areas where satellite signals are weak or obstructed. Computer vision technologies, while promising, require significant computational resources and can be challenged by variable lighting conditions and complex environments.
Furthermore, the integration of these technologies into a cohesive system that is practical, reliable, and user-friendly remains a significant challenge. Issues such as device portability, ease of use, battery life, and cost also need to be addressed to make advanced navigational aids widely accessible to visually impaired individuals.
In light of the above discussion, there exists an urgent need for solutions that overcome the problems associated with conventional systems and techniques for providing real-time information and guidance to visually impaired individuals, ensuring a safe and effective journey in various areas.
Summary
The following presents a simplified summary of various aspects of this disclosure in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements nor delineate the scope of such aspects. Its purpose is to present some concepts of this disclosure in a simplified form as a prelude to the more detailed description that is presented later.
The following paragraphs provide additional support for the claims of the subject application.
A navigational aid device for visually impaired individuals has been developed, integrating various technologies to enhance mobility and safety. The device incorporates a camera designed for the real-time capture and analysis of surrounding imagery. Utilizing object detection algorithms, the camera is capable of identifying and classifying various obstacles, including people, animals, and inanimate objects. In conjunction with the camera, an audio feedback mechanism provides auditory signals to the user. These signals communicate the presence and location of detected obstacles, offering a critical layer of situational awareness.
In an embodiment, a vibration alert system is integrated with ultrasonic sensors. This system notifies the user of low-height obstacles that may not be detected by the camera or may require immediate attention to avoid tripping or collisions. This feature is particularly important for detecting hazards that are not within the line of sight or are too low to be captured by the camera's field of view.
In an embodiment, the device includes an audio guidance system equipped with customizable volume and language settings. This system offers directional cues and navigational assistance based on the location and type of detected obstacles, as well as the user’s orientation. Such a system allows for a more personalized and accessible navigation experience, catering to the diverse needs and preferences of users.
In an embodiment, a connectivity module enables linkage with a GPS-enabled mobile application. This feature allows for real-time location tracking and provides users with access to various functionalities, including device settings, software updates, and user manuals. The integration of the connectivity module ensures that users can maintain up-to-date software and access important information about their device, enhancing the overall user experience.
In an embodiment, the camera employs advanced machine learning techniques to improve the accuracy of obstacle identification and classification. By leveraging the latest advancements in artificial intelligence, the camera is able to offer more reliable and precise detection of potential hazards, thereby increasing the safety and effectiveness of the navigational aid.
In an embodiment, the device is further refined by the inclusion of an ergonomic handle. This design feature aims to provide comfort and ease of use for extended periods, addressing the physical demands of prolonged device operation and ensuring that users can navigate their environment with minimal discomfort.
In an embodiment, the audio feedback mechanism incorporates a speaker system capable of adjusting its output based on ambient noise levels. This adaptability ensures that auditory signals remain clear and discernible even in noisy environments, facilitating effective communication of important navigational information.
In an embodiment, the vibration alert system is enhanced to vary the intensity of vibrations based on the proximity of detected low-height obstacles. This feature allows for a more intuitive understanding of immediate surroundings, offering a nuanced approach to alerting users about potential hazards.
In an embodiment, the audio guidance system is expanded to provide navigational cues through bone conduction headphones. This alternative method of auditory feedback is particularly beneficial for users with varying degrees of hearing impairment, offering a more inclusive solution that accommodates diverse user needs.
In an embodiment, the connectivity module utilizes low energy Bluetooth technology. This selection is aimed at maximizing the device’s battery life while ensuring a stable connection with the mobile application, addressing the practical considerations of device operation and maintenance.
In an embodiment, the integration of a solar panel into the device's design offers an auxiliary charging option. This feature enhances the device's usability and energy efficiency, providing a sustainable solution to battery life management and contributing to the overall reliability of the navigational aid.
In an embodiment, the mobile application includes a feature allowing the user to send an SOS signal with their location to pre-selected contacts in case of emergency. This functionality provides an additional layer of safety, ensuring that users can quickly and easily reach out for help if necessary.
A method for assisting navigation of visually impaired individuals using the navigational aid device has been established. This method encompasses capturing real-time imagery of the surroundings with the camera and analyzing this imagery to detect obstacles using object detection algorithms. Audio feedback is provided through the audio feedback mechanism to indicate the presence and location of detected obstacles. Users are alerted to low-height obstacles through vibrations generated by the vibration alert system, and are guided with directional cues from the audio guidance system based on the detected obstacles and user’s orientation. A Bluetooth connection is maintained with a GPS-enabled mobile application through the connectivity module for real-time location tracking and to enable access to device settings, software updates, and user manuals, ensuring a comprehensive navigational assistance framework for visually impaired individuals.
Brief Description of the Drawings
The features and advantages of the present disclosure would be more clearly understood from the following description taken in conjunction with the accompanying drawings in which:
FIG. 1 illustrates a navigational aid device for visually impaired individuals, in accordance with the embodiments of the present disclosure.
FIG. 2 illustrates a method for assisting navigation of visually impaired individuals using the navigational aid device, in accordance with the embodiments of the present disclosure.
FIG. 3 illustrates a flow diagram for assisting navigation of visually impaired individuals using the navigational aid device, in accordance with the embodiments of the present disclosure.
Detailed Description
In the following detailed description of the invention, reference is made to the accompanying drawings that form a part hereof, and in which is shown, by way of illustration, specific embodiments in which the invention may be practiced. In the drawings, like numerals describe substantially similar components throughout the several views. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural, logical, and electrical changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims and equivalents thereof.
The use of the terms “a” and “an” and “the” and “at least one” and similar referents in the context of describing the invention (especially in the context of the following claims) is to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The use of the term “at least one” followed by a list of one or more items (for example, “at least one of A and B”) is to be construed to mean one item selected from the listed items (A or B) or any combination of two or more of the listed items (A and B), unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including,” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
Pursuant to the "Detailed Description" section herein, whenever an element is explicitly associated with a specific numeral for the first time, such association shall be deemed consistent and applicable throughout the entirety of the "Detailed Description" section, unless otherwise expressly stated or contradicted by the context.
FIG. 1 illustrates a navigational aid device (100) for visually impaired individuals, in accordance with the embodiments of the present disclosure. The development of the navigational aid device (100) represents a significant advancement in technology aimed at assisting visually impaired individuals. Central to this device is a camera (102), which has been meticulously designed to capture and analyze the surrounding environment in real-time. The application of object detection algorithms enables the camera (102) to swiftly identify and classify a variety of obstacles such as people, animals, and inanimate objects that might pose a risk to the user. This capability is of paramount importance as it facilitates the immediate recognition of potential hazards, allowing the user to make informed decisions about their movements. The accuracy and speed of the camera's analysis are critical, as they directly impact the effectiveness of the device in real-world scenarios. By employing advanced algorithms, the camera (102) ensures that visually impaired users receive reliable and up-to-date information about their surroundings, thereby enhancing their ability to navigate safely and independently.
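By way of non-limiting illustration, the classification stage described above may be sketched as follows. The input format (label, confidence, bounding box), the 0.5 confidence threshold, and the label-to-category mapping are illustrative assumptions and do not limit the disclosure; any real-time detector could supply the raw detections.

```python
# Illustrative sketch: map raw detector output to the obstacle classes the
# device reports (person, animal, object). The label names and confidence
# threshold are assumptions for the example only.

CATEGORY_MAP = {
    "person": "person",
    "dog": "animal", "cat": "animal", "bird": "animal",
}

def classify_detections(detections):
    """Map (label, confidence, bbox) tuples to the device's obstacle classes."""
    obstacles = []
    for label, confidence, bbox in detections:
        if confidence < 0.5:          # drop low-confidence detections
            continue
        category = CATEGORY_MAP.get(label, "object")
        obstacles.append({"category": category, "bbox": bbox})
    return obstacles

# Example detector output as (label, confidence, (x, y, w, h)):
frame_detections = [
    ("person", 0.92, (120, 40, 80, 200)),
    ("dog", 0.81, (300, 150, 60, 50)),
    ("pole", 0.34, (10, 10, 5, 100)),   # filtered out: low confidence
]
print(classify_detections(frame_detections))
```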
Incorporated within the navigational aid device is an audio feedback mechanism (104), designed to convey information about detected obstacles through auditory signals. This mechanism employs a sophisticated system to transform the visual data captured by the camera into sound cues that indicate the location and proximity of obstacles. The design of these auditory signals takes into consideration the need for them to be immediately understandable, allowing users to quickly assess and react to their environment. The audio feedback mechanism (104) serves as an essential component of the device, offering a non-visual means for users to perceive their surroundings. The implementation of this feature underscores the commitment to creating a navigational aid that is accessible, intuitive, and effective in helping visually impaired individuals navigate with confidence.
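By way of non-limiting illustration, the conversion of visual data into a location-bearing sound cue may be sketched as below. The frame width and the three-zone left/ahead/right split are illustrative assumptions for the example only.

```python
# Illustrative sketch: convert a detection's horizontal position in the
# camera frame into a spoken direction cue. FRAME_WIDTH and the three-zone
# split are assumptions for the example only.

FRAME_WIDTH = 640

def direction_cue(bbox):
    """Return 'left' / 'ahead' / 'right' from an (x, y, w, h) bounding box."""
    center_x = bbox[0] + bbox[2] / 2
    if center_x < FRAME_WIDTH / 3:
        return "left"
    if center_x < 2 * FRAME_WIDTH / 3:
        return "ahead"
    return "right"

def audio_message(category, bbox):
    """Compose the phrase the audio feedback mechanism would speak."""
    return f"{category} {direction_cue(bbox)}"

print(audio_message("person", (500, 40, 80, 200)))
```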
The device also features a vibration alert system (106), which is another layer of safety for the user. This system utilizes ultrasonic sensors to detect obstacles that are low to the ground and might not be easily identified through auditory signals or visual analysis by the camera. Once such obstacles are detected, the system (106) triggers a series of vibrations to alert the user. The intensity and pattern of these vibrations are carefully calibrated to convey specific information about the nature and proximity of the obstacle, enabling the user to make quick adjustments to their path. This tactile feedback mechanism is particularly useful in crowded or noisy environments where auditory signals might be less effective. By integrating this system, the device ensures that users are aware of all types of obstacles, including those that are traditionally more challenging to detect, thereby significantly reducing the risk of accidents and enhancing overall safety.
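By way of non-limiting illustration, the ultrasonic ranging underlying the vibration alert may be sketched as below. The 50 cm alert threshold is an illustrative assumption; in practice the echo time would come from sensor hardware.

```python
# Illustrative sketch: derive obstacle distance from an ultrasonic
# round-trip echo time and decide whether to trigger a vibration alert.
# The 0.5 m threshold is an assumption for the example only.

SPEED_OF_SOUND_M_S = 343.0
ALERT_THRESHOLD_M = 0.5

def echo_to_distance(echo_time_s):
    """Round-trip echo time to one-way distance in metres."""
    return echo_time_s * SPEED_OF_SOUND_M_S / 2

def should_vibrate(echo_time_s):
    """Trigger the vibration alert for obstacles inside the threshold."""
    return echo_to_distance(echo_time_s) < ALERT_THRESHOLD_M

print(round(echo_to_distance(0.002), 3))  # 2 ms echo -> 0.343 m
print(should_vibrate(0.002))
```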
An additional feature of the navigational aid device is the audio guidance system (108), which provides users with verbal instructions and directional cues. This system is designed with customization options for volume and language, accommodating the diverse needs and preferences of users. The guidance offered is dynamic, adjusting based on the user's orientation and the obstacles detected by the camera and ultrasonic sensors. This adaptability ensures that the advice is relevant and timely, empowering users to navigate complex environments more effectively. The audio guidance system (108) not only aids in obstacle avoidance but also supports users in maintaining their desired path, contributing to a sense of independence and confidence in mobility.
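By way of non-limiting illustration, the adjustment of guidance to the user's orientation may be sketched as a relative-bearing computation. The angle convention (degrees, clockwise from north), the 20-degree "ahead" window, and the cue wording are illustrative assumptions.

```python
# Illustrative sketch: turn an obstacle's absolute bearing and the user's
# heading into a relative guidance cue. Angle conventions and cue wording
# are assumptions for the example only.

def relative_bearing(user_heading_deg, obstacle_bearing_deg):
    """Signed angle from the user's heading to the obstacle, in (-180, 180]."""
    diff = (obstacle_bearing_deg - user_heading_deg) % 360
    return diff - 360 if diff > 180 else diff

def guidance_cue(user_heading_deg, obstacle_bearing_deg):
    """Advise the user to stop or veer away from the obstacle."""
    rel = relative_bearing(user_heading_deg, obstacle_bearing_deg)
    if abs(rel) < 20:
        return "obstacle ahead, stop"
    # Obstacle to the right -> veer left, and vice versa.
    return "veer left" if rel > 0 else "veer right"

print(guidance_cue(90, 95))
print(guidance_cue(90, 160))
```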
Finally, the connectivity module (110) enhances the device's functionality by enabling integration with a GPS-enabled mobile application. This feature allows for real-time location tracking, a crucial aspect for navigation in unfamiliar settings. The module (110) also facilitates access to device settings, allowing users to personalize their experience according to their needs. Software updates and user manuals are readily available through the mobile application, ensuring that the device remains up-to-date with the latest advancements and that users have access to comprehensive support. This connectivity not only improves the user experience but also ensures that the navigational aid device remains a reliable and valuable tool for visually impaired individuals, aiding them in their daily lives and contributing to their autonomy.
In an embodiment, the navigational aid device (100), wherein the camera (102) employs advanced machine learning techniques for the improved accuracy of obstacle identification and classification, has been developed. Advanced machine learning techniques, including deep learning and convolutional neural networks, are utilized by the camera (102) to enhance its capability to accurately identify and classify obstacles in the user's environment. These techniques allow for the analysis of complex visual data in real time, enabling the device to distinguish between various types of obstacles such as people, animals, and inanimate objects with high precision. The employment of such sophisticated algorithms ensures that the camera (102) can adapt to diverse environments and lighting conditions, significantly reducing the likelihood of false positives and negatives. This increased accuracy is crucial for the safety and confidence of visually impaired users, as it directly impacts the reliability of the navigational aid. By leveraging the latest advancements in machine learning, the camera (102) provides a foundation upon which the rest of the device's functionalities can effectively operate, thereby enhancing the overall user experience.
In an embodiment, the navigational aid device (100) further comprises an ergonomic handle designed to provide comfort and ease of use for extended periods. The design of the ergonomic handle takes into account the varied needs of users, incorporating features such as a non-slip surface, lightweight materials, and a shape that conforms to the natural grip of the hand. These design considerations ensure that the device can be held comfortably for long durations without causing fatigue or discomfort, which is particularly important for visually impaired individuals who rely on the device for navigation throughout the day. The ergonomic handle also includes accessible control buttons that allow users to operate the device's features without the need to adjust their grip or use both hands, thereby maintaining ease of use and accessibility.
In an embodiment, the navigational aid device (100), wherein the audio feedback mechanism (104) includes a speaker system capable of adjusting its output based on ambient noise levels to ensure clear communication of auditory signals, has been integrated. This adaptive speaker system utilizes advanced sound processing technologies to monitor the ambient noise levels and automatically adjust the volume and clarity of the auditory signals accordingly. By doing so, the device ensures that the audio feedback remains audible and understandable in various environments, whether in a quiet room or a noisy outdoor setting. This feature is essential for providing consistent and reliable guidance to visually impaired users, enabling them to navigate safely and effectively regardless of the surrounding noise.
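By way of non-limiting illustration, the ambient-noise-adaptive output may be sketched as an RMS-driven gain. The volume range and the linear mapping from noise level to gain are illustrative assumptions.

```python
# Illustrative sketch: scale speaker output with ambient noise measured as
# the RMS level of microphone samples in [-1.0, 1.0]. The volume range and
# linear mapping are assumptions for the example only.

import math

MIN_VOLUME, MAX_VOLUME = 0.3, 1.0

def ambient_rms(samples):
    """Root-mean-square level of a microphone sample window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def adaptive_volume(samples):
    """Louder surroundings -> louder feedback, clamped to the volume range."""
    rms = ambient_rms(samples)
    return min(MAX_VOLUME, MIN_VOLUME + rms * (MAX_VOLUME - MIN_VOLUME))

quiet = [0.01, -0.02, 0.015, -0.01]
noisy = [0.8, -0.7, 0.9, -0.85]
print(adaptive_volume(quiet) < adaptive_volume(noisy))
```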
In an embodiment, the navigational aid device (100), wherein the vibration alert system (106) is capable of varying the intensity of vibrations based on the proximity of the detected low-height obstacles, is described. This functionality allows the device to convey detailed information about the proximity of obstacles through tactile feedback. The closer an obstacle, the more intense the vibration, providing an intuitive method for users to understand their immediate surroundings. This feature is particularly useful for detecting obstacles that are not within the direct line of sight or are below the detection range of the camera (102) and audio feedback mechanism (104), such as curbs and low-lying objects. By offering a graded response based on the distance to obstacles, the vibration alert system (106) enhances the navigational capabilities of the device, contributing to safer and more confident mobility for visually impaired users.
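By way of non-limiting illustration, the graded response described above may be sketched as a linear ramp from the edge of the detection range to contact. The 1.5 m range and the linear profile are illustrative assumptions; other monotone profiles would serve equally.

```python
# Illustrative sketch: grade vibration intensity by obstacle proximity.
# The 1.5 m detection range and linear ramp are assumptions for the
# example only.

MAX_RANGE_M = 1.5

def vibration_intensity(distance_m):
    """0.0 (out of range) up to 1.0 (touching), linear in between."""
    if distance_m >= MAX_RANGE_M:
        return 0.0
    if distance_m <= 0:
        return 1.0
    return 1.0 - distance_m / MAX_RANGE_M

print(vibration_intensity(0.75))  # halfway into range -> 0.5
```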
In an embodiment, the navigational aid device (100), wherein the audio guidance system (108) is further configured to provide navigational cues through bone conduction headphones, offering an alternative to traditional auditory feedback for users with varying degrees of hearing impairment, has been innovated. Bone conduction technology allows for the transmission of sound through the vibrations of the user's cheekbones, directly to the inner ear, bypassing the outer and middle ear. This method of sound transmission can be particularly beneficial for users with hearing impairments that affect the conventional pathways of sound. By integrating bone conduction headphones with the audio guidance system (108), the device ensures that navigational cues and obstacle warnings are accessible to a broader audience, including those who may not benefit from traditional headphones or speaker systems.
In an embodiment, the navigational aid device (100), wherein the connectivity module (110) utilizes low energy Bluetooth technology to maximize the device’s battery life while maintaining a stable connection with the mobile application, is developed. The use of low energy Bluetooth technology is instrumental in ensuring that the device remains operational for extended periods without the need for frequent recharging. This technology provides a reliable and energy-efficient method for the device to communicate with the associated mobile application, facilitating access to real-time location tracking, device settings, and software updates. The extended battery life and stable connectivity are crucial for users relying on the device for day-to-day navigation, offering them peace of mind and uninterrupted support.
In an embodiment, the navigational aid device (100) further comprises a solar panel integrated into the device's design for auxiliary charging, enhancing its usability and energy efficiency. The inclusion of a solar panel allows the device to harness solar energy, providing an additional charging option that can extend the battery life and reduce dependency on traditional charging methods. This feature is particularly advantageous for users who spend extended periods outdoors or in areas where access to power outlets may be limited. The solar panel is designed to be lightweight and unobtrusive, ensuring that it does not compromise the device's portability or functionality while offering a sustainable and convenient charging solution.
In an embodiment, the navigational aid device (100), wherein the mobile application includes a feature for the user to send an SOS signal with their location to pre-selected contacts in case of emergency, is featured. This safety feature allows users to quickly and easily alert trusted individuals if they find themselves in a situation requiring assistance. The SOS signal, accompanied by the user's real-time location, is sent through the mobile application, ensuring that help can be dispatched accurately and efficiently. This feature adds an important layer of security for visually impaired users, giving them and their loved ones peace of mind knowing that they have a direct line of assistance should an emergency arise.
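By way of non-limiting illustration, the SOS message the mobile application could assemble may be sketched as below. The JSON layout and field names are illustrative assumptions and are not taken from the disclosure.

```python
# Illustrative sketch: build the SOS payload carrying the user's real-time
# location to pre-selected contacts. Field names are assumptions for the
# example only.

import json
import time

def build_sos_payload(lat, lon, contacts):
    """Serialize an SOS message with location, timestamp, and recipients."""
    return json.dumps({
        "type": "SOS",
        "location": {"lat": lat, "lon": lon},
        "timestamp": int(time.time()),
        "contacts": contacts,
    })

payload = build_sos_payload(51.5074, -0.1278, ["+441234567890"])
print(json.loads(payload)["type"])
```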
FIG. 2 illustrates a method (200) for assisting navigation of visually impaired individuals using the navigational aid device (100), in accordance with the embodiments of the present disclosure. At step (202), the camera (102) is engaged to capture real-time imagery of the surroundings, with object detection algorithms analyzing this imagery to identify and classify obstacles. This step ensures immediate recognition and processing of environmental data to detect potential hazards. At step (204), upon identification of obstacles, auditory signals are generated by the audio feedback mechanism (104). These signals are designed to communicate the presence and precise location of obstacles, providing clear and immediate guidance to the user about their surroundings. At step (206), for low-height obstacles not easily detected by visual or auditory feedback, the vibration alert system (106) generates tactile vibrations. The intensity of these vibrations corresponds to the proximity of the obstacle, offering an intuitive means for users to be aware of and navigate around such hazards. At step (208), based on the detected obstacles and the user's orientation, the audio guidance system (108) provides directional cues. These cues assist in guiding the user around obstacles, offering safe and effective navigation through varying environments. At step (210), the connectivity module (110) maintains a stable Bluetooth connection with a GPS-enabled mobile application. This connection facilitates real-time location tracking and provides access to device settings, software updates, and user manuals, enhancing the usability and functionality of the navigational aid device.
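By way of non-limiting illustration, a single iteration of the method of FIG. 2 may be sketched as one function. Each action tuple stands in for the corresponding subsystem; the 0.5 m threshold and the action labels are illustrative assumptions.

```python
# Illustrative sketch of one pass through steps 202-210 of the method.
# Inputs stand in for the camera's detections (step 202) and the
# ultrasonic reading; labels and threshold are assumptions.

def navigation_step(frame_detections, ultrasonic_distance_m):
    """Return the actions the device takes for one sensing cycle."""
    actions = []
    # Step 204: auditory signal per detected obstacle
    for obstacle in frame_detections:
        actions.append(("audio", f"{obstacle} detected"))
    # Step 206: tactile alert for close low-height obstacles
    if ultrasonic_distance_m is not None and ultrasonic_distance_m < 0.5:
        actions.append(("vibrate", "low obstacle"))
    # Step 208: directional cue whenever something was detected
    if frame_detections:
        actions.append(("guidance", "adjust course"))
    # Step 210: sync position over the Bluetooth link
    actions.append(("sync", "location update"))
    return actions

print(navigation_step(["person"], 0.3))
```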
FIG. 3 illustrates a flow diagram for assisting navigation of visually impaired individuals using the navigational aid device (100), in accordance with the embodiments of the present disclosure. The process commences with the camera (102) capturing video footage in real-time, which is then analyzed by object detection algorithms to identify any obstacles. If low-height obstacles are detected by the ultrasonic sensors, the device triggers vibration alerts to warn the user. Concurrently, if any obstacles are identified, the audio feedback mechanism (104) provides auditory instructions to inform the user about the obstacle's location. If no obstacles are found, the user is guided to proceed. Furthermore, the diagram indicates a decision point regarding the connectivity of the user's cane with a mobile application. If the cane is connected, the device continues to send essential navigational cues and real-time location updates to assist the user. However, if the cane is not connected, the mobile application is prompted to initiate a sequence to make a call to the user, ensuring they receive necessary guidance and updates for continued safe navigation. This process encapsulates a comprehensive approach to aid visually impaired individuals in orienting themselves and moving confidently within their environment.
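By way of non-limiting illustration, the two decision points of the flow diagram of FIG. 3 may be sketched as a routing function. The returned branch labels are illustrative only.

```python
# Illustrative sketch of the FIG. 3 decision flow: first the obstacle
# branch, then the cane-connectivity branch. Branch labels are
# assumptions for the example only.

def flow_decision(obstacle_found, cane_connected):
    """Return the branch taken at each of the two decision points."""
    steps = []
    steps.append("warn user" if obstacle_found else "proceed")
    if cane_connected:
        steps.append("send cues and location to app")
    else:
        steps.append("app calls user")   # fallback when cane is disconnected
    return steps

print(flow_decision(True, False))
```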
Example embodiments herein have been described above with reference to block diagrams and flowchart illustrations of methods and apparatuses. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by various means including hardware, software, firmware, and a combination thereof. For example, in one embodiment, each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations can be implemented by computer program instructions. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
Throughout the present disclosure, the term ‘processing means’ or ‘microprocessor’ or ‘processor’ or ‘processors’ includes, but is not limited to, a general purpose processor (such as, for example, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a microprocessor implementing other types of instruction sets, or a microprocessor implementing a combination of types of instruction sets) or a specialized processor (such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), or a network processor).
The term “non-transitory storage device” or “storage” or “memory,” as used herein, relates to random access memory, read only memory, and variants thereof, in which a computer can store data or software for any duration.
Operations in accordance with a variety of aspects of the disclosure, as described above, need not be performed in the precise order described. Rather, various steps can be handled in reverse order, simultaneously, or not at all.
While several implementations have been described and illustrated herein, a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein may be utilized, and each of such variations and/or modifications is deemed to be within the scope of the implementations described herein. More generally, all parameters, dimensions, materials, and configurations described herein are meant to be exemplary, and the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific implementations described herein. It is, therefore, to be understood that the foregoing implementations are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, implementations may be practiced otherwise than as specifically described and claimed. Implementations of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
Claims
I/We claim:
1. A navigational aid device (100) for visually impaired individuals comprising:
a camera (102) configured for real-time capture and analysis of surrounding imagery using object detection algorithms to identify and classify obstacles, including, but not limited to, people, animals, and objects;
an audio feedback mechanism (104) to provide auditory signals indicating the presence and location of detected obstacles relative to the user’s path;
a vibration alert system (106) integrated with ultrasonic sensors for notifying the user of low-height obstacles;
an audio guidance system (108) with customizable volume and language settings, offering directional cues based on detected obstacles and user orientation; and
a connectivity module (110) for linking with a GPS-enabled mobile application, allowing real-time location tracking and access to device settings, software updates, and user manuals.
2. The navigational aid device (100) of claim 1, wherein the camera (102) employs advanced machine learning techniques to improve the accuracy of obstacle identification and classification.
3. The navigational aid device (100) of claim 1, further comprising an ergonomic handle designed to provide comfort and ease of use for extended periods.
4. The navigational aid device (100) of claim 1, wherein the audio feedback mechanism (104) includes a speaker system capable of adjusting its output based on ambient noise levels to ensure clear communication of auditory signals.
5. The navigational aid device (100) of claim 1, wherein the vibration alert system (106) is capable of varying the intensity of vibrations based on the proximity of the detected low-height obstacles.
6. The navigational aid device (100) of claim 1, wherein the audio guidance system (108) is further configured to provide navigational cues through bone conduction headphones, offering an alternative to traditional auditory feedback for users with varying degrees of hearing impairment.
7. The navigational aid device (100) of claim 1, wherein the connectivity module (110) utilizes Bluetooth Low Energy technology to maximize the device’s battery life while maintaining a stable connection with the mobile application.
8. The navigational aid device (100) of claim 1, further comprising a solar panel integrated into the device's design for auxiliary charging, enhancing its usability and energy efficiency.
9. The navigational aid device (100) of claim 1, wherein the mobile application includes a feature for the user to send an SOS signal with their location to pre-selected contacts in case of emergency.
10. A method for assisting navigation of visually impaired individuals using the navigational aid device (100), the method comprising:
capturing real-time imagery of the surroundings with the camera (102) and analyzing the imagery to detect obstacles using object detection algorithms;
providing audio feedback through the audio feedback mechanism (104) to indicate the presence and location of detected obstacles;
alerting the user to low-height obstacles through vibrations generated by the vibration alert system (106);
guiding the user with directional cues from the audio guidance system (108) based on the detected obstacles and the user’s orientation; and
maintaining a Bluetooth connection with a GPS-enabled mobile application through the connectivity module (110) for real-time location tracking and enabling access to device settings, software updates, and user manuals.
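By way of non-limiting illustration, the auditory signal of the audio feedback mechanism (104), indicating the presence and location of a detected obstacle relative to the user’s path, may be sketched as follows. The class and function names and the bearing/distance representation are illustrative assumptions, not part of the claimed device.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Illustrative output of the object detection step (hypothetical)."""
    label: str          # e.g. "person", "animal", "object"
    bearing_deg: float  # angle from the user's path; negative = left
    distance_m: float   # estimated range to the obstacle

def describe(d: Detection) -> str:
    # Presence + location relative to the user's path, as an auditory message
    side = "left" if d.bearing_deg < 0 else "right"
    return f"{d.label} {d.distance_m:.0f} metres ahead, to your {side}"

# A person detected 3 m ahead, slightly left of the user's path:
print(describe(Detection("person", -12.0, 3.0)))
# → person 3 metres ahead, to your left
```

In practice the bearing and distance would come from the camera’s detection pipeline; only the message-forming step is sketched here.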
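The ambient-noise-adaptive speaker output of the audio feedback mechanism (104) may likewise be sketched as a simple level rule; the decibel figures and the function name below are assumed for illustration only.

```python
def output_level_db(ambient_db: float, base_db: float = 60.0,
                    headroom_db: float = 10.0, max_db: float = 85.0) -> float:
    """Keep the auditory signal a fixed margin above ambient noise,
    never below a base level and capped for hearing safety."""
    return min(max(base_db, ambient_db + headroom_db), max_db)

# Quiet room (40 dB ambient) -> base level 60 dB;
# busy street (78 dB ambient) -> capped at 85 dB.
```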
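Similarly, the proximity-dependent vibration of the vibration alert system (106) admits a simple mapping from ultrasonic range to intensity; the 2 m alert range and the linear form are assumptions for illustration.

```python
def vibration_intensity(distance_m: float, max_range_m: float = 2.0) -> float:
    """Map ultrasonic range to a vibration intensity in [0, 1]:
    closer low-height obstacles yield stronger vibration."""
    if distance_m >= max_range_m:
        return 0.0   # beyond alert range: no vibration
    if distance_m <= 0.0:
        return 1.0   # at the sensor: full intensity
    return 1.0 - distance_m / max_range_m

# An obstacle at 0.5 m within a 2 m range -> intensity 0.75
```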
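The five steps of the method claim can be read as one sensing-and-feedback cycle. The sketch below assumes hypothetical component interfaces (capture, detect, announce, and so on); it illustrates the ordering of the claimed steps, not a definitive implementation.

```python
def navigation_cycle(camera, detector, speaker, ultrasonic, vibrator,
                     guidance, link, low_obstacle_range_m=0.5):
    """One iteration of the claimed method, in claim order."""
    frame = camera.capture()                      # capture real-time imagery
    obstacles = detector.detect(frame)            # object detection algorithms
    for obstacle in obstacles:
        speaker.announce(obstacle)                # audio feedback (104)
    if ultrasonic.range_m() < low_obstacle_range_m:
        vibrator.pulse(intensity=1.0)             # vibration alert (106)
    guidance.cue(obstacles, link.orientation())   # directional cues (108)
    link.sync_location()                          # GPS app via Bluetooth (110)
```

In a deployed device this cycle would run continuously, with each component backed by the corresponding hardware of claims 1–10.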
NAVIGATIONAL AID DEVICE FOR VISUALLY IMPAIRED INDIVIDUALS
Fig. 1
Drawings
FIG. 1
FIG. 2
FIG. 3
| # | Name | Date |
|---|---|---|
| 1 | 202421033107-OTHERS [26-04-2024(online)].pdf | 2024-04-26 |
| 2 | 202421033107-FORM FOR SMALL ENTITY(FORM-28) [26-04-2024(online)].pdf | 2024-04-26 |
| 3 | 202421033107-FORM 1 [26-04-2024(online)].pdf | 2024-04-26 |
| 4 | 202421033107-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [26-04-2024(online)].pdf | 2024-04-26 |
| 5 | 202421033107-EDUCATIONAL INSTITUTION(S) [26-04-2024(online)].pdf | 2024-04-26 |
| 6 | 202421033107-DRAWINGS [26-04-2024(online)].pdf | 2024-04-26 |
| 7 | 202421033107-DECLARATION OF INVENTORSHIP (FORM 5) [26-04-2024(online)].pdf | 2024-04-26 |
| 8 | 202421033107-COMPLETE SPECIFICATION [26-04-2024(online)].pdf | 2024-04-26 |
| 9 | 202421033107-FORM-9 [07-05-2024(online)].pdf | 2024-05-07 |
| 10 | 202421033107-FORM 18 [08-05-2024(online)].pdf | 2024-05-08 |
| 11 | 202421033107-FORM-26 [13-05-2024(online)].pdf | 2024-05-13 |
| 12 | 202421033107-FORM 3 [13-06-2024(online)].pdf | 2024-06-13 |
| 13 | 202421033107-RELEVANT DOCUMENTS [17-04-2025(online)].pdf | 2025-04-17 |
| 14 | 202421033107-POA [17-04-2025(online)].pdf | 2025-04-17 |
| 15 | 202421033107-FORM 13 [17-04-2025(online)].pdf | 2025-04-17 |
| 16 | 202421033107-FER.pdf | 2025-09-11 |
| 17 | 202421033107_SearchStrategyNew_E_SearchstrategyE_11-09-2025.pdf | 2025-09-11 |