Abstract: SMART SHOE WITH ROUTE SUGGESTIONS FOR VISUALLY IMPAIRED USERS ABSTRACT A smart shoe (100) with route suggestions for visually impaired users is disclosed. The shoe (100) comprises a first data collection unit (104a) to measure a first distance between a user and upcoming obstacles on a right side of the user, and a second data collection unit (104b) to measure a second distance between the user and the upcoming obstacles on a left side of the user. A first feedback unit (106a) and a second feedback unit (106b) generate a pulsated haptic vibration for navigating the user on a route. A microcontroller (108) is configured to receive the first measured distance and the second measured distance; analyze a position of the upcoming obstacles; map an intermediary route; and navigate the user on the mapped route. The shoe (100) analyzes the environment and chooses the safest direction to move, enhancing decision-making and reducing dependence on external assistance or internet connectivity. Claims: 10, Figures: 3
Description:
BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a shoe and particularly to a smart shoe with route suggestions for visually impaired users.
Description of Related Art
[002] Mobility presents a daily challenge for individuals with visual impairments, particularly in unfamiliar or complex environments. Traditional mobility aids such as white canes and guide dogs serve essential roles in obstacle detection and spatial awareness. However, these tools often lack the capability to detect hazards at a distance or suggest alternative, safer routes. In crowded or dynamically changing spaces, these limitations result in decreased confidence and an increased risk of accidents for users.
[003] Recent advancements in assistive technologies have led to the development of wearable devices that incorporate sensors, audio feedback, and artificial intelligence. Smart canes, for instance, integrate ultrasonic sensors and offer voice prompts to alert users about nearby objects. Wristbands like the Sunu Band utilize echolocation to signal obstacle presence through vibrations. Other solutions, such as smart glasses and mobile phone applications, offer object recognition, environmental descriptions, and even human-assisted navigation through internet-based platforms. These technologies demonstrate clear improvements over traditional tools but often require the user to hold, wear, or interact with external devices continuously.
[004] Despite these innovations, many assistive solutions still present practical drawbacks. Devices like smart canes and mobile apps rely heavily on user input or sustained connectivity, which are not reliable in all situations. Wearable glasses are unable to function well in low-light conditions and often carry high costs, limiting accessibility. Smart shoes have also emerged, primarily in fitness and safety contexts, but existing versions fail to provide comprehensive navigation features tailored to visually impaired individuals.
[005] There is thus a need for an improved and advanced smart shoe with route suggestions for visually impaired users that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a smart shoe with route suggestions for visually impaired users. The shoe comprising a first data collection unit, installed at a front of a right shoe, adapted to measure a first distance between a user and upcoming obstacles from a right side of the user. The shoe further comprising a second data collection unit, installed at the front of a left shoe, adapted to measure a second distance between the user and the upcoming obstacles from a left side of the user. The shoe further comprising a first feedback unit, installed over an insole of the right shoe, adapted to generate a pulsated haptic vibration for navigating the user on a route. The shoe further comprising a second feedback unit, installed over the insole of the left shoe, adapted to generate the pulsated haptic vibration for navigating the user on the route. The shoe further comprising a microcontroller communicatively connected to the first data collection unit, the second data collection unit, the first feedback unit, and to the second feedback unit. The microcontroller is configured to receive the first measured distance from the first data collection unit; receive the second measured distance from the second data collection unit; analyze a position of the upcoming obstacles using an Artificial Intelligence (AI) algorithm; map an intermediary route for circumventing the upcoming obstacles; and actuate the first feedback unit and the second feedback unit consecutively for navigating the user on the mapped route.
[007] Embodiments in accordance with the present invention further provide a method for route suggestions to visually impaired users. The method comprising steps of receiving a first measured distance from a first data collection unit; receiving a second measured distance from a second data collection unit; analyzing a position of upcoming obstacles using an Artificial Intelligence (AI) algorithm; mapping an intermediary route for circumventing the upcoming obstacles; and actuating a first feedback unit and a second feedback unit consecutively for navigating the user on the mapped route.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a smart shoe with route suggestions for visually impaired users.
[009] Next, embodiments of the present application may provide a smart shoe that operates without the need for any handheld devices, allowing the user to walk freely and comfortably.
[0010] Next, embodiments of the present application may provide a smart shoe that detects obstacles in real time and immediately provides directional guidance through vibration signals, enabling prompt and safe route adjustments.
[0011] Next, embodiments of the present application may provide a smart shoe that analyzes the environment and chooses the safest direction to move, enhancing decision-making and reducing dependence on external assistance or internet connectivity.
[0012] Next, embodiments of the present application may provide a smart shoe that works effectively in dark or noisy environments, as it relies solely on ultrasonic detection and silent vibration cues.
[0013] Next, embodiments of the present application may provide a smart shoe that is embedded within regular-looking footwear, eliminating the social stigma or discomfort often associated with assistive technologies while ensuring daily usability and convenience.
[0014] These and other advantages will be apparent from the present application of the embodiments described herein.
[0015] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0017] FIG. 1 illustrates a schematic block diagram of a smart shoe with route suggestions for visually impaired users, according to an embodiment of the present invention;
[0018] FIG. 2 illustrates a block diagram of a microcontroller, according to an embodiment of the present invention; and
[0019] FIG. 3 depicts a flowchart of a method for route suggestions to visually impaired users, according to an embodiment of the present invention.
[0020] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0021] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0022] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0023] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0024] FIG. 1 illustrates a schematic block diagram of a smart shoe 100 (hereinafter referred to as the shoe 100) with route suggestions for visually impaired users, according to an embodiment of the present invention. In an embodiment of the present invention, the shoe 100 may be adapted to sense upcoming obstacles around a user and may return vibrating feedback for intimating the user about the upcoming obstacles. The vibrating feedback may be delivered in a staggered manner that may further enable the user to navigate around the upcoming obstacles. Further, the shoe 100 may be autonomously activated upon wearing by the user. The autonomous activation may eliminate any step of switching on, powering on, or configuring the shoe 100, making the shoe 100 suitable for visually impaired users. Thus, visually impaired users may directly wear the shoe 100 and the shoe 100 may be autonomously activated.
[0025] According to the embodiments of the present invention, the shoe 100 may incorporate non-limiting hardware components to enhance the processing speed and efficiency such as the shoe 100 may comprise a first data collection unit 104a, a second data collection unit 104b, a first feedback unit 106a, a second feedback unit 106b, a microcontroller 108, a rechargeable battery 110, and a Universal Serial Bus (USB) charger 112. In an embodiment of the present invention, the hardware components of the shoe 100 may be integrated with computer-executable instructions for overcoming the challenges and the limitations of the existing shoes. Further, the components of the shoe 100 may be rigidly and securely encompassed in the shoe 100, enabling the shoe 100 to withstand daily wear.
[0026] In an embodiment of the present invention, the first data collection unit 104a may be installed at a front of a right shoe 102a. The first data collection unit 104a may be adapted to measure a first distance between the user and the upcoming obstacles on a right side of the user. In an embodiment of the present invention, the second data collection unit 104b may be installed at the front of a left shoe 102b. The second data collection unit 104b may be adapted to measure a second distance between the user and the upcoming obstacles on a left side of the user. The first data collection unit 104a and the second data collection unit 104b may comprise an array of ultrasonic sensors.
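By way of non-limiting illustration, the distance measurement performed by an ultrasonic data collection unit may be sketched as follows. The sketch assumes an HC-SR04-style time-of-flight sensor and a speed of sound of approximately 343 m/s in air at room temperature; the function names are hypothetical and do not appear in the specification.

```python
# Illustrative sketch (not part of the claimed embodiment): converting the
# round-trip echo time of an ultrasonic pulse into an obstacle distance,
# as an HC-SR04-style sensor array in units 104a/104b might report it.

SPEED_OF_SOUND_M_S = 343.0  # assumed speed of sound in air (~20 degrees C)

def echo_time_to_distance_cm(echo_time_s: float) -> float:
    """Distance to an obstacle, in centimeters, from a round-trip echo time."""
    # The pulse travels to the obstacle and back, hence the division by two.
    return (echo_time_s * SPEED_OF_SOUND_M_S / 2.0) * 100.0

def nearest_obstacle_cm(echo_times_s: list[float]) -> float:
    """Closest obstacle reported by an array of ultrasonic sensors."""
    return min(echo_time_to_distance_cm(t) for t in echo_times_s)
```

In such a sketch, each shoe's sensor array would report the minimum of its readings as the "first" or "second" measured distance forwarded to the microcontroller 108.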
[0027] In an embodiment of the present invention, the first feedback unit 106a may be installed over an insole of the right shoe 102a. The first feedback unit 106a may be adapted to generate a pulsated haptic vibration for navigating the user on a route. The actuation of the first feedback unit 106a indicates the obstacles on a right side and prompts the user to move in a left direction. In an embodiment of the present invention, the second feedback unit 106b may be installed over the insole of the left shoe 102b. The second feedback unit 106b may be adapted to generate the pulsated haptic vibration for navigating the user on the route. The actuation of the second feedback unit 106b indicates the obstacles on a left side and prompts the user to move in a right direction. The simultaneous actuation of the first feedback unit 106a and the second feedback unit 106b prompts the user to stop or find an alternative route. The first feedback unit 106a and the second feedback unit 106b comprise a set of vibrational motors.
[0028] In an embodiment of the present invention, the microcontroller 108 may be communicatively connected to the first data collection unit 104a, the second data collection unit 104b, the first feedback unit 106a, and to the second feedback unit 106b. The microcontroller 108 may further be configured to execute computer-executable instructions to generate an output relating to the shoe 100. The microcontroller 108 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the microcontroller 108 including known, related art, and/or later developed technologies. In an embodiment of the present invention, the microcontroller 108 may further be explained in conjunction with FIG. 2.
[0029] In an embodiment of the present invention, the rechargeable battery 110 may be adapted to supply an operational power to the microcontroller 108. In an embodiment of the present invention, the rechargeable battery 110 may be separately installed in the right shoe 102a and the left shoe 102b. Alternatively, the rechargeable battery 110 may be collectively installed for the right shoe 102a and the left shoe 102b together and may be connected using wires. The rechargeable battery 110 for power supply may be of any composition such as, but not limited to, a Nickel-Cadmium battery, a Nickel-Metal Hydride battery, a Zinc-Carbon battery, a Lithium-Ion battery, and so forth. Embodiments of the present invention are intended to include or otherwise cover any composition of the rechargeable battery 110, including known, related art, and/or later developed technologies. The rechargeable battery 110 may be charged using the Universal Serial Bus (USB) charger 112.
[0030] FIG. 2 illustrates a block diagram of the microcontroller 108, according to an embodiment of the present invention. The microcontroller 108 may comprise the computer-executable instructions in form of programming modules such as a data receiving module 200, a data analyzing module 202, a data mapping module 204, and an actuation module 206.
[0031] In an embodiment of the present invention, the data receiving module 200 may be configured to receive the first measured distance from the first data collection unit 104a. The data receiving module 200 may be configured to receive the second measured distance from the second data collection unit 104b. The data receiving module 200 may be configured to transmit the first measured distance and the second measured distance to the data analyzing module 202.
[0032] The data analyzing module 202 may be activated upon receipt of the first measured distance and the second measured distance from the data receiving module 200. In an embodiment of the present invention, the data analyzing module 202 may be configured to analyze a position of the upcoming obstacles using an Artificial Intelligence (AI) algorithm. Further, the analysis of the position of the upcoming obstacles may be conducted in real-time, enabling the shoe 100 to detect spontaneously occurring obstacles such as, but not limited to, kids, cars, and so forth. The Artificial Intelligence (AI) algorithm may be, but not limited to, an A* search algorithm, a Depth First Search (DFS), and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the Artificial Intelligence (AI) algorithm, including known, related art, and/or later developed technologies. The data analyzing module 202 may be configured to transmit the analyzed position of the upcoming obstacles to the data mapping module 204.
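By way of non-limiting illustration, the A* search algorithm mentioned above may be sketched as follows on a small occupancy grid. The grid representation, the 4-connected neighborhood, and the Manhattan heuristic are illustrative assumptions; the specification does not fix how the measured distances are converted into a map.

```python
# Illustrative A* sketch: find a route through a grid where 1 marks an
# obstacle cell and 0 marks a free cell. A deployed data analyzing module
# would derive such a grid from the two measured distances (assumption).
import heapq

def a_star(grid, start, goal):
    """Return a list of grid cells from start to goal, or None if blocked."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible heuristic on a 4-connected grid.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]  # (f-score, cost, cell, path)
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (cost + 1 + h((nr, nc)), cost + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None  # no intermediary route circumvents the obstacles
```

The returned cell sequence corresponds to the "intermediary route" that the data mapping module 204 would translate into left/right haptic cues.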
[0033] The data mapping module 204 may be configured to receive the analyzed position of the upcoming obstacles from the data analyzing module 202. The data mapping module 204 may be configured to map an intermediary route for circumventing the upcoming obstacles. The data mapping module 204 may be configured to remap the intermediary route in real-time in occurrence of the spontaneously occurring obstacles. The data mapping module 204 may be configured to transmit the intermediary route to the actuation module 206.
[0034] The actuation module 206 may be activated upon receipt of the intermediary route from the data mapping module 204. The actuation module 206 may be configured to actuate the first feedback unit 106a and the second feedback unit 106b consecutively for navigating the user on the mapped route.
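By way of non-limiting illustration, the consecutive actuation performed by the actuation module 206 may be sketched as follows. The `motor_driver` callable, the cue vocabulary, and the pulse timings are hypothetical stand-ins for GPIO-level motor control, which the specification does not detail.

```python
# Illustrative sketch of the actuation module: drive the two feedback
# units one after the other along the mapped route. motor_driver is a
# hypothetical callable (unit_name, on) standing in for GPIO control.
import time

def actuate_consecutively(route_cues, pulse_s=0.2, gap_s=0.3, motor_driver=None):
    """Emit pulsated haptic vibrations for a list of route cues.

    Cues follow paragraph [0027]: 'left' pulses the first feedback unit
    (obstacle on the right), 'right' pulses the second feedback unit
    (obstacle on the left), 'stop' pulses both units in sequence.
    """
    def pulse(unit):
        if motor_driver is not None:
            motor_driver(unit, True)   # vibration on
            time.sleep(pulse_s)
            motor_driver(unit, False)  # vibration off
            time.sleep(gap_s)

    for cue in route_cues:
        if cue == "left":
            pulse("first_feedback_unit")     # unit 106a, right shoe
        elif cue == "right":
            pulse("second_feedback_unit")    # unit 106b, left shoe
        elif cue == "stop":
            pulse("first_feedback_unit")     # both units: halt or reroute
            pulse("second_feedback_unit")
```

Passing a recording callable as `motor_driver` makes the on/off sequence testable without hardware, which is one reason to keep the driver injectable rather than hard-coded.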
[0035] In an exemplary embodiment of the present invention, a blind person may be wearing the smart shoe 100. The blind person may be required to cross a road and reach a pharmacy located on the opposite side of a busy street, such that both static and dynamic obstacles may be present along the route. As the blind person approaches the roadside, the first data collection unit 104a and the second data collection unit 104b may be configured to continuously scan the surrounding environment for the obstacles. The data receiving module 200 may be configured to receive the real-time measured distance data from the first data collection unit 104a and the second data collection unit 104b. The obstacles may include static objects such as the curb, parked vehicles, and poles, as well as dynamic objects such as pedestrians and moving vehicles.
[0036] Upon receiving the data, the data receiving module 200 may be configured to transmit the received data to the data analyzing module 202. The data analyzing module 202 may be configured to process the received data using the Artificial Intelligence (AI) algorithm. The data analyzing module 202 may be configured to identify and analyze both stationary and dynamically moving obstacles, estimating their location and motion trajectory in real-time. Following the analysis, the data analyzing module 202 may be configured to transmit the processed obstacle data to the data mapping module 204. The data mapping module 204 may be configured to generate an intermediary route that enables the user to safely navigate around the identified obstacles. In the event of new, spontaneously occurring obstacles such as a child running into the path or a vehicle suddenly turning into the street, the data mapping module 204 may be configured to remap the route in real time and adapt accordingly. Once the updated intermediary route is calculated, the updated intermediary route may be transmitted to the actuation module 206. The actuation module 206 may be configured to sequentially activate the first feedback unit 106a and the second feedback unit 106b to provide haptic feedback signals guiding the user along the new route. For instance, if a leftward movement is required, the first feedback unit 106a (located in the right shoe 102a) may be configured to vibrate, signaling the user to turn left. If stopping is necessary, both feedback units may be configured to deliver a specific pattern of vibrations to instruct the blind person to halt. The smart shoe 100 may enable the blind person to navigate independently by receiving continuous real-time guidance that may adapt to changes in the environment. This may enhance user safety and improve mobility in complex, real-world scenarios.
[0037] FIG. 3 depicts a flowchart of a method 300 for route suggestions to visually impaired users using the shoe 100, according to an embodiment of the present invention.
[0038] At step 302, the shoe 100 may receive the first measured distance from the first data collection unit 104a and the second measured distance from the second data collection unit 104b.
[0039] At step 304, the shoe 100 may analyze the position of the upcoming obstacles using the Artificial Intelligence (AI) algorithm.
[0040] At step 306, the shoe 100 may map the intermediary route for circumventing the upcoming obstacles.
[0041] At step 308, the shoe 100 may actuate the first feedback unit 106a and the second feedback unit 106b consecutively for navigating the user on the mapped route.
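By way of non-limiting illustration, one pass of the method 300 may be sketched as a single function. The distance threshold and the cue vocabulary are assumptions introduced for the sketch; the specification leaves the analysis criteria to the AI algorithm.

```python
# Illustrative sketch of one iteration of method 300. The 100 cm
# threshold is an assumed parameter, not a value from the specification.
def route_suggestion_step(first_cm: float, second_cm: float,
                          threshold_cm: float = 100.0):
    """Steps 302-308: receive distances, analyze, map, choose actuation."""
    # Step 304: analyze the position of the upcoming obstacles.
    right_blocked = first_cm < threshold_cm
    left_blocked = second_cm < threshold_cm
    # Step 306: map an intermediary route circumventing the obstacles.
    if right_blocked and left_blocked:
        route = "stop"        # both sides blocked: halt or reroute
    elif right_blocked:
        route = "left"        # obstacle on the right: steer left
    elif left_blocked:
        route = "right"       # obstacle on the left: steer right
    else:
        route = "straight"    # path clear
    # Step 308: select which feedback unit(s) to actuate consecutively.
    units = {"left": ["106a"], "right": ["106b"],
             "stop": ["106a", "106b"], "straight": []}[route]
    return route, units
```

Run in a loop against fresh sensor readings, such a step would yield the continuous real-time guidance described in the exemplary embodiment above.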
[0042] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0043] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
CLAIMS
I/We Claim:
1. A smart shoe (100) with route suggestions for visually impaired users, the shoe (100) comprising:
a first data collection unit (104a), installed at a front of a right shoe (102a), adapted to measure a first distance between a user and upcoming obstacles from a right side of the user;
a second data collection unit (104b), installed at the front of a left shoe (102b), adapted to measure a second distance between the user and the upcoming obstacles from a left side of the user;
a first feedback unit (106a), installed over an insole of the right shoe (102a), adapted to generate a pulsated haptic vibration for navigating the user on a route;
a second feedback unit (106b), installed over the insole of the left shoe (102b), adapted to generate the pulsated haptic vibration for navigating the user on the route; and
a microcontroller (108) communicatively connected to the first data collection unit (104a), the second data collection unit (104b), the first feedback unit (106a), and to the second feedback unit (106b), characterized in that the microcontroller (108) is configured to:
receive the first measured distance from the first data collection unit (104a);
receive the second measured distance from the second data collection unit (104b);
analyze a position of the upcoming obstacles using an Artificial Intelligence (AI) algorithm;
map an intermediary route for circumventing the upcoming obstacles; and
actuate the first feedback unit (106a) and the second feedback unit (106b) consecutively for navigating the user on the mapped route.
2. The shoe (100) as claimed in claim 1, wherein actuation of the first feedback unit (106a) is configured to indicate the presence of an obstacle on a right side of the user and to prompt the user to move in a left direction.
3. The shoe (100) as claimed in claim 1, wherein actuation of the second feedback unit (106b) is configured to indicate the presence of an obstacle on a left side of the user and to prompt the user to move in a right direction.
4. The shoe (100) as claimed in claim 1, wherein simultaneous actuation of the first feedback unit (106a) and the second feedback unit (106b) is configured to indicate that the user should stop or identify an alternative route.
5. The shoe (100) as claimed in claim 1, wherein the first data collection unit (104a) and the second data collection unit (104b) comprise an array of ultrasonic sensors.
6. The shoe (100) as claimed in claim 1, wherein the first feedback unit (106a) and the second feedback unit (106b) comprise a set of vibrational motors.
7. The shoe (100) as claimed in claim 1, further comprising a rechargeable battery (110) adapted to supply an operational power to the microcontroller (108), wherein the rechargeable battery (110) is charged using a Universal Serial Bus (USB) charger (112).
8. A method (300) for route suggestions to visually impaired users, the method (300) is characterized by steps of:
receiving a first measured distance from a first data collection unit (104a);
receiving a second measured distance from a second data collection unit (104b);
analyzing a position of upcoming obstacles using an Artificial Intelligence (AI) algorithm;
mapping an intermediary route for circumventing the upcoming obstacles; and
actuating a first feedback unit (106a) and a second feedback unit (106b) consecutively for navigating the user on the mapped route.
9. The method (300) as claimed in claim 8, wherein the first feedback unit (106a) and the second feedback unit (106b) comprise a set of vibrational motors.
10. The method (300) as claimed in claim 8, further comprising supplying an operational power to a microcontroller (108) by a rechargeable battery (110), wherein the rechargeable battery (110) is charged using a Universal Serial Bus (USB) charger (112).
Date: May 19, 2025
Place: Noida
Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant
| # | Name | Date |
|---|---|---|
| 1 | 202541048343-STATEMENT OF UNDERTAKING (FORM 3) [20-05-2025(online)].pdf | 2025-05-20 |
| 2 | 202541048343-REQUEST FOR EARLY PUBLICATION(FORM-9) [20-05-2025(online)].pdf | 2025-05-20 |
| 3 | 202541048343-POWER OF AUTHORITY [20-05-2025(online)].pdf | 2025-05-20 |
| 4 | 202541048343-OTHERS [20-05-2025(online)].pdf | 2025-05-20 |
| 5 | 202541048343-FORM-9 [20-05-2025(online)].pdf | 2025-05-20 |
| 6 | 202541048343-FORM FOR SMALL ENTITY(FORM-28) [20-05-2025(online)].pdf | 2025-05-20 |
| 7 | 202541048343-FORM 1 [20-05-2025(online)].pdf | 2025-05-20 |
| 8 | 202541048343-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-05-2025(online)].pdf | 2025-05-20 |
| 9 | 202541048343-EDUCATIONAL INSTITUTION(S) [20-05-2025(online)].pdf | 2025-05-20 |
| 10 | 202541048343-DRAWINGS [20-05-2025(online)].pdf | 2025-05-20 |
| 11 | 202541048343-DECLARATION OF INVENTORSHIP (FORM 5) [20-05-2025(online)].pdf | 2025-05-20 |
| 12 | 202541048343-COMPLETE SPECIFICATION [20-05-2025(online)].pdf | 2025-05-20 |