
A Method And Device For Guiding A Visually Impaired User Towards A Target Location

Abstract: The present disclosure relates to a tactile navigation device for a visually impaired user. The method comprises receiving input data from an external device, the input data comprising directional information to navigate the visually impaired user towards the target location. Further, the method comprises obtaining a first angle from the received directional information. The first angle defines a desired motion trajectory for the visually impaired user to move towards the target location. Further, the method comprises determining a second angle associated with the current position of the visually impaired user, calculating a final angle based on the first angle and the second angle, generating a signal based on the final angle, and controlling an actuator based on the generated signal to actuate a direction indicator device. The actuation of the direction indicator device provides tactile-based guidance to the visually impaired user to navigate towards the target location. FIG.2


Patent Information

Application #
Filing Date
31 January 2023
Publication Number
06/2023
Publication Type
INA
Invention Field
PHYSICS
Status
Email
ipo@knspartners.com
Parent Application
Patent Number
Legal Status
Grant Date
2024-04-22
Renewal Date

Applicants

INDIAN INSTITUTE OF SCIENCE
C V Raman Road, Bengaluru-560012, Karnataka, India

Inventors

1. Abhra Roy Chowdhury
Centre for Product Design and Manufacturing, Indian Institute of Science, Gulmohar Marg, Devasandra Layout, Bengaluru, Karnataka 560012, India
2. Vishnu T P
Centre for Product Design and Manufacturing, Indian Institute of Science, Gulmohar Marg, Devasandra Layout, Bengaluru, Karnataka 560012, India

Specification

Description: TECHNICAL FIELD
[0001] The present disclosure generally relates to the field of navigation devices and, more particularly but not exclusively, to a device and method for guiding a visually impaired user towards a target location.
BACKGROUND OF THE DISCLOSURE
[0002] Visually impaired persons have significant problems in their everyday movement. Therefore, some existing technologies involve computer vision in developing assistance systems for guiding the visually impaired persons in critical situations. Some of said critical situations include crosswalks on road crossings and stairs in indoor and outdoor environments.
[0003] One of the existing technologies presents an evaluation framework for computer vision-based guidance of visually impaired persons in such critical situations. However, existing technologies provide verbal commands to visually impaired people, which is time consuming and makes it difficult for them to understand and move according to the instructions. Further, a few existing technologies describe obstacle avoidance systems for travelling smoothly from a start point to a destination. However, the existing technologies currently do not provide any mechanism that is efficient and accurate enough to guide visually impaired people to their desired destination.
[0004] The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the disclosure and should not be taken as an acknowledgement or any form of suggestion that this information forms prior art already known to a person skilled in the art.
SUMMARY OF THE DISCLOSURE
[0005] One or more shortcomings of the conventional systems are overcome, and additional advantages are provided, through the provision of the device and method as claimed in the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
[0006] In one non-limiting embodiment, the present disclosure describes a method of guiding a visually impaired user towards a target location. The method comprises receiving input data from an external device, the input data comprising directional information to navigate the visually impaired user towards the target location. Further, the method describes obtaining a first angle from the received directional information. The first angle defines a desired motion trajectory for the visually impaired user to move towards the target location. Further, the method comprises determining a second angle associated with a current position of the visually impaired user. The method further comprises calculating a final angle based on the first angle and the second angle and generating a signal based on the final angle. Finally, the method describes controlling an actuator based on the generated signal to actuate a direction indicator device. The actuation of the direction indicator device provides tactile-based guidance to the visually impaired user to navigate towards the target location.
[0007] In accordance with the present disclosure, determining the second angle comprises receiving one or more signals from one or more sensors of the direction indicator device, and the second angle is calculated based on the received one or more signals.
[0008] In accordance with the present disclosure, the final angle is determined by calculating a difference between the first angle and the second angle. The final angle represents a corrected angle for the desired motion trajectory for the visually impaired user to move towards the target location.
[0009] In accordance with the present disclosure, controlling the actuator based on the generated signal to actuate the direction indicator device comprises actuating a thumb holder cavity of the direction indicator device such that the actuation of the thumb holder cavity provides a tactile based indication to the visually impaired user to navigate towards the target location.
[0010] In accordance with the present disclosure, the method further comprises determining a deviation in a movement of the visually impaired user as compared to the final angle. An alert is provided to the visually impaired user by actuating the thumb holder cavity of the direction indicator device upon the determination of the deviation.
[0011] In accordance with the present disclosure, the received input data further comprises data relating to one or more obstacles.
[0012] In accordance with the present disclosure, data relating to one or more obstacles is obtained from the received input data. The one or more obstacles near the visually impaired user are determined based on the obtained data relating to one or more obstacles. The alert to the visually impaired user is provided via a vibrational motor associated with the direction indicator device based on the determined one or more obstacles.
[0013] In another non-limiting embodiment, the present disclosure describes a direction indicator device for guiding a visually impaired user towards a target location. The direction indicator device comprises a transceiver module, an actuator and a control unit coupled to the actuator and the transceiver module. The control unit is configured to receive input data from an external device via the transceiver module, the input data comprising directional information to navigate the visually impaired user towards the target location. The control unit obtains a first angle from the received directional information. The first angle defines a desired motion trajectory for the visually impaired user to move towards the target location. Further, the control unit determines a second angle associated with a current position of the visually impaired user. The control unit calculates a final angle based on the first angle and the second angle and generates a signal based on the final angle. Finally, the control unit controls the actuator based on the generated signal to actuate the direction indicator device, wherein the actuation of the direction indicator device provides tactile-based guidance to the visually impaired user to navigate towards the target location.
[0014] It is to be understood that aspects and embodiments of the disclosure described above may be used in any combination with each other. Several aspects and embodiments may be combined together to form a further embodiment of the disclosure.
[0015] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to drawings and the following detailed description.
EFFECTS/ADVANTAGES OF THE PRESENT INVENTION
[0016] The present disclosure provides a technique for guiding the visually impaired user without any verbal communications or commands. Further, the visually impaired user may reach the target location without any manual intervention.
[0017] Another advantage of the present invention is that the device uses input data which comprises directional information. The input data is passed through a moving average filter to smooth out the data and remove sudden changes that would appear as errors in the control unit. This filtered data, which provides the direction in which the user is holding the device, is further used to correct the intended direction. The corrected direction is sent to the actuator to actuate a direction indicator device. The actuation of the direction indicator device provides tactile-based guidance to the visually impaired user to navigate towards the target location. Thus, the present disclosure is accurate and efficient, as the visually impaired user may reach the target location in minimal time.
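The moving-average filtering step described above can be sketched as follows. This is a minimal illustration, not the claimed implementation; the window size and class name are assumptions, and a plain arithmetic mean is only valid for headings that do not cross the 0°/360° wrap.

```python
from collections import deque

class MovingAverageFilter:
    """Smooth noisy directional input by averaging the last N samples,
    damping the sudden changes that would otherwise reach the control unit."""
    def __init__(self, window_size=3):
        self.window = deque(maxlen=window_size)

    def update(self, sample):
        # Append the newest sample; the deque discards the oldest one.
        self.window.append(sample)
        return sum(self.window) / len(self.window)

filt = MovingAverageFilter(window_size=3)
print(filt.update(10.0))  # 10.0
print(filt.update(20.0))  # 15.0
print(filt.update(90.0))  # 40.0 -- a sudden spike is damped
```

A larger window gives smoother output at the cost of slower response to genuine direction changes.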
BRIEF DESCRIPTION OF THE ACCOMPANYING FIGURES
[0018] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of device and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
[0019] FIG.1A illustrates an exemplary architecture of a device for guiding a visually impaired user towards a target location, in accordance with some embodiments of the present disclosure;
[0020] FIG.2A illustrates a block diagram of a direction indicator device for guiding a visually impaired user towards a target location, in accordance with some embodiments of the present disclosure;
[0021] FIG. 2B illustrates a representation of direction calculation to guide the visually impaired user towards a target location, in accordance with some embodiments of the present disclosure;
[0022] FIG.3 shows a flowchart illustrating a method performed by the direction indicator device for guiding a visually impaired user towards a target location, in accordance with some embodiments of the present disclosure.
[0023] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0024] The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the device illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
[0025] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0026] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0027] The terms “comprises”, “comprising”, “includes” or any other variations thereof are intended to cover a non-exclusive inclusion, such that a setup, device or method that includes a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or device or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the device or method.
[0028] The present disclosure relates to guiding a visually impaired user towards a target location. The focus of the present disclosure is to guide the visually impaired user from the initial position to the destination based on one or more angles, such as a first angle, a second angle and a final angle. Based on the first angle and the second angle, the final angle may be calculated to generate a signal which may provide tactile-based guidance to the visually impaired user to navigate towards the target location. Further, the present disclosure also focuses on avoiding accidents that may be caused by one or more obstacles in the way of the visually impaired user. Thus, the visually impaired user may reach the target location without any manual intervention.
[0029] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the disclosure.
[0030] In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0031] FIG.1 shows an exemplary architecture of a system for guiding a visually impaired user towards a target location, in accordance with some embodiments of the present disclosure.
[0032] The system 100 comprises an external device 101 and a direction indicator device 103. The direction indicator device 103 comprises a transceiver module 105, an actuator 107 and a control unit 109. In one non-limiting example, the external device 101 may be a robot; however, those of ordinary skill in the art will appreciate that the external device 101 may be any suitable device capable of providing data or information necessary for navigation purposes, herein referred to as “input data”. In one non-limiting example, the input data may include directional information to navigate the visually impaired user towards the target location.
[0033] Generally, visually impaired users cannot recognize surrounding information through vision, so they may obtain information to reach a target location from an external device 101 and may use one or more aids such as canes, guide blocks or a navistick to reach the target location. The present disclosure describes a direction indicator device that may help the visually impaired user to reach the target location. The direction indicator device includes a cavity which provides tactile-based guidance to the visually impaired user to navigate towards the target location. For instance, consider a scenario in which the visually impaired user enters an indoor environment. For ease of understanding, consider that the visually impaired user holds a direction indicator device such as a navistick or a cane which may be connected to the external device 101 using the transceiver module 105. In an alternative embodiment, the external device 101 tethers a hotspot, or there may be a good Wi-Fi access point within the indoor environment. The transceiver module 105 acts as a TCP client to the server and joins the network with a given SSID and password, which allows any device connected to that network to communicate with the transceiver module. Once the connection is established, a TCP port is fixed, which is used for further communication. Using the host IP address and the port number, data is received by the direction indicator device from the external device 101.
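The TCP hand-off described above might look like the sketch below. The host address, port, and message framing are assumptions added for illustration; the disclosure only states that a port is fixed once the connection is established.

```python
import socket

def receive_input_data(host, port, buffer_size=1024):
    """Connect to the external device as a TCP client and return one
    chunk of navigation input data as text.

    `host` and `port` are assumed to be known after the transceiver
    module has joined the shared Wi-Fi network (SSID/password)."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        return sock.recv(buffer_size).decode("utf-8")
```

In practice the external device would stream updates continuously, so a real implementation would loop over `recv` rather than read a single chunk.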
[0034] The control unit 109 may receive input data from the external device 101. Further, the control unit 109 may obtain a first angle from the directional information, which describes a desired motion trajectory for the visually impaired user to move towards the target location.
The first angle may indicate the direction to reach the target location. Further, the second angle may indicate the current position or the initial position of the visually impaired user.
[0035] Further, the control unit 109 may determine a second angle associated with a current position of the visually impaired user. When the direction indicator device connects with the external device 101 via the transceiver module 105, the control unit 109 may determine the current position of the visually impaired user. In one non-limiting example, the control unit may calculate the second angle based on the one or more signals received from one or more sensors of the direction indicator device. Based on the first angle and the calculated second angle, the control unit may calculate a final angle. The final angle may be calculated based on the difference between the first angle and the second angle, and represents a corrected angle for the desired motion trajectory for the visually impaired user to move towards the target location.
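The difference between the first and second angles can be computed as in this sketch. Wrapping the result into (-180°, 180°] is an assumption added for illustration so that the corrected angle always indicates the shorter turn; the disclosure itself only specifies a difference.

```python
def final_angle(first_angle, second_angle):
    """Corrected heading: desired direction (first angle) minus the
    user's current heading (second angle), wrapped into (-180, 180]."""
    diff = (first_angle - second_angle) % 360.0
    if diff > 180.0:
        diff -= 360.0  # prefer the shorter turn direction
    return diff

print(final_angle(90.0, 80.0))   # 10.0 -- turn slightly to the right
print(final_angle(10.0, 350.0))  # 20.0 -- wraps correctly across 0 degrees
```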
[0036] In some embodiments, the control unit 109 may generate a signal based on the calculated final angle. Further, the control unit 109 controls an actuator 107 to actuate the direction indicator device. The actuation of the direction indicator device provides tactile-based guidance to the visually impaired user to navigate towards the target location. The direction indicator device may include a cavity in which the visually impaired user may place his or her thumb to sense the tactile-based guidance which navigates the visually impaired user towards the target location.
[0037] FIG.2A shows a detailed block diagram of the device for guiding a visually impaired user towards a target location, in accordance with some embodiments of the present disclosure.
[0038] In some implementations, the direction indicator device 201 (which is the direction indicator device 103 of FIG.1) may include data 211 and modules 213. As an example, the data 211 is stored in a memory 209 of the direction indicator device 201 as shown in FIG.2A. In one embodiment, the data 211 may include input data 215, angular data 217, sensor data 219 and other data 221. The modules 213 illustrated in FIG.2A are described herein in detail.
[0039] In an implementation, the direction indicator device 201 may include a transceiver module 207, a control unit 203, and a memory 209. The transceiver module 207 may be configured to communicate internally between the control unit 203, accelerators, registers and the like, and also with one or more external sources and/or external equipment associated with the direction indicator device 201. In an embodiment, the memory 209 may be communicatively coupled to the control unit 203. The control unit 203 may be configured to perform one or more functions of the direction indicator device 201.
In some embodiments, the data 211 may be stored in the memory 209 in the form of various data structures. Additionally, the data 211 can be organized using data models, such as relational or hierarchical data models. The other data 221 may store data, including temporary data and temporary files, generated by the modules 213 for performing the various functions of the direction indicator device 201.
[0040] In some embodiments, the input data 215 may correspond to the data or information that may be received from an external device 101. In one non-limiting example, the external device 101 may include, but is not limited to, a robot or a third-party device which provides the input data 215 comprising directional information to navigate the visually impaired user towards the target location. In other words, the directional information may include direction details which may help the visually impaired user reach the target location. In an alternative embodiment, based on the received input data 215, data relating to one or more obstacles may be obtained, which defines one or more obstacles in the route of the visually impaired user towards the target location.
[0041] In one non-limiting example, the angular data 217 corresponds to data or information that may be obtained from the directional information of the input data 215. The first angle defines a desired motion trajectory for the visually impaired user to move towards the target location. When the visually impaired user's target location is defined, the desired angle, i.e., the desired motion trajectory along which the visually impaired user may be navigated, is obtained from the directional information. Further, the angular data 217 corresponds to the second angle which defines the current position of the visually impaired user. Based on the obtained first angle and the determined second angle, a final angle may be calculated. In some embodiments, the final angle is calculated by determining a difference between the first angle and the second angle. The final angle represents a corrected angle for the desired motion trajectory for the visually impaired user to move towards the target location.
[0042] In one non-limiting example, the sensor data 219 may include values or data obtained from one or more sensors associated with the direction indicator device. For instance, when a helping aid such as a navistick, which is a direction indicator device held by the visually impaired user, gets connected to the external device 101, the direction indicator device receives one or more signals from one or more sensors, which may be used to calculate the second angle defining the current position of the visually impaired user. In some embodiments, the data relating to one or more obstacles may be determined by the one or more sensors associated with the environment of the visually impaired user.
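As one hypothetical realization of the second-angle calculation, a heading could be derived from horizontal magnetometer components. The disclosure names only "one or more sensors", so the sensor type and axis convention here are assumptions.

```python
import math

def heading_from_magnetometer(mx, my):
    """Estimate the user's current heading (the second angle) in
    degrees [0, 360) from horizontal magnetic-field components,
    measured in the device's own frame."""
    return math.degrees(math.atan2(my, mx)) % 360.0

print(heading_from_magnetometer(1.0, 0.0))  # 0.0
print(heading_from_magnetometer(0.0, 1.0))  # 90.0
```

A real device would also need tilt compensation (e.g. from an accelerometer), since the stick is rarely held level.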
[0043] In some embodiments, the data 211 may be processed by one or more modules 213 of the direction indicator device 201. As used herein, the term module refers to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a control unit (processor 203) (shared, dedicated, or group) and memory 209 that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality. In an embodiment, the other modules 229 may be used to perform various miscellaneous functionalities of the direction indicator device 201. It will be appreciated that such modules 213 may be represented as a single module or a combination of different modules. In one embodiment, the modules may be a part of the control unit. In an alternative embodiment, the modules may reside inside the memory and may be executed by the control unit to perform the functionalities.
[0044] In some embodiments, the modules 213 may include, for example, a receiving module 223, an angular data module 225, a signal generation module 227 and other modules 229. The other modules 229 may be used to perform various miscellaneous functionalities of the direction indicator device 201. It will be appreciated that such aforementioned modules 213 may be represented as a single module or a combination of different modules.
[0045] In some embodiments, the receiving module 223 may receive input data 215 from an external device 101. For instance, the external device 101 may include, but is not limited to, a robot or a third-party device. The receiving module 223 may receive input data 215 which comprises directional information to navigate the visually impaired user towards the target location. For ease of understanding, consider an exemplary scenario in which the visually impaired user enters an indoor environment. The visually impaired user may hold a guiding aid which may be a direction indicator device. When the visually impaired user enters the indoor environment, the guiding aid acting as the direction indicator device 201 connects to the external robot, which may provide the input data 215 comprising the directional information to the receiving module of the direction indicator device. In an alternative embodiment, the receiving module 223 may receive input data which includes data relating to one or more obstacles. Based on the data relating to one or more obstacles, one or more obstacles near the visually impaired user may be determined.
[0046] In some embodiments, the angular data module 225 may comprise a first angle, a second angle and a final angle. The first angle may be obtained from the input data which comprises the directional information. The first angle may define a desired motion trajectory for the visually impaired user to move towards the target location. In other words, when the visually impaired user enters the indoor environment, the desired motion trajectory in which the user can reach the target location may be determined from the first angle. In some embodiments, the angular data module 225 comprises the second angle which defines the current position of the visually impaired user. The current position may be obtained from the one or more signals from the one or more sensors associated with the guiding aid of the visually impaired user. The present disclosure is explained below with the help of some exemplary scenarios. However, this should not be construed as a limitation of the present disclosure, as the present disclosure may be applicable to other scenarios as well. For example, when the visually impaired user visits a hotel, the guiding aid/navistick acting as a direction indicator device may determine the initial/current position of the visually impaired user and may obtain the first angle based on the received input data. Upon considering both the first angle and the second angle, the final angle may be calculated.
[0047] The final angle may be determined by calculating the difference between the first angle and the second angle. The final angle represents a corrected angle for the desired motion trajectory for the visually impaired user to move towards the target location. In some embodiments, when the visually impaired user deviates from the final-angle path towards the target location, a suitable alert may be provided to guide him/her to reach the target location.
[0048] In some embodiments, all the angles, such as the first angle, the second angle and the final angle, are calculated with respect to a particular frame of reference. The external device 101 can easily calculate the orientation of the following visually impaired user with respect to the coordinate frame, i.e., ϕ as shown in FIG.2B. The control unit is also calibrated based on the fixed coordinate axes. Therefore, the input data from the control unit provides the orientation of the NAVI-Stick with respect to the fixed axes. The robot acting as the external device 101 sends the input data comprising the direction information to navigate the visually impaired user towards the target location. The first angle is the desired angle and the second angle is the current position of the user. Based on the first and second angles, the final angle is calculated. The angular data module 225 may determine a deviation in a movement of the visually impaired user as compared to the final angle. When the deviation is determined, an alert is provided to the visually impaired user by actuating the thumb holder cavity of the direction indicator device.
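The deviation check described above can be sketched as follows; the tolerance threshold is an illustrative assumption, not a value given in the disclosure.

```python
def deviates(user_heading, desired_heading, tolerance_deg=10.0):
    """Return True when the user's movement has drifted from the
    corrected trajectory by more than the tolerance, in which case
    the thumb holder cavity would be actuated as an alert."""
    error = (user_heading - desired_heading) % 360.0
    if error > 180.0:
        error -= 360.0  # signed error in (-180, 180]
    return abs(error) > tolerance_deg

print(deviates(95.0, 90.0))   # False -- within tolerance, no alert
print(deviates(130.0, 90.0))  # True  -- actuate the cavity to alert
```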
[0049] In some embodiments, a signal generation module 227 may generate a signal based on the final angle. When the final angle is calculated, the signal generation module 227 generates a signal based on the final angle which may be used to drive an actuator. The actuator embedded in the guiding aid of the visually impaired user actuates and provides tactile-based guidance to the visually impaired user to navigate towards the target location. For example, consider the guiding aid to be a navistick which comprises a cavity in which the visually impaired user may place his/her thumb. When the signal generation module 227 generates the signal based on the final angle, the actuator actuates a thumb holder cavity of the navistick (in the present disclosure the navistick is used interchangeably with the direction indicator device or guiding aid) such that the actuation of the thumb holder cavity provides a tactile-based indication to the visually impaired user to navigate towards the target location. The actuator 205 may be a servo motor which actuates the thumb holder cavity of the navistick to provide a tactile-based indication to the visually impaired user to navigate towards the target location.
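If the actuator is a hobby-style servo, signal generation could amount to mapping the corrected angle onto the servo's sweep, as in the sketch below. The ±90° input clamp and the 0–180° servo range are assumptions for illustration; the disclosure only says the actuator may be a servo motor.

```python
def angle_to_servo_position(corrected_angle):
    """Map a corrected heading in degrees to a 0-180 servo position,
    so the thumb holder cavity points along the desired direction.
    Headings beyond +/-90 degrees are clamped to the servo's limits."""
    clamped = max(-90.0, min(90.0, corrected_angle))
    return clamped + 90.0  # shift (-90..90) into the servo's 0..180 range

print(angle_to_servo_position(0.0))    # 90.0 -- straight ahead
print(angle_to_servo_position(-90.0))  # 0.0  -- hard left
```

The returned position would then be converted to a PWM pulse width by the servo driver.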
[0050] For example, consider a scenario in which the visually impaired user enters an indoor environment such as a hotel or a museum. The visually impaired user may have a guiding aid/navistick acting as a direction indicator device. When the visually impaired user holds the direction indicator device, the direction indicator device may get activated and connect to the external device 101 or a third-party device. The external or third-party device can be a robot. The receiving module 223 may receive the input data from the external device 101, which comprises directional information and data relating to one or more obstacles. Based on the received directional information, the first angle may be obtained, which describes the desired motion trajectory for the visually impaired user to move towards the target location. In other words, the angular data module 225 may obtain the first angle based on the directional information which describes the trajectory in which the user has to move to reach the target location. In order to determine the exact path in which the visually impaired user has to travel, the current position of the visually impaired user is determined based on the one or more sensors associated with the direction indicator device. The one or more sensors may provide one or more signals to calculate the second angle. Further, based on the first angle and the second angle, the final angle may be calculated by taking the difference between the first angle and the second angle. The final angle represents a corrected angle for the desired motion trajectory for the visually impaired user to move towards the target location. Further, the signal generation module 227 may generate a signal based on the final angle which may be used to actuate the direction indicator device. The actuation will be in the cavity of the direction indicator device, which provides tactile-based guidance to the visually impaired user to navigate towards the target location. In some embodiments, when there is a deviation in a movement of the visually impaired user as compared to the final angle, an alert may be provided to the visually impaired user by actuating the thumb holder cavity of the direction indicator device upon the determination of the deviation. Thus, the actuation of the direction indicator device provides tactile-based guidance to the visually impaired user to navigate towards the target location, and the present disclosure is accurate and efficient as the visually impaired user may reach the target location in minimal time.
[0051] In an alternative embodiment, the input data may also include data relating to one or more obstacles. The data relating to one or more obstacles may be used to determine the one or more obstacles near the visually impaired user. Upon determining one or more obstacles near the visually impaired user, an alert may be provided to the visually impaired user via a vibrational motor or a vibrational sensor associated with the direction indicator device. Based on the provided alert, the visually impaired user may avoid accidents that may be caused due to the presence of one or more obstacles in his/her path. In other words, when there are one or more obstacles, the vibrational motor may vibrate in such a way that the visually impaired user understands and deviates in order to avoid accidents.
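A minimal sketch of this obstacle-alert behaviour, assuming a hypothetical `vibrate` callback standing in for the vibrational motor and an illustrative 1 m proximity threshold (neither value nor name is specified in the disclosure):

```python
def obstacle_alert(obstacle_distances_m, vibrate, threshold_m=1.0):
    """Trigger the vibration alert when any reported obstacle is closer
    than threshold_m metres. `vibrate` is a zero-argument callable that
    drives the vibrational motor (hypothetical interface).
    Returns True when an alert was raised."""
    near = [d for d in obstacle_distances_m if d < threshold_m]
    if near:
        vibrate()
    return bool(near)
```

In practice the distances would come from the obstacle data in the received input, and the threshold would be tuned to walking speed.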
[0052] FIG.3 shows a flowchart illustrating a method performed by the direction indicator device 103 for guiding a visually impaired user towards a target location, in accordance with some embodiments of the present disclosure.
[0053] As illustrated in FIG.3, the method 300 comprises one or more blocks illustrating a method of guiding a visually impaired user towards a target location. The method 300 may be
described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform functions or implement abstract data types.
[0054] The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 300 can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0055] At block 301, the method 300 may include receiving, by the control unit 111 associated with the direction indicator device 103, input data from an external device 101. The input data comprises directional information to navigate the visually impaired user towards the target location.
[0056] At block 303, the method 300 may include obtaining, by the control unit 111, a first angle from the received directional information. The first angle defines a desired motion trajectory for the visually impaired user to move towards the target location.
[0057] At block 305, the method 300 may include determining, by the control unit 111, a second angle associated with a current position of the visually impaired user. In a non-limiting embodiment of the present disclosure, the second angle is calculated based on one or more signals received from one or more sensors of the direction indicator device.
[0058] At block 307, the method 300 may include calculating, by the control unit, a final angle based on the first angle and the second angle. The final angle is calculated as the difference between the first angle and the second angle, and represents a corrected angle for the desired motion trajectory for the visually impaired user to move towards the target location. In a non-limiting embodiment of the present disclosure, the method further comprises determining a deviation in a movement of the visually impaired user as compared to the final angle, and providing an alert to the visually impaired user by actuating the thumb holder cavity of the direction indicator device upon the determination of the deviation.
[0059] At block 309, the method 300 may include generating, by the control unit, a signal based on the final angle.
[0060] At block 311, the method 300 may include controlling, by the control unit, an actuator based on the generated signal to actuate the direction indicator device. The actuation of the direction indicator device provides a tactile based guidance to the visually impaired user to navigate towards the target location. In a non-limiting embodiment of the present disclosure, the method further comprises actuating a thumb holder cavity of the direction indicator device such that the actuation of the thumb holder cavity provides a tactile based indication to the visually impaired user to navigate towards the target location.
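The sequence of method blocks above can be illustrated as one guidance step. This is an illustrative sketch only: the `bearing_deg` key, the ±5° dead band, and the left/right/forward signal names are assumptions introduced here, not elements of the claimed method:

```python
def guidance_step(directional_info: dict, sensor_heading_deg: float):
    """One pass through the method: receive directional information,
    derive the first and second angles, compute the corrected final
    angle, and generate a simple tactile actuation command."""
    first = directional_info["bearing_deg"]   # first angle from received directional information
    second = sensor_heading_deg               # second angle from the device's sensors
    final = (first - second) % 360.0          # final angle = difference, wrapped
    if final > 180.0:
        final -= 360.0
    # Map the corrected angle to a tactile command for the actuator
    # (dead band and labels are illustrative assumptions).
    if final > 5.0:
        signal = "right"
    elif final < -5.0:
        signal = "left"
    else:
        signal = "forward"
    return final, signal
```

The returned signal would then drive the actuator in the thumb holder cavity; how the command is rendered tactilely is left to the hardware.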
Advantages:
The present disclosure provides a technique for guiding the visually impaired user without any verbal communications or commands. Further, the visually impaired user may reach the target location without any manual intervention.
Another advantage of the present invention is that the device uses the input data which comprises directional information. The input data is passed through a moving average filter to smooth out the data and remove sudden changes that appear as errors in the control unit. This filtered data, which provides the direction in which the user is holding the device, is further used to correct the intended direction. This preferred direction is sent to the actuator to actuate the direction indicator device. The actuation of the direction indicator device provides tactile based guidance to the visually impaired user to navigate towards the target location. Thus, the present disclosure provides an accurate and efficient technique, as the visually impaired user may reach the target location in minimal time.
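A moving average filter of the kind described can be sketched as follows; the class name and the default window of 5 samples are assumptions for illustration, not parameters fixed by the disclosure:

```python
from collections import deque

class MovingAverageFilter:
    """Fixed-window moving average used to smooth noisy direction
    samples before they reach the control logic (illustrative sketch)."""

    def __init__(self, window: int = 5):
        # deque with maxlen automatically discards the oldest sample.
        self._samples = deque(maxlen=window)

    def update(self, sample_deg: float) -> float:
        """Add a new sample and return the smoothed value."""
        self._samples.append(sample_deg)
        return sum(self._samples) / len(self._samples)
```

A spurious one-sample spike in the sensed heading is thus attenuated by roughly the window size before it can perturb the actuation signal.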
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or
not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
The specification has described a method and direction indicator device 103 for guiding a visually impaired user towards a target location. The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that on-going technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. Also, the words "comprising," "having," "containing," and "including," and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
Referral Numerals:

Reference Number | Description
100 | System
101 | External device
103, 201 | Direction indicator device
105 | Transceiver Module
107 | Actuator
109 | Control unit
113 | Memory
211 | Data
213 | Modules
215 | Input data
217 | Angular data
219 | Sensor data
221 | Other data
223 | Receiving Module
225 | Angular data Module
227 | Signal generation module
229 | Other modules

Claims:
1. A method of guiding a visually impaired user towards a target location, the method comprising:
receiving input data from an external device, the input data comprising directional information to navigate the visually impaired user towards the target location;
obtaining a first angle from the received directional information, wherein the first angle defines a desired motion trajectory for the visually impaired user to move towards the target location;
determining a second angle associated with a current position of the visually impaired user;
calculating a final angle based on the first angle and the second angle;
generating a signal based on the final angle; and
controlling an actuator based on the generated signal to actuate a direction indicator device, wherein the actuation of the direction indicator device provides a tactile based guidance to the visually impaired user to navigate towards the target location.
2. The method as claimed in claim 1, wherein determining the second angle comprises:
receiving one or more signals from one or more sensors of the direction indicator device; and
calculating the second angle based on the received one or more signals.
3. The method as claimed in claim 1, wherein determining the final angle comprises:
determining the final angle by calculating a difference between the first angle and the second angle, wherein the final angle represents a corrected angle for desired motion trajectory for the visually impaired user to move towards the target location.
4. The method as claimed in claim 1, wherein controlling the actuator based on the generated signal to actuate the direction indicator device comprises:
actuating a thumb holder cavity of the direction indicator device such that the actuation of the thumb holder cavity provides a tactile based indication to the visually impaired user to navigate towards the target location.
5. The method as claimed in claim 1, further comprising:
determining a deviation in a movement of the visually impaired user as compared to the final angle;
providing an alert to the visually impaired user by actuating the thumb holder cavity of the direction indicator device upon the determination of the deviation.
6. The method as claimed in claim 1, wherein the received input data further comprises data relating to one or more obstacles.
7. The method as claimed in claim 6, further comprising:
obtaining data relating to one or more obstacles from the received input data;
determining one or more obstacles near to the visually impaired user based on the obtained data relating to one or more obstacles; and
providing an alert to the visually impaired user via a vibrational motor associated with the direction indicator device based on the determined one or more obstacles.
8. A direction indicator device for guiding a visually impaired user towards a target location, the direction indicator device comprising:
a transceiver module;
an actuator; and
a control unit coupled to the actuator and the transceiver module, wherein the control unit is configured to:
receive input data from an external device via the transceiver module, the input data comprising directional information to navigate the visually impaired user towards the target location;
obtain a first angle from the received directional information, wherein the first angle defines a desired motion trajectory for the visually impaired user to move towards the target location;
determine a second angle associated with a current position of the visually impaired user;
calculate a final angle based on the first angle and the second angle;
generate a signal based on the final angle; and
control the actuator based on the generated signal to actuate the direction indicator device, wherein the actuation of the direction indicator device provides a tactile based guidance to the visually impaired user to navigate towards the target location.
9. The device as claimed in claim 8, wherein to determine the second angle, the control unit is configured to:
receive one or more signals from one or more sensors of the direction indicator device; and
calculate the second angle based on the received one or more signals.
10. The device as claimed in claim 8, wherein to determine the final angle, the control unit is configured to:
determine the final angle by calculating a difference between the first angle and the second angle, wherein the final angle represents a corrected angle for desired motion trajectory for the visually impaired user to move towards the target location.
11. The device as claimed in claim 8, wherein to control the actuator based on the generated signal to actuate the direction indicator device, the control unit is configured to:
actuate a thumb holder cavity of the direction indicator device such that the actuation of the thumb holder cavity provides a tactile based indication to the visually impaired user to navigate towards the target location.
12. The device as claimed in claim 8, wherein the control unit is further configured to:
determine a deviation in a movement of the visually impaired user as compared to the final angle;
provide an alert to the visually impaired user by actuating the thumb holder cavity of the direction indicator device upon the determination of the deviation.
13. The device as claimed in claim 8, wherein the received input data further comprises data relating to one or more obstacles.
14. The device as claimed in claim 13, wherein the control unit is further configured to:
obtain data relating to one or more obstacles from the received input data;
determine one or more obstacles near to the visually impaired user based on the obtained data relating to one or more obstacles; and
provide an alert to the visually impaired user via a vibrational motor associated with the direction indicator device based on the determined one or more obstacles.

Documents

Application Documents

# Name Date
1 202341006227-IntimationOfGrant22-04-2024.pdf 2024-04-22
2 202341006227-STATEMENT OF UNDERTAKING (FORM 3) [31-01-2023(online)].pdf 2023-01-31
3 202341006227-REQUEST FOR EARLY PUBLICATION(FORM-9) [31-01-2023(online)].pdf 2023-01-31
4 202341006227-PatentCertificate22-04-2024.pdf 2024-04-22
5 202341006227-Response to office action [27-02-2024(online)].pdf 2024-02-27
6 202341006227-POWER OF AUTHORITY [31-01-2023(online)].pdf 2023-01-31
7 202341006227-FORM-9 [31-01-2023(online)].pdf 2023-01-31
8 202341006227-CLAIMS [28-07-2023(online)].pdf 2023-07-28
9 202341006227-FORM FOR SMALL ENTITY(FORM-28) [31-01-2023(online)].pdf 2023-01-31
10 202341006227-FER_SER_REPLY [28-07-2023(online)].pdf 2023-07-28
11 202341006227-OTHERS [28-07-2023(online)].pdf 2023-07-28
12 202341006227-FORM 1 [31-01-2023(online)].pdf 2023-01-31
13 202341006227-FORM-8 [25-05-2023(online)].pdf 2023-05-25
14 202341006227-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [31-01-2023(online)].pdf 2023-01-31
15 202341006227-Proof of Right [05-04-2023(online)].pdf 2023-04-05
16 202341006227-EDUCATIONAL INSTITUTION(S) [31-01-2023(online)].pdf 2023-01-31
17 202341006227-FER.pdf 2023-03-15
18 202341006227-DRAWINGS [31-01-2023(online)].pdf 2023-01-31
19 202341006227-DECLARATION OF INVENTORSHIP (FORM 5) [31-01-2023(online)].pdf 2023-01-31
20 202341006227-EVIDENCE OF ELIGIBILTY RULE 24C1h [01-02-2023(online)].pdf 2023-02-01
21 202341006227-COMPLETE SPECIFICATION [31-01-2023(online)].pdf 2023-01-31
22 202341006227-FORM 18A [01-02-2023(online)].pdf 2023-02-01

Search Strategy

1 202341006227E_28-02-2023.pdf

ERegister / Renewals

3rd: 05 Jun 2024 (From 31/01/2025 to 31/01/2026)
4th: 05 Jun 2024 (From 31/01/2026 to 31/01/2027)
5th: 05 Jun 2024 (From 31/01/2027 to 31/01/2028)
6th: 05 Jun 2024 (From 31/01/2028 to 31/01/2029)
7th: 05 Jun 2024 (From 31/01/2029 to 31/01/2030)
8th: 05 Jun 2024 (From 31/01/2030 to 31/01/2031)
9th: 05 Jun 2024 (From 31/01/2031 to 31/01/2032)
10th: 05 Jun 2024 (From 31/01/2032 to 31/01/2033)