
Wearable Assistive Device For Navigation

Abstract: The present disclosure provides a wearable assistive device for navigation (100) of a visually impaired user. The wearable assistive device includes a clothing (102) adapted to be worn by the user, a light detection and ranging sensor (104) configured to detect the position of an obstruction, an imaging sensor (106) configured to detect a set of features pertaining to the obstruction, and an output device (108) configured to transmit a set of audio signals to the user, the audio signals pertaining to a set of instructions guiding the user. The wearable assistive device (100) includes a processing unit (110) configured to receive a set of range signals from the ranging sensor and a set of image signals from the imaging sensor, and enabled to transmit a set of output signals to the output device (108). The wearable assistive device (100) is configured to be activated or deactivated by a set of input signals from a switch enabled to be operated by the user. A power source is configured to provide electric power to the wearable assistive device (100).


Patent Information

Application #
Filing Date
04 June 2021
Publication Number
10/2023
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application
Patent Number
Legal Status
Grant Date
17 September 2025
Renewal Date

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. SHIVANI
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
2. VERMA, Vishal
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
3. DIVANSHU
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
4. SHARMA, Bhanu
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
5. PANDEY, Rahul
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
6. MADAN, Jaya
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
7. SHARMA, Deepika
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.

Specification

Description:

TECHNICAL FIELD
[0001] The present disclosure relates to the field of wearable assistive devices. More particularly, it relates to a wearable assistive device for navigation of a visually impaired user.

BACKGROUND
[0002] Background description includes information that may be useful in understanding the present disclosure. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed disclosure, or that any publication specifically or implicitly referenced is prior art.
[0003] Navigation is difficult for visually impaired people as a guiding stick does not provide enough information regarding any approaching obstacle or about the shape, size and structure of the obstruction ahead.
[0004] Therefore, there is a need for an assistive device that can detect an obstruction in the path of a visually impaired user and generate warning signals to guide the user along a safe route without colliding with said obstruction. Existing literature discloses a head-mounted laser radar system and a spatial recognition device based on optical sensing and a global positioning system. Other available literature discloses an assistive wheelchair with haptic feedback, a steerable guiding stick based on multiple sensors, and range-sensor-based object detection and mapping facilities. All of these disclosures describe expensive devices that continuously monitor the environment and hence require more power and greater computational complexity.
[0005] Hence, there is a need in the art to develop a simple and easy-to-use solution that does not need continuous usage of the sensors and hence requires less power. The proposed wearable assistive device for navigation is configured to detect the presence of an obstacle by range sensing. The imaging sensor is activated only when an object is detected by the range sensor. Therefore, the proposed method does not require continuous execution of the complex image processing steps and saves electric power. The proposed device generates audio signals for alerting the user, and the device can be activated or deactivated by the user.
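The power-saving scheme described above can be illustrated with a short sketch. This is not the claimed implementation; the threshold value, the callables, and their names are hypothetical stand-ins for the real sensors and classifier:

```python
# Illustrative sketch (assumptions, not the claimed implementation): the
# imaging sensor is powered only when the range sensor reports a nearby
# object, so the costly image-processing path runs on demand rather than
# continuously.

SAFE_DISTANCE_M = 2.0  # assumed detection threshold, in metres


def navigation_step(range_reading_m, capture_image, classify_obstacle):
    """One polling cycle: cheap range check first, imaging only if needed.

    range_reading_m   -- distance reported by the range sensor (metres)
    capture_image     -- callable that powers the imaging sensor and returns a frame
    classify_obstacle -- callable that extracts features from the frame
    """
    if range_reading_m >= SAFE_DISTANCE_M:
        return None  # path clear: imaging sensor stays off, saving power
    frame = capture_image()          # activate imaging sensor on demand
    return classify_obstacle(frame)  # run image processing only now


# Hypothetical stand-ins for the real sensor read-out and classifier:
result = navigation_step(1.2, lambda: "frame", lambda f: "obstacle ahead")
clear = navigation_step(5.0, lambda: "frame", lambda f: "obstacle ahead")
```

The key design point is that the expensive branch is only ever entered after the inexpensive range check fails, which is what reduces the average power and computational load.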
OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0007] It is an object of the present disclosure to provide a wearable assistive device for navigation of a user, the wearable assistive device including a clothing adapted to be worn by the user.
[0008] It is an object of the present disclosure to provide a wearable assistive device that facilitates the clothing to be coupled to a range sensor, configured to detect position of obstruction in the path of the user.
[0009] It is an object of the present disclosure to provide a wearable assistive device that facilitates the clothing to be coupled to an imaging sensor, configured to detect a set of features of the obstruction in the path of the user.
[0010] It is an object of the present disclosure to provide a wearable assistive device that facilitates the clothing to be coupled to an output device, configured to transmit a set of audio signals configured to guide the user.
[0011] It is an object of the present disclosure to provide a wearable assistive device that includes a processing unit, coupled to the range sensor, the imaging sensor and the output device.
[0012] It is an object of the present disclosure to provide a wearable assistive device that enables the processing unit to receive a set of range signals from the range sensor, the range signals pertaining to position of the obstruction.
[0013] It is an object of the present disclosure to provide a wearable assistive device that enables the processing unit to receive a set of image signals from the imaging sensor, the imaging signals pertaining to a set of features of the obstruction.
[0014] It is an object of the present disclosure to provide a wearable assistive device that enables the processing unit to generate a set of output signals pertaining to a set of instructions for guiding the user.
[0015] It is an object of the present disclosure to provide a wearable assistive device that enables the output device to receive the set of output signals from the processing unit and transmit a set of audio signals pertaining to guidance of the user.
[0016] It is an object of the present disclosure to provide a wearable assistive device that facilitates the clothing to include a head covering and footwear configured to support the range sensor, imaging sensor, output device and the processing unit.
[0017] It is an object of the present disclosure to provide a wearable assistive device that enables the processing unit to generate a set of logical signals depending on the set of range signals.
[0018] It is an object of the present disclosure to provide a wearable assistive device that enables the imaging sensor to be activated depending on the set of logical signals facilitated to indicate the presence or absence of obstruction.
[0019] It is an object of the present disclosure to provide a wearable assistive device that enables the processing unit to be communicatively coupled to a switch, configured to be operated by the user.
[0020] It is an object of the present disclosure to provide a wearable assistive device that enables the range sensor, the imaging sensor and the output device to be activated/deactivated by a set of input signals received from the switch.
[0021] It is an object of the present disclosure to provide a wearable assistive device that facilitates a remote control unit to be coupled to the switch and the processing unit.
[0022] It is an object of the present disclosure to provide a wearable assistive device that enables the remote control unit to receive the set of input signals from the switch and transmit the set of input signals to the processing unit.
[0023] It is an object of the present disclosure to provide a wearable assistive device that enables the range sensor, the imaging sensor, the processing unit and the output device to be coupled to a power supply unit.
[0024] It is an object of the present disclosure to provide a wearable assistive device that enables the power supply unit to provide electric power to the wearable assistive device.

SUMMARY
[0025] The present disclosure relates to the field of wearable assistive devices. More particularly, it relates to a wearable assistive device for navigation of a visually impaired user.
[0026] An aspect of the present disclosure pertains to a wearable assistive device including a clothing that may be adapted to be worn by a user.
[0027] In an aspect, the clothing may include a head covering and footwear that may be configured to support a range sensor, an imaging sensor, an output device and a processing unit communicatively coupled to the range sensor, the imaging sensor and the output device.
[0028] In an aspect, the range sensor may be configured to detect obstruction in path of the user.
[0029] In an aspect, the range sensor may include a light detection and ranging sensor that may be configured to transmit a first set of data packets pertaining to position of obstruction to the processing unit.
[0030] In an aspect, the processing unit may be configured to extract a second set of data packets from the received first set of data packets, the second set of data packets pertaining to a set of logical signals.
[0031] In an aspect, the imaging sensor may be enabled to be activated depending on the set of logical signals that may be facilitated to indicate the presence or absence of obstruction.
[0032] In an aspect, the imaging sensor may be configured to capture a set of images of the obstruction.
[0033] In an aspect, the imaging sensor may be configured to transmit a third set of data packets to the processing unit, the third set of data packets pertaining to the set of image signals of the obstruction.
[0034] In an aspect, the processing unit may be configured to extract a fourth set of data packets from the received third set of data packets, the fourth set of data packets pertaining to a set of features of the obstruction.
[0035] In an aspect, the processing unit may be configured to compare the extracted fourth set of data packets with a fifth set of data packets, the fifth set of data packets being stored in a database operatively coupled to the set of processors of the processing unit.
[0036] In an aspect, the fifth set of data packets may pertain to a set of predefined threshold values of a set of dimensions related to the obstruction.
[0037] In an aspect, the output device may be configured to receive a sixth set of data packets from the processing unit and transmit a set of audio signals pertaining to a set of instructions configured to guide the user for navigation.
[0038] In an aspect, the processing unit may be communicatively coupled to a switch that may be configured to be operated by the user.
[0039] In an aspect, the range sensor, the imaging sensor and the output device may be enabled to be activated or deactivated by a set of input signals received from the switch.
[0040] In an aspect, the wearable assistive device may be facilitated to include a remote control unit that may be coupled to the switch and the processing unit.
[0041] In an aspect, the remote control unit may be configured to receive the set of input signals from the switch and transmit the set of input signals to the processing unit.
[0042] In an aspect, the range sensor, the imaging sensor, the processing unit and the output device may be configured to be coupled to a power supply unit.
[0043] In an aspect, the power supply unit may be configured to provide electric power to the wearable assistive device, the power supply unit being configured to include any or a combination of battery, generator, inverter and power line.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[0044] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0045] The diagrams described herein are for illustration only, which thus are not limitations of the present disclosure, and wherein:
[0046] FIG. 1 illustrates exemplary block diagram of the proposed wearable assistive device (100) for navigation in accordance with an embodiment of the present disclosure.
[0047] FIG. 2 illustrates an exemplary block diagram of the functional components of the processing unit (110) associated with the proposed wearable assistive device for navigation in accordance with an embodiment of the present disclosure.
[0048] FIG. 3 illustrates exemplary views of the proposed wearable assistive device (100) for navigation in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[0049] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0050] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0051] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0052] While embodiments of the present invention have been illustrated and described in the accompanying drawings, the embodiments are offered only in as much detail as to clearly communicate the disclosure and are not intended to limit the numerous equivalents, changes, variations, substitutions and modifications falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0053] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.
[0054] Each of the appended claims defines a separate invention, which for infringement purposes is recognized as including equivalents to the various elements or limitations specified in the claims. Depending on the context, all references below to the "invention" may in some cases refer to certain specific embodiments only. In other cases it will be recognized that references to the "invention" will refer to subject matter recited in one or more, but not necessarily all, of the claims.
[0055] Various terms as used herein are shown below. To the extent a term used in a claim is not defined below, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0056] The present disclosure relates to the field of wearable assistive devices. More particularly, it relates to a wearable assistive device for navigation of a visually impaired user.
[0057] FIG. 1 illustrates exemplary block diagram of the proposed wearable assistive device for navigation (100) in accordance with an embodiment of the present disclosure.
[0058] In an illustrative embodiment, the proposed wearable assistive device for navigation (100) (interchangeably referred to as device (100), herein) may include a clothing (102), comprising a head covering and footwear, that may be adapted to be worn by a user.
[0059] In an embodiment, the head covering of the clothing (102) may be in the form of a cap, helmet, turban and the likes, and the footwear may be in the form of formal shoes, casual shoes, sandals, slippers, roller skates and the likes. In an embodiment, the clothing may be adjustable and of free size for use by multiple users, one at a time. In another embodiment, the clothing may be custom made to fit a particular user.
[0060] In an embodiment, the device (100) may include a processing unit (110) communicatively coupled to a range sensor (104), an imaging sensor (106) and an output device (108).
[0061] In an embodiment, the clothing (102) may be coupled to the range sensor (104), the imaging sensor (106), the output device (108) and the processing unit (110). In an exemplary embodiment, the range sensor (104) may be coupled to the footwear. In another embodiment, the imaging sensor (106) may be coupled to the head covering. In an embodiment, the output device (108) may be coupled to the head covering close to the ears of the user.
[0062] In an embodiment, the range sensor (104) may be configured to generate a set of range signals pertaining to the position of an obstruction in the path of the user. In an embodiment, the range sensor (104) may be configured as a light detection and ranging sensor. The range sensor may be enabled to generate a map of the position of any obstruction located within a predetermined distance from the user. In another embodiment, the range sensor (104) may be mounted on a rotating base attached to the footwear of the clothing (102). The rotating base may be configured to rotate the range sensor (104) and cover a predetermined ranging area surrounding the user. The rotating base may be coupled to a set of actuators, configured to be actuated by the processing unit (110).
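The sweep-and-map behaviour of the rotating range sensor can be sketched as follows. The step angle, the maximum ranging distance, and the `read_range_at` callable are assumptions for illustration, not details from the disclosure:

```python
import math

# Illustrative sketch: a ranging sensor on a rotating base sweeps a set of
# bearings and keeps only returns within a predetermined distance, yielding
# a simple obstruction map around the user. All numeric values are assumed.

MAX_RANGE_M = 3.0  # predetermined ranging distance (assumption)


def sweep_map(read_range_at, step_deg=30):
    """Rotate the sensor through 360 degrees and map nearby obstructions.

    read_range_at(bearing_deg) -> measured distance in metres
                                  (math.inf when nothing is in range).
    Returns {bearing_deg: distance} for returns inside MAX_RANGE_M.
    """
    obstruction_map = {}
    for bearing in range(0, 360, step_deg):
        # In the real device an actuator would point the sensor here.
        distance = read_range_at(bearing)
        if distance <= MAX_RANGE_M:
            obstruction_map[bearing] = distance
    return obstruction_map


# Hypothetical readings: an obstruction at ~1.5 m near bearing 90 degrees.
fake_returns = {90: 1.5, 120: 1.8}
m = sweep_map(lambda b: fake_returns.get(b, math.inf))
```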
[0063] In an embodiment, the imaging sensor (106) may be configured to capture a set of image signals pertaining to a set of features of the obstruction. The set of features may include size, shape, structure and the likes of the obstruction.
[0064] In an embodiment, the imaging sensor may be configured to include cameras like but not limited to webcam, pan-tilt camera, mirror-less camera, action camera, compact camera, thermal camera and solar camera. In an embodiment, the imaging sensor may be mounted on a pan-tilt base, configured to tilt and pan the camera across a predetermined range. The pan-tilt base may be facilitated to be controlled by a set of actuators that may be actuated by the processing unit (110).
[0065] In an embodiment, the output device may be configured to transmit a set of audio signals pertaining to a set of instructions for guiding the user to navigate without colliding with the obstruction. The output device (108) may include one or more speakers, earphones, headphones, Bluetooth headsets and the likes. In an embodiment the output device may be configured to have a volume control key for increasing or reducing the volume of the set of audio signals.
[0066] In an embodiment, the processing unit (110) may be communicatively coupled to the range sensor (104), the imaging sensor (106) and the output device (108) by a communication pathway including but not limited to any or a combination of Wi-Fi, Bluetooth, Li-Fi, Zigbee and the likes. In another embodiment, the communication pathway may be a wireless network, a wired network or a combination thereof that may be implemented as one of the different types of networks, such as Local Area Network (LAN), Wide Area Network (WAN) and the likes.
[0067] In an embodiment, the processing unit may be configured to receive the set of range signals from the range sensor (104) and the set of image signals from the imaging sensor (106). The processing unit (110) may be enabled to generate the set of output signals and transmit the set of output signals to the output device.
[0068] In an embodiment, the processing unit (110) may be communicatively coupled to a switch configured to be operated by the user. The processing unit (110) may be configured to receive a set of input signals from the switch. In an embodiment, the set of input signals may pertain to activation or deactivation of the range sensor (104), the imaging sensor (106) and the output device (108).
[0069] In an embodiment, the device (100) may include a remote control unit coupled to the switch. The remote control unit may be configured to receive the set of input signals from the switch and transmit the set of input signals to the processing unit (110). The remote control unit may include a wireless communication unit for transmission of the set of input signals. The communication pathway between the remote control unit and the processing unit (110) may be configured to include but not limited to any or a combination of Wi-Fi, Bluetooth, Li-Fi, Zigbee and the likes. In another embodiment, the communication pathway may be implemented as one of the different types of networks, such as Local Area Network (LAN), Wide Area Network (WAN) and the likes.
[0070] In an embodiment, the switch may be configured in the form of a button including a tact switch, a joystick, a slide switch and the likes. In an embodiment, the switch may be configured to have touch sensitive surface.
[0071] In an embodiment, the range sensor (104), the imaging sensor (106), the processing unit (110) and the output device (108) may be coupled to a power supply unit. The power supply unit may be enabled to provide electric power to the wearable assistive device. In an embodiment, the power supply unit may include one or more power supply sources that may include any or a combination of a battery, a generator, an inverter and a power line, and the electric power may be in the form of any or a combination of alternating current, direct current and solar power. In an embodiment, the power supply unit may include batteries of the type Lithium Polymer, Lithium Ion, Nickel Cadmium, Nickel Metal Hydride and the likes.
[0072] FIG. 2 illustrates an exemplary block diagram of the functional components of the processing unit (110) associated with the proposed wearable assistive device for navigation (100) in accordance with an embodiment of the present disclosure.
[0073] In an illustrative embodiment, the processing unit (110) may include one or more processors (202). The one or more processors (202) may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processors (202) may be configured to fetch and execute computer-readable instructions stored in a memory (204), operatively coupled to the one or more processors (202). The memory (204) may be configured to store one or more computer-readable instructions or routines, which may be fetched and executed to generate and share data packets over a communication network or channel. The memory (204) may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0074] In an embodiment, the processing unit (110) may also include an interface (206) that can provide a communication pathway among the range sensor (104), the imaging sensor (106), the output device (108) and the likes and the one or more processors (202). The interface (206) may also provide a communication pathway between the one or more processors (202) and other functional components of the processing unit (110) including but not limited to, memory (204) and database (218).
[0075] In an embodiment, the set of processors (202) may be configured to receive a first set of data packets from the range sensor (104). The first set of data packets may pertain to the set of range signals, the set of range signals being related to the position of the obstruction with respect to the user. In an embodiment, the first set of data packets may include a set of optical signals from a light detection and ranging sensor.
[0076] In an embodiment, the set of processors (202) may include an extraction unit (210) that may be configured to extract a second set of data packets from the received first set of data packets. The second set of data packets may include a set of logical signals pertaining to the first set of data packets. The set of logical signals may be configured to indicate the presence or absence of an obstruction ahead in the path of the user. In an embodiment, the second set of data packets may be configured in the form of a computer-readable binary stream corresponding to the position and distance of the obstruction with respect to the user.
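The extraction of logical signals from raw range returns can be sketched in a few lines. The threshold value is an assumption introduced for illustration:

```python
# Illustrative sketch of the extraction unit (210): raw range returns
# (first set of data packets) are reduced to a binary presence/absence
# stream (the "set of logical signals"). The threshold is an assumption.

PRESENCE_THRESHOLD_M = 2.0


def extract_logical_signals(range_returns_m):
    """Map each range return (metres) to 1 (obstruction present) or 0 (absent)."""
    return [1 if r <= PRESENCE_THRESHOLD_M else 0 for r in range_returns_m]


# Three hypothetical returns: near, far, near.
signals = extract_logical_signals([0.8, 2.5, 1.9])
```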
[0077] In an embodiment, the set of processors (202) may be configured to receive a third set of data packets from the imaging sensor (106). The third set of data packets may pertain to the set of image signals, the set of image signals being related to a set of features of the obstruction. The set of features may include shape, size, structure and the likes. In an embodiment, the third set of data packets may include a set of optical signals from the camera.
[0078] In an embodiment, the extraction unit (210) may be configured to extract a fourth set of data packets from the received third set of data packets. The fourth set of data packets may pertain to the set of dimensions corresponding to the third set of data packets. In an embodiment, the fourth set of data packets may be configured in the form of computer readable binary stream corresponding to the set of dimensions including but not limited to size, shape and structure of obstruction.
[0079] In an embodiment, the set of processors (202) may include a comparison unit (212) that may be configured to compare the extracted fourth set of data packets with a fifth set of data packets. The fifth set of data packets may pertain to a set of predefined threshold values corresponding to the set of dimensions related to the set of features of the obstruction. In an embodiment, the fifth set of data packets may be stored as a look up table in a database (218), operatively coupled to the set of processors (202). The fifth set of data packets may be configured in the form of computer readable binary stream corresponding to predefined threshold values like but not limited to length, breadth, width, thickness, diameter, circumference, area, speed and direction.
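The behaviour of the comparison unit against a lookup table of predefined thresholds can be sketched as follows. The dimension names and threshold values are hypothetical, standing in for the database (218) lookup table:

```python
# Illustrative sketch of the comparison unit (212): measured dimensions
# (fourth set of data packets) are checked against a lookup table of
# predefined thresholds (fifth set of data packets). Names and values
# below are assumptions, not taken from the disclosure.

THRESHOLDS = {  # hypothetical stand-in for the database (218) lookup table
    "width_m": 0.5,
    "height_m": 1.0,
    "speed_mps": 0.1,
}


def compare_with_thresholds(measured):
    """Return the names of dimensions whose measured value exceeds its threshold."""
    return [name for name, limit in THRESHOLDS.items()
            if measured.get(name, 0.0) > limit]


# A hypothetical obstruction: wide but low and stationary.
exceeded = compare_with_thresholds({"width_m": 0.8, "height_m": 0.4, "speed_mps": 0.0})
```

Only the dimensions that cross their thresholds need to be mentioned in the eventual audio instruction, which keeps the guidance short.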
[0080] In an embodiment, the set of processors (202) may include an alert generation unit (214) that may be configured to generate a sixth set of data packets pertaining to a set of output signals configured to be transmitted to the output device (108). The sixth set of data packets may be configured in the form of computer readable binary stream corresponding to the set of audio signals. In an embodiment, the set of output signals may be configured as a set of electric pulses. The sixth set of data packets may be transmitted to the output device (108) and then converted to the set of audio signals, configured to guide the user for navigation.
[0081] In an embodiment, the set of processors (202) may include other units (216) that may be configured to implement functionalities that supplement actions performed by the one or more processors (202) of the processing unit (110). In an exemplary embodiment, such actions may include amplification of the set of range signals, noise removal from the set of image signals, automatic calibration of the range and imaging sensors, volume control of the set of audio signals, actuation of the set of actuators and the likes.
[0082] FIG. 3 illustrates exemplary views of the proposed wearable assistive device for navigation (100) in accordance with an embodiment of the present disclosure.
[0083] In an illustrative embodiment of FIG. 3, the device (100) may include a clothing (102) adapted to be worn by the user. The clothing may include a head covering and footwear, the clothing being configured to support a range sensor (104), an imaging sensor (106), an output device (108) and a processing unit (110). The processing unit may be configured to receive a set of range signals pertaining to the position of an obstruction ahead of the user from the range sensor (104) and generate a set of logical values indicating the presence or absence of the obstruction. Depending on the deduced set of logical values, the processing unit (110) may be configured to receive a set of image signals pertaining to a set of features of the obstruction from the imaging sensor (106). The processing unit (110) may be configured to generate a set of output signals that may be transmitted to the output device (108). The output device (108) may be enabled to transmit a set of audio signals pertaining to a set of instructions to guide the user for navigation. The user may be enabled to activate or deactivate the range sensor (104), the imaging sensor (106) and the output device (108) by a switch (not shown) that may be configured to transmit a set of input signals from the user to the processing unit (110). The device (100) may be configured to include a power supply unit (not shown), that may be configured to provide electric power to the range sensor (104), the imaging sensor (106), the output device (108) and the processing unit (110).
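The end-to-end flow described for FIG. 3 can be summarised as one function. The sensors and the audio output are hypothetical stand-ins, and the threshold and instruction wording are assumptions:

```python
# Illustrative end-to-end sketch of the FIG. 3 flow: range signal ->
# logical signal -> (conditional) imaging -> feature extraction -> audio
# instruction. Stand-in callables; not the claimed implementation.

SAFE_DISTANCE_M = 2.0  # assumed threshold


def guide_user(range_m, capture, extract_features, speak):
    """One guidance cycle, mirroring the processing-unit pipeline."""
    obstruction_present = range_m <= SAFE_DISTANCE_M  # logical signal
    if not obstruction_present:
        return speak("path clear")
    features = extract_features(capture())  # imaging activated on demand
    return speak(f"obstruction ahead: {features}, step aside")


# Hypothetical run with a nearby obstruction; speak() records the message.
spoken = []
guide_user(1.0,
           capture=lambda: "frame",
           extract_features=lambda f: "low bench",
           speak=lambda msg: spoken.append(msg) or msg)
```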
[0084] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document terms "coupled to" and "coupled with" are also used euphemistically to mean “communicatively coupled with” over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary device.
[0085] The terms, descriptions and figures used herein are set forth by way of illustration only. Many variations are possible within the spirit and scope of the subject matter, which is intended to be defined by the following claims and their equivalents in which all terms are meant in their broadest reasonable sense unless otherwise indicated.
[0086] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE INVENTION
[0087] The present disclosure provides for a wearable assistive device for navigation of a user, the wearable assistive device including a clothing adapted to be worn by the user.
[0088] The present disclosure provides for a wearable assistive device that facilitates the clothing to be coupled to a range sensor, configured to detect position of obstruction in the path of the user.
[0089] The present disclosure provides for a wearable assistive device that facilitates the clothing to be coupled to an imaging sensor, configured to detect a set of features of the obstruction in the path of the user.
[0090] The present disclosure provides for a wearable assistive device that facilitates the clothing to be coupled to an output device, configured to transmit a set of audio signals configured to guide the user.
[0091] The present disclosure provides for a wearable assistive device that includes a processing unit, coupled to the range sensor, the imaging sensor and the output device.
[0092] The present disclosure provides for a wearable assistive device that enables the processing unit to receive a set of range signals from the range sensor, the range signals pertaining to position of the obstruction.
[0093] The present disclosure provides for a wearable assistive device that enables the processing unit to receive a set of image signals from the imaging sensor, the imaging signals pertaining to a set of features of the obstruction.
[0094] The present disclosure provides for a wearable assistive device that enables the processing unit to generate a set of output signals pertaining to a set of instructions for guiding the user.
[0095] The present disclosure provides for a wearable assistive device that enables the output device to receive the set of output signals from the processing unit and transmit a set of audio signals pertaining to guidance of the user.
[0096] The present disclosure provides for a wearable assistive device that facilitates the clothing to include a head covering and footwear configured to support the range sensor, imaging sensor, output device and the processing unit.
[0097] The present disclosure provides for a wearable assistive device that enables the processing unit to generate a set of logical signals depending on the set of range signals.
[0098] The present disclosure provides for a wearable assistive device that enables the imaging sensor to be activated depending on the set of logical signals, which indicate the presence or absence of an obstruction.
[0099] The present disclosure provides for a wearable assistive device that enables the processing unit to be communicatively coupled to a switch, configured to be operated by the user.
[00100] The present disclosure provides for a wearable assistive device that enables the range sensor, the imaging sensor and the output device to be activated/deactivated by a set of input signals received from the switch.
[00101] The present disclosure provides for a wearable assistive device that facilitates a remote control unit to be coupled to the switch and the processing unit.
[00102] The present disclosure provides for a wearable assistive device that enables the remote control unit to receive the set of input signals from the switch and transmit the set of input signals to the processing unit.
[00103] The present disclosure provides for a wearable assistive device that enables the range sensor, the imaging sensor, the processing unit and the output device to be coupled to a power supply unit.
[00104] The present disclosure provides for a wearable assistive device that enables the power supply unit to provide electric power to the wearable assistive device.
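The switch and remote control unit behaviour summarised in paragraphs [0099] to [00102] can be sketched as follows. This is a hedged illustration only; the class names and the set_active method are assumptions introduced for clarity, not structures defined in the disclosure:

```python
# Illustrative sketch: a user-operated switch sends a set of input signals,
# relayed by a remote control unit to the processing unit, which activates
# or deactivates the coupled components (range sensor, imaging sensor,
# output device). Class and method names are hypothetical.

class ProcessingUnit:
    def __init__(self, *components):
        self.components = components  # e.g. range sensor, imaging sensor, output device
        self.active = False

    def set_active(self, state: bool):
        # Propagate the activation state to every coupled component.
        self.active = state
        for component in self.components:
            component.powered = state

class RemoteControlUnit:
    def __init__(self, processing_unit):
        self.processing_unit = processing_unit

    def forward(self, input_signal: bool):
        # Relay the switch's input signal to the processing unit.
        self.processing_unit.set_active(input_signal)
```

A single toggle from the switch thus activates or deactivates all three components together, which matches the activation/deactivation behaviour recited above.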

We Claim:

1. A wearable assistive device for navigation (100) of a user, the device (100) comprising:
a clothing (102), adapted to be worn by a user;
a range sensor (104), coupled to the clothing (102), wherein the range sensor is configured to detect a set of range signals pertaining to the position of an obstruction in front of the user;
an imaging sensor (106), coupled to the clothing (102), wherein the imaging sensor is configured to detect a set of image signals pertaining to a set of features of the obstruction;
an output device (108), coupled to the clothing (102), wherein the output device is configured to transmit a set of audio signals pertaining to a set of instructions for guiding the user to navigate;
a processing unit (110) communicatively coupled to the range sensor (104), the imaging sensor (106) and the output device (108), wherein the processing unit (110) comprises a set of processors (202) coupled with a memory (204), said memory storing instructions executable by the set of processors to:
receive a first set of data packets from the range sensor (104), wherein the first set of data packets corresponds to the set of range signals, wherein the set of range signals is related to the position of the obstruction with respect to the user;
extract a second set of data packets from the received first set of data packets, wherein the second set of data packets pertain to a set of logical signals pertaining to the first set of data packets;
receive a third set of data packets from the imaging sensor (106), wherein the third set of data packets corresponds to the set of imaging signals, wherein the set of imaging signals is related to the set of features pertaining to the obstruction;
extract a fourth set of data packets from the received third set of data packets, wherein the fourth set of data packets pertains to a set of dimensions pertaining to the obstruction;
compare the extracted fourth set of data packets with a fifth set of data packets, wherein the fifth set of data packets pertains to a set of predefined threshold values of the set of dimensions pertaining to the obstruction, wherein the fifth set of data packets is stored in a database (218), operatively coupled to the set of processors; and
generate a sixth set of data packets pertaining to a set of output signals and transmit the sixth set of data packets to the output device (108) for transmission of audio signals to the user.
2. The wearable assistive device (100) as claimed in claim 1, wherein the clothing (102) includes a head covering and footwear, wherein the head covering and the footwear are enabled to support the imaging sensor (106), the range sensor (104), the output device (108) and the processing unit (110); wherein the range sensor includes a light detection and ranging sensor.
3. The wearable assistive device (100) as claimed in claim 1, wherein the imaging sensor (106) is configured to be activated upon reception of the set of logical signals generated by the processing unit, wherein the set of logical signals indicate the presence or absence of obstruction.
4. The wearable assistive device (100) as claimed in claim 1, wherein the range sensor (104), the imaging sensor (106) and the output device (108) are activated by a set of input signals received from a switch, wherein the switch is communicatively coupled to the processing unit (110), wherein the switch is operated by the user.
5. The wearable assistive device (100) as claimed in claim 1, wherein the device includes a remote control unit communicatively coupled to the processing unit, wherein the remote control unit is coupled to the switch; wherein the remote control unit upon receiving the set of input signals from the switch, is configured to activate/deactivate the range sensor (104), the imaging sensor (106) and the output device (108).
6. The wearable assistive device (100) as claimed in claim 1, wherein the range sensor (104), the imaging sensor (106), the processing unit (110) and the output device (108) are coupled to a power supply unit, wherein the power supply unit includes any or a combination of a battery, a generator, an inverter and a power line, and wherein the power supply unit is enabled to provide electric power in the form of any or a combination of alternating current, direct current and solar power.
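The comparison step recited in claim 1, in which extracted obstruction dimensions (the fourth set of data packets) are checked against predefined threshold values stored in a database (the fifth set), can be sketched as follows. The threshold values and the dict-based stand-in for the database (218) are illustrative assumptions, not part of the claims:

```python
# Minimal illustrative sketch of claim 1's comparison step: each measured
# dimension of the obstruction is compared with a predefined threshold
# (here a dict standing in for database 218), and an output signal for the
# audio instruction is generated accordingly. Values are assumptions.

THRESHOLDS_DB = {"height_m": 1.0, "width_m": 0.5}  # assumed predefined thresholds

def generate_output_signal(dimensions: dict) -> str:
    # Collect every dimension that exceeds its stored threshold.
    exceeds = [k for k, v in dimensions.items()
               if v > THRESHOLDS_DB.get(k, float("inf"))]
    if exceeds:
        return "Large obstruction ahead; stop and change direction."
    return "Small obstruction ahead; proceed with caution."
```

The returned string corresponds to the sixth set of data packets transmitted to the output device (108) for conversion into audio guidance.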

Documents

Application Documents

# Name Date
1 202111024976-STATEMENT OF UNDERTAKING (FORM 3) [04-06-2021(online)].pdf 2021-06-04
2 202111024976-POWER OF AUTHORITY [04-06-2021(online)].pdf 2021-06-04
3 202111024976-FORM FOR STARTUP [04-06-2021(online)].pdf 2021-06-04
4 202111024976-FORM FOR SMALL ENTITY(FORM-28) [04-06-2021(online)].pdf 2021-06-04
5 202111024976-FORM 1 [04-06-2021(online)].pdf 2021-06-04
6 202111024976-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [04-06-2021(online)].pdf 2021-06-04
7 202111024976-EVIDENCE FOR REGISTRATION UNDER SSI [04-06-2021(online)].pdf 2021-06-04
8 202111024976-DRAWINGS [04-06-2021(online)].pdf 2021-06-04
9 202111024976-DECLARATION OF INVENTORSHIP (FORM 5) [04-06-2021(online)].pdf 2021-06-04
10 202111024976-COMPLETE SPECIFICATION [04-06-2021(online)].pdf 2021-06-04
11 202111024976-Proof of Right [22-07-2021(online)].pdf 2021-07-22
12 202111024976-FORM 18 [10-03-2023(online)].pdf 2023-03-10
13 202111024976-FER.pdf 2024-01-10
14 202111024976-FER_SER_REPLY [08-07-2024(online)].pdf 2024-07-08
15 202111024976-DRAWING [08-07-2024(online)].pdf 2024-07-08
16 202111024976-CORRESPONDENCE [08-07-2024(online)].pdf 2024-07-08
17 202111024976-CLAIMS [08-07-2024(online)].pdf 2024-07-08
18 202111024976-US(14)-HearingNotice-(HearingDate-20-05-2025).pdf 2025-04-22
19 202111024976-FORM-26 [15-05-2025(online)].pdf 2025-05-15
20 202111024976-Correspondence to notify the Controller [15-05-2025(online)].pdf 2025-05-15
21 202111024976-Written submissions and relevant documents [04-06-2025(online)].pdf 2025-06-04
22 202111024976-PatentCertificate17-09-2025.pdf 2025-09-17
23 202111024976-IntimationOfGrant17-09-2025.pdf 2025-09-17

Search Strategy

1 202111024976E_30-11-2023.pdf
