Abstract: The present disclosure relates to an apparatus (100) for assisting a user. The apparatus includes a frame (102) adapted to be worn on a neck part of the user, an image capturing unit (104) that captures a set of images of objects in the vicinity of the user, and a microcontroller (106) that receives, from the image capturing unit, the set of images of objects in the vicinity of the user. The microcontroller extracts, from the received set of images, physical attributes of the objects and compares the extracted physical attributes with a reference set of attributes to extract a set of values, wherein, upon detection of the set of values of the object in the vicinity of the user, the microcontroller is configured to generate an audio signal pertaining to real-time information of the objects, which is transmitted to an earpiece (110) of the user.
Description: TECHNICAL FIELD
[0001] The present disclosure relates, in general, to an assistance device, and more specifically, to an apparatus and method for assisting a visually impaired user.
BACKGROUND
[0002] The human visual system plays a significant role in the recognition of surrounding information. Because a visual signal carries more data than an auditory one, visual signals are more effective than auditory signals for conveying information to a human being. However, in the case of blind people, the absence of this sensory input limits the identification of surrounding information. A blind person must therefore rely on a subject speaking in order to recognise that subject nearby.
[0003] A few existing systems have been designed to address the same problem; however, these systems suffer from bulkiness, in that the hardware components they use are so bulky as to limit their usability, and they require internet connectivity for transferring information.
[0004] Therefore, there is a need in the art for a compact device that transfers real-time information to a visually impaired person and solves the aforementioned problems.
OBJECTS OF THE PRESENT DISCLOSURE
[0005] A general object of the present disclosure is to provide an assistance device, and more specifically, an apparatus and method for assisting a visually impaired user.
[0006] Another object of the present disclosure is to provide an apparatus that can be compact and requires fewer hardware components to construct.
[0007] Another object of the present disclosure is to provide an apparatus that can be user-friendly, as the user can wear the apparatus as a pendant without carrying any extra baggage.
[0008] Another object of the present disclosure is to provide an apparatus that can be available at an affordable cost.
[0009] Another object of the present disclosure is to provide an apparatus that can transfer real-time information to the visually impaired person effectively without internet connectivity.
[0010] Yet another object of the present disclosure is to provide an apparatus that can assist a visually impaired user.
SUMMARY
[0011] The present disclosure relates, in general, to an assistance device, and more specifically, to an apparatus and method for assisting a visually impaired user.
[0012] In an aspect, the present disclosure provides an apparatus for assisting a user, the apparatus including a frame adapted to be worn on a neck part of the user; an image capturing unit configured on the frame of the apparatus to capture a set of images of objects in the vicinity of the user; an earpiece coupled to the frame of the apparatus to receive an audio signal; and a microcontroller operatively coupled to the image capturing unit and to a memory, the memory storing instructions executable by the microcontroller to: receive, from the image capturing unit, the set of images of objects in the vicinity of the user; extract, from the received set of images, physical attributes of the objects in the vicinity of the user; and compare the extracted physical attributes of the objects with a reference set of attributes to extract a set of values of the object in the vicinity of the user, wherein, upon detection of the set of values of the object in the vicinity of the user, the microcontroller is configured to generate an audio signal, the audio signal being transmitted to the earpiece of the user and pertaining to real-time information of the objects in the vicinity of the user.
[0013] In an embodiment, the image capturing unit can be a Bluetooth-enabled camera.
[0014] In another embodiment, the apparatus comprises a light-dependent resistor (LDR) coupled to the camera to adjust the shutter speed to an appropriate level.
[0015] In another embodiment, the set of images of the objects can include family members, friends, colleagues, text, books, statues, and tourist places.
[0016] In another embodiment, the reference set of attributes can include a pre-defined set of images of the objects.
[0017] In another embodiment, the generated audio signal is transmitted to the earpiece by a communication module.
[0018] In another embodiment, the communication module is a wireless connection module.
[0019] In another embodiment, the generated audio signal comprises the name, relation, and job role of the object in the vicinity of the user.
[0020] In an aspect, the present disclosure provides a method for assisting a user, the method comprising: capturing, by an image capturing unit, a set of images of objects in the vicinity of the user, the image capturing unit configured on the frame of the apparatus, the frame adapted to be worn on a neck part of the user, and an earpiece coupled to the frame of the apparatus to receive an audio signal; receiving, at a computing device, from the image capturing unit, the set of images of objects in the vicinity of the user; extracting, at the computing device, from the received set of images, physical attributes of the objects in the vicinity of the user; and comparing, at the computing device, the extracted physical attributes of the objects with a reference set of attributes to extract a set of values of the object in the vicinity of the user, wherein, upon detection of the set of values of the object in the vicinity of the user, the computing device is configured to generate an audio signal, the audio signal being transmitted to the earpiece of the user and pertaining to real-time information of the objects in the vicinity of the user.
[0021] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The following drawings form part of the present specification and are included to further illustrate aspects of the present disclosure. The disclosure may be better understood by reference to the drawings in combination with the detailed description of the specific embodiments presented herein.
[0023] FIG. 1A illustrates an exemplary representation of an apparatus for assisting a visually impaired user, in accordance with an embodiment of the present disclosure.
[0024] FIG. 1B illustrates exemplary functional components of the apparatus for assisting a visually impaired user, in accordance with an embodiment of the present disclosure.
[0025] FIG. 2 illustrates an exemplary flow diagram of a method for assisting a visually impaired user, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0026] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0027] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0028] The present disclosure relates, in general, to an assistance device, and more specifically, to an apparatus and method for assisting a visually impaired user. The present disclosure recognizes surrounding information such as a person, text, books, a statue, a tourist place, and the like through a camera attached to a pendant of the visually impaired person, converts the received data into an audio signal, and conveys the message to the person via a Bluetooth device attached to the pendant. The present disclosure is described in enabling detail in the following examples, which may represent more than one embodiment of the present disclosure.
[0029] FIG. 1A illustrates an exemplary representation of an apparatus for assisting a visually impaired user, in accordance with an embodiment of the present disclosure.
[0030] Referring to FIG. 1A, an apparatus 100 is configured to assist in recognizing an object present in the vicinity of the visually impaired user. The apparatus 100 can include a frame 102 that can include an image capturing unit 104, a microcontroller 106, a memory 108, and a light-dependent resistor (LDR) 112 (as illustrated in FIG. 1B and described in detail below). The apparatus 100 can include an earpiece 110, which can be communicatively coupled with the frame 102 of the apparatus 100. The present disclosure can transfer real-time information to the visually impaired person effectively, without internet connectivity.
[0031] In an exemplary embodiment, the apparatus 100, as presented in the example, is a pendant worn by a visually impaired user. As can be appreciated, the present disclosure may not be limited to this configuration but may be extended to other configurations. The frame 102 can be adapted to be worn on a neck part of the user. The image capturing unit 104 captures an image of an object present in the vicinity of the user and transfers the information to the microcontroller 106 for further processing, where the user can be the visually impaired user/person.
[0032] FIG. 1B illustrates exemplary functional components of the apparatus for assisting a visually impaired user, in accordance with an embodiment of the present disclosure.
[0033] In an exemplary embodiment, the image capturing unit 104 is a Bluetooth-enabled camera. The Bluetooth-enabled camera can be aligned to the direction of view of the user. The image capturing unit 104 is configured to capture a set of images of objects present in the vicinity of the user. The light-dependent resistor (LDR) 112 is coupled to the camera to adjust the shutter speed to an appropriate level. The microcontroller 106 is operatively coupled to the image capturing unit 104 and is configured to receive the set of images from the image capturing unit 104.
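The LDR-based shutter adjustment described above can be sketched as a simple mapping from ambient-light reading to exposure time. This is an illustrative sketch only, not part of the disclosure: the ADC range and exposure bounds are assumed values, and the actual camera and ADC interfaces are not shown.

```python
def exposure_from_ldr(ldr_reading, adc_max=1023,
                      min_exposure_ms=1.0, max_exposure_ms=100.0):
    """Return an exposure time in milliseconds from a raw LDR reading.

    A high reading (bright scene) maps to a short exposure; a low
    reading (dark scene) maps to a long exposure. All bounds here are
    assumed for illustration.
    """
    if not 0 <= ldr_reading <= adc_max:
        raise ValueError("LDR reading out of ADC range")
    brightness = ldr_reading / adc_max  # 0.0 (dark) .. 1.0 (bright)
    # Linear interpolation between the long and short exposure limits.
    return max_exposure_ms - brightness * (max_exposure_ms - min_exposure_ms)
```

In a real device, the returned value would be written to the camera's exposure register; a linear map is the simplest choice, though a logarithmic curve may better match LDR response.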
[0034] In an exemplary embodiment, the microcontroller 106 is operatively coupled to the memory 108. The memory 108 can store a pre-defined set of images of all known objects, such as family members, friends, colleagues, text, books, statues, tourist places, and the like, in a database of the memory 108. The microcontroller 106 receives the captured set of images from the image capturing unit. The microcontroller 106 can extract physical attributes of the objects from the received set of images. The microcontroller 106 can compare the extracted physical attributes with a reference set of attributes to extract a set of values of the object in the vicinity of the user. The reference set of attributes can include the pre-defined set of images in the database of the memory 108.
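The compare step above can be sketched as a nearest-neighbour search over attribute vectors. This is a hedged illustration, not the disclosed implementation: the attribute extraction itself (e.g. a face-embedding model) is outside the sketch, and the vectors, records, and distance threshold are assumed.

```python
import math

def match_object(extracted, reference_db, threshold=0.6):
    """Return the reference record closest to the extracted attribute
    vector, or None if nothing lies within the (assumed) threshold.

    `reference_db` maps a record key (e.g. a person's name) to its
    stored attribute vector.
    """
    best_record, best_dist = None, float("inf")
    for record, ref_vec in reference_db.items():
        dist = math.dist(extracted, ref_vec)  # Euclidean distance
        if dist < best_dist:
            best_record, best_dist = record, dist
    return best_record if best_dist <= threshold else None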
[0035] The microcontroller 106, upon detection of the set of values of the object in the vicinity of the user, is configured to generate an audio signal, the audio signal pertaining to the name, relation, job role, and the like of the object in the vicinity of the user. The generated audio signal can be transmitted to the earpiece by a communication module, where the communication module can provide wireless connectivity such as Bluetooth. The earpiece 110 can be attached to the ear of the visually impaired person. The visually impaired person can ask a few questions to confirm that the right person/object is in the vicinity, making the recognition more authentic.
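Composing the announcement from a matched record can be sketched as below. The record fields (`name`, `relation`, `job_role`) and the message template are assumptions for illustration; the actual speech synthesis and Bluetooth transmission to the earpiece (e.g. via a TTS engine) are not shown.

```python
def compose_announcement(record):
    """Build the spoken message text from a matched database record.

    The field names and phrasing are illustrative assumptions, not
    taken from the disclosure.
    """
    parts = [record["name"]]
    if record.get("relation"):
        parts.append(f"your {record['relation']}")
    if record.get("job_role"):
        parts.append(record["job_role"])
    return ", ".join(parts) + ", is in front of you."
```

The resulting string would then be fed to a text-to-speech engine and streamed to the earpiece over the wireless link; keeping message composition separate from synthesis makes the announcement format easy to adjust.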
[0036] For example, a pendant for a blind person can be designed on which the Bluetooth-enabled camera, coupled to the microcontroller 106, is aligned to the direction of the user's view. The image detected by the camera helps to recognize a face from the stored database. Once a face is detected, the pendant conveys an audio message to the blind person, revealing information such as name, relation, job role, and the like.
[0037] The computing device may include the microcontroller 106, which can be in communication with each of the memory 108 and input/output units. The microcontroller 106 may include a microprocessor or other devices capable of being programmed or configured to perform computations and instruction processing in accordance with the disclosure. Such other devices may include microcontrollers, digital signal processors (DSPs), complex programmable logic devices (CPLDs), field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), discrete gate logic, and/or other integrated circuits, hardware, or firmware in lieu of or in addition to a microprocessor.
[0038] The memory 108 can include programmable software instructions that are executed by the microcontroller 106. The microcontroller 106 may be embodied as a single processor or a number of processors. The microcontroller 106 and the memory may each be located, for example, entirely within a single computer or other computing device. The memory, which enables storage of data and programs, may include random-access memory (RAM), read-only memory (ROM), flash memory, and any other form of readable and writable storage medium.
[0039] The embodiments of the present disclosure described above provide several advantages. One or more of the embodiments provide an apparatus 100 that can be compact and requires fewer hardware components to construct. The apparatus 100 can be user-friendly, as the user can wear the apparatus 100 as a pendant without carrying any extra baggage. The apparatus 100 can be available at an affordable cost.
[0040] FIG. 2 illustrates an exemplary flow diagram of a method for assisting a visually impaired user, in accordance with an embodiment of the present disclosure.
[0041] The method 200 can be implemented using a computing device, which can include one or more processors. The method 200 includes, at block 202, capturing, by the image capturing unit, a set of images of objects in the vicinity of the user, the image capturing unit configured on the frame of the apparatus, the frame adapted to be worn on a neck part of the user, and an earpiece coupled to the frame of the apparatus to receive an audio signal.
[0042] At block 204, the computing device can receive, from the image capturing unit, the set of images of objects in the vicinity of the user. At block 206, the computing device can extract, from the received set of images, physical attributes of the objects in the vicinity of the user. At block 208, the computing device can compare the extracted physical attributes of the objects with a reference set of attributes to extract a set of values of the object in the vicinity of the user.
[0043] At block 210, upon detection of the set of values of the object in the vicinity of the user, the computing device is configured to generate an audio signal, the audio signal being transmitted to the earpiece of the user and pertaining to real-time information of the objects in the vicinity of the user.
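The blocks of method 200 can be tied together in a minimal end-to-end sketch. This is an illustration under stated assumptions: the capture, attribute-extraction, matching, and announcement steps are stand-in callables supplied by the caller, and the announcement is returned rather than played to the earpiece.

```python
def assist_user(capture, extract_attributes, match, announce):
    """Run one pass of blocks 202-210; return the announcement or None.

    All four arguments are caller-supplied callables standing in for
    the camera, the attribute extractor, the database comparison, and
    the audio-generation step, respectively.
    """
    images = capture()                                   # blocks 202/204
    attrs = [extract_attributes(img) for img in images]  # block 206
    for vec in attrs:                                    # block 208
        record = match(vec)
        if record is not None:                           # block 210: detection
            return announce(record)                      # -> audio signal
    return None                                          # nothing recognised
```

Structuring the loop this way means the pipeline stays silent when no known object is detected, matching the "upon detection" condition of the method.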
[0044] It will be apparent to those skilled in the art that the apparatus 100 of the disclosure may be provided using some or all of the mentioned features and components without departing from the scope of the present disclosure. While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the scope of the disclosure, as described in the claims.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0045] The present disclosure provides an apparatus that can be compact and require less hardware components to construct.
[0046] The present disclosure provides an apparatus that can be user-friendly, as the user can wear the apparatus as pendant without carrying any extra luggage.
[0047] The present disclosure provides an apparatus that can be available at affordable cost.
[0048] The present disclosure provides an apparatus that can assist visually impaired user.
[0049] The present disclosure provides an apparatus that can transfer real-time information to the visually impaired person effectively without internet connectivity.
We Claim:
1. An apparatus (100) for assisting a user, the apparatus comprising:
a frame (102) adapted to be worn on a neck part of the user;
an image capturing unit (104) configured on the frame of the apparatus to capture a set of images of objects in the vicinity of the user;
an earpiece (110) coupled to the frame of the apparatus to receive an audio signal; and
a microcontroller (106) operatively coupled to the image capturing unit and to a memory (108), the memory storing instructions executable by the microcontroller to:
receive, from the image capturing unit, the set of images of objects in the vicinity of the user;
extract, from the received set of images, physical attributes of the objects in the vicinity of the user; and
compare the extracted physical attributes of the objects with a reference set of attributes to extract a set of values of the object in the vicinity of the user,
wherein, upon detection of the set of values of the object in the vicinity of the user, the microcontroller is configured to generate an audio signal, the audio signal being transmitted to the earpiece of the user and pertaining to real-time information of the objects in the vicinity of the user.
2. The apparatus as claimed in claim 1, wherein the image capturing unit (104) is a Bluetooth-enabled camera.
3. The apparatus as claimed in claim 2, wherein the apparatus (100) comprises a light-dependent resistor (LDR) (112) coupled to the camera to adjust the shutter speed to an appropriate level.
4. The apparatus as claimed in claim 1, wherein the set of images of the objects comprises family members, friends, colleagues, text, books, statues, and tourist places.
5. The apparatus as claimed in claim 1, wherein the reference set of attributes comprises a pre-defined set of images of the objects.
6. The apparatus as claimed in claim 1, wherein the generated audio signal is transmitted to the earpiece by a communication module.
7. The apparatus as claimed in claim 6, wherein the communication module is a wireless connection module.
8. The apparatus as claimed in claim 6, wherein the generated audio signal comprises the name, relation, and job role of the object in the vicinity of the user.
9. A method (200) for assisting a user, the method comprising:
capturing (202), by an image capturing unit, a set of images of objects in the vicinity of the user, the image capturing unit configured on a frame of an apparatus, the frame adapted to be worn on a neck part of the user, and an earpiece coupled to the frame of the apparatus to receive an audio signal;
receiving (204), at a computing device, from the image capturing unit, the set of images of objects in the vicinity of the user;
extracting (206), at the computing device, from the received set of images, physical attributes of the objects in the vicinity of the user; and
comparing (208), at the computing device, the extracted physical attributes of the objects with a reference set of attributes to extract a set of values of the object in the vicinity of the user, wherein, upon detection of the set of values of the object in the vicinity of the user, the computing device is configured to generate (210) an audio signal, the audio signal being transmitted to the earpiece of the user and pertaining to real-time information of the objects in the vicinity of the user.
| # | Name | Date |
|---|---|---|
| 1 | 202111023201-STATEMENT OF UNDERTAKING (FORM 3) [25-05-2021(online)].pdf | 2021-05-25 |
| 2 | 202111023201-POWER OF AUTHORITY [25-05-2021(online)].pdf | 2021-05-25 |
| 3 | 202111023201-FORM FOR STARTUP [25-05-2021(online)].pdf | 2021-05-25 |
| 4 | 202111023201-FORM FOR SMALL ENTITY(FORM-28) [25-05-2021(online)].pdf | 2021-05-25 |
| 5 | 202111023201-FORM 1 [25-05-2021(online)].pdf | 2021-05-25 |
| 6 | 202111023201-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [25-05-2021(online)].pdf | 2021-05-25 |
| 7 | 202111023201-EVIDENCE FOR REGISTRATION UNDER SSI [25-05-2021(online)].pdf | 2021-05-25 |
| 8 | 202111023201-DRAWINGS [25-05-2021(online)].pdf | 2021-05-25 |
| 9 | 202111023201-DECLARATION OF INVENTORSHIP (FORM 5) [25-05-2021(online)].pdf | 2021-05-25 |
| 10 | 202111023201-COMPLETE SPECIFICATION [25-05-2021(online)].pdf | 2021-05-25 |
| 11 | 202111023201-Proof of Right [10-07-2021(online)].pdf | 2021-07-10 |
| 12 | 202111023201-FORM 18 [23-02-2023(online)].pdf | 2023-02-23 |
| 13 | 202111023201-FER.pdf | 2024-11-04 |
| 14 | 202111023201-FORM-5 [03-05-2025(online)].pdf | 2025-05-03 |
| 15 | 202111023201-FER_SER_REPLY [03-05-2025(online)].pdf | 2025-05-03 |
| 16 | 202111023201-DRAWING [03-05-2025(online)].pdf | 2025-05-03 |
| 17 | 202111023201-CORRESPONDENCE [03-05-2025(online)].pdf | 2025-05-03 |
| 1 | SearchHistory202111023201E_24-10-2024.pdf | 2024-10-24 |