
Wearable Assistance Device For Deaf User

Abstract: A wearable assistance device for a deaf user is provided. The device receives sounds in the vicinity of the deaf user using a microphone and classifies the received sounds using a voice recognition module. Upon classification of at least one of the received sounds into a particular predefined classification category, that sound is determined to be a warning sound. Further, a vibrator operatively coupled with the wearable assistance device is activated to alert the deaf user to the warning sound. The device captures the received sounds such that speech from a person talking to the deaf user, determined from the captured sounds, is translated into text for the deaf user. This is done using a speech-to-text translator operatively coupled with the device.


Patent Information

Filing Date: 31 July 2020
Publication Number: 05/2022
Publication Type: INA
Invention Field: BIO-MEDICAL ENGINEERING
Email: info@khuranaandkhurana.com
Grant Date: 2024-02-09

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. SHRIVASTAVA, Anchit
#43, Skynet Enclave, Lohgarh Road, Zirakpur - 140603, Punjab, India.
2. GUPTA, Sheifali
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
3. GUPTA, Deepali
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
4. GUPTA, Rupesh
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
5. VERMA, Vishal
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
6. GUPTA, Kamali
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
7. GOYAL, Rakesh
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
8. GUPTA, Raman
House Number 203, Sector 2, Kurukshetra - 136118, Haryana, India.

Specification

[0001] The present invention relates generally to a wearable smart device for
notifying a deaf user. More particularly, a wearable assistance device of the present disclosure
facilitates assistance and life management for deaf people.
BACKGROUND
[0002] Certain individuals lacking sensory capabilities, such as a hearing-disabled individual, cannot detect certain information in their environment in the same manner as an individual having all of his senses intact. Because of this, these individuals may not become aware of notifications, or of being addressed, due to their lost or muted senses. These individuals may be at a higher risk of danger because they cannot detect notifications, if any, generated for them in their surroundings.
[0003] Further, individuals lacking sensory capabilities, i.e., deaf people, do not hear informational sounds such as a doorbell or the beep of a microwave oven. Worst of all, they do not hear the speech of other people. This makes communication with other people very difficult and frustrating. Because of this, deaf people often feel excluded from meetings due to the lack of proper communication, and live with the constant worry that someone will approach from behind, causing them to be scared if someone suddenly touches them from the back.
[0004] Thus, there is a need for a wearable device that detects information and sounds
corresponding to a deaf user's environment and converts that information into feedback that
can be detected by a deaf user.
OBJECTS OF THE PRESENT DISCLOSURE
[0005] Some of the objects of the present disclosure, aimed at ameliorating one or more problems of the prior art or at least providing a useful alternative, are listed herein below.
[0006] An object of the present disclosure is to provide a device that facilitates notifying a deaf user when someone calls their name.
[0007] Another object of the present disclosure is to provide a device that facilitates converting captured sounds into a text format and displaying it for a deaf user.
[0008] Another object of the present disclosure is to provide a device that is wearable
so as to facilitate providing notifications to a deaf user.
[0009] Another object of the present disclosure is to provide a device that facilitates a deaf user providing messages in a text format to be converted to speech for notifying other users.
SUMMARY OF THE INVENTION
[00010] The present invention relates generally to a wearable smart device for
notifying a deaf user. More particularly, a wearable assistance device of the present disclosure
facilitates assistance and life management for deaf people.
[00011] According to an aspect of the present disclosure, there is provided a wearable assistance device for a deaf user, said device comprising: a memory and a hardware processor configured to: receive sounds in the vicinity of the deaf user using a microphone and provide a
classification of the received sounds using a voice recognition module, upon the classification
of at least one of the received sounds in a particular predefined classification category,
determine the at least one of the received sound as a warning sound, activate a vibrator
operatively coupled with the wearable assistance device to alert the deaf user to the warning
sound, and capture the received sounds such that speech from a person talking to the deaf
user and determined from the captured received sounds is translated, using a speech to text
translator operatively coupled with the device, into a text for the deaf user.
[00012] According to an embodiment, the device further comprises any or a
combination of a keypad input device and a touch input device.
[00013] According to an embodiment, the device is configured to distinguish between
the speech and the warning sounds.
[00014] According to an embodiment, the particular predefined category is any of a
name, and a specific code associated with the deaf user.
[00015] According to an embodiment, the device further comprises a text to speech
translator for converting a typed message into a speech message.
[00016] According to an embodiment, the speech to text translator is equipped to
execute for multiple languages.
[00017] According to an embodiment, the device further comprises a GPS receiver
configured to obtain location of the deaf user.
[00018] According to an embodiment, the device further comprises an SOS button that is used to communicate a help message to a set of predefined users upon detection of a fall of the deaf user.
[00019] According to an embodiment, the memory is configured to store a set of
predefined credentials of the deaf user, wherein the predefined credentials correspond to any
or a combination of physical statistics, health records and contact numbers.
[00020] Various objects, features, aspects and advantages of the present disclosure will
become more apparent from the following detailed description of preferred embodiments,
along with the accompanying drawing figures in which like numerals represent like features.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[00021] In the figures, similar components and/or features may have the same
reference label. Further, various components of the same type may be distinguished by
following the reference label with a second label that distinguishes among the similar
components. If only the first reference label is used in the specification, the description is
applicable to any one of the similar components having the same first reference label
irrespective of the second reference label.
[00022] FIG. 1 illustrates a wearable assistance device for notifying a deaf user in
accordance with an embodiment of the present disclosure.
[00023] FIG. 2 illustrates exemplary functional components of the wearable assistance
device in accordance with an embodiment of the present disclosure.
[00024] FIG. 3 is a high-level flow diagram illustrating working of the wearable
assistance device in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[00025] In the following description, numerous specific details are set forth in order to
provide a thorough understanding of embodiments of the present invention. It will be
apparent to one skilled in the art that embodiments of the present invention may be practiced
without some of these specific details.
[00026] Embodiments of the present invention include various steps, which will be
described below. The steps may be performed by hardware components or may be embodied
in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, firmware and/or by human operators.
[00027] Embodiments of the present invention may be provided as a computer
program product, which may include a machine-readable storage medium tangibly
embodying thereon instructions, which may be used to program a computer (or other
electronic devices) to perform a process. The machine-readable medium may include, but is
not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc
read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such
as ROMs, PROMs, random access memories (RAMs), programmable read-only memories
(PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash
memory, magnetic or optical cards, or other type of media/machine-readable medium suitable
for storing electronic instructions (e.g., computer programming code, such as software or
firmware).
[00028] Various methods described herein may be practiced by combining one or more
machine-readable storage media containing the code according to the present invention with
appropriate standard computer hardware to execute the code contained therein. An apparatus
for practicing various embodiments of the present invention may involve one or more
computers (or one or more processors within a single computer) and storage systems
containing or having network access to computer program(s) coded in accordance with
various methods described herein, and the method steps of the invention could be
accomplished by modules, routines, subroutines, or subparts of a computer program product.
[00029] If the specification states a component or feature “may”, “can”, “could”, or
“might” be included or have a characteristic, that particular component or feature is not
required to be included or have the characteristic.
[00030] As used in the description herein and throughout the claims that follow, the
meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates
otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on”
unless the context clearly dictates otherwise.
[00031] Exemplary embodiments will now be described more fully hereinafter with
reference to the accompanying drawings, in which exemplary embodiments are shown. This
invention may, however, be embodied in many different forms and should not be construed as
limited to the embodiments set forth herein. These embodiments are provided so that this
invention will be thorough and complete and will fully convey the scope of the invention to
those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the
invention, as well as specific examples thereof, are intended to encompass both structural and
functional equivalents thereof. Additionally, it is intended that such equivalents include both
currently known equivalents as well as equivalents developed in the future (i.e., any elements
developed that perform the same function, regardless of structure).
[00032] While embodiments of the present invention have been illustrated and
described, it will be clear that the invention is not limited to these embodiments only.
Numerous modifications, changes, variations, substitutions, and equivalents will be apparent
to those skilled in the art, without departing from the spirit and scope of the invention, as
described in the claim.
[00033] The present invention relates generally to a wearable smart device for
notifying a deaf user. More particularly, a wearable assistance device of the present disclosure
facilitates assistance and life management for deaf people.
[00034] According to an aspect of the present disclosure, there is provided a wearable assistance device for a deaf user, said device comprising: a memory and a hardware processor configured to: receive sounds in the vicinity of the deaf user using a microphone and provide a
classification of the received sounds using a voice recognition module, upon the classification
of at least one of the received sounds in a particular predefined classification category,
determine the at least one of the received sound as a warning sound, activate a vibrator
operatively coupled with the wearable assistance device to alert the deaf user to the warning
sound, and capture the received sounds such that speech from a person talking to the deaf
user and determined from the captured received sounds is translated, using a speech to text
translator operatively coupled with the device, into a text for the deaf user.
[00035] According to an embodiment, the device further comprises any or a
combination of a keypad input device and a touch input device.
[00036] According to an embodiment, the device is configured to distinguish between
the speech and the warning sounds.
[00037] According to an embodiment, the particular predefined category is any of a
name, and a specific code associated with the deaf user.
[00038] According to an embodiment, the device further comprises a text to speech
translator for converting a typed message into a speech message.
[00039] According to an embodiment, the speech to text translator is equipped to
execute for multiple languages.
[00040] According to an embodiment, the device further comprises a GPS receiver
configured to obtain location of the deaf user.
[00041] According to an embodiment, the device further comprises an SOS button that is used to communicate a help message to a set of predefined users upon detection of a fall of the deaf user.
[00042] According to an embodiment, the memory is configured to store a set of
predefined credentials of the deaf user, wherein the predefined credentials correspond to any
or a combination of physical statistics, health records and contact numbers.
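The predefined credentials described above can be modelled as a simple record. The sketch below is illustrative only; the field names and types are assumptions, not taken from the specification.

```python
# Hypothetical sketch of the credential store described above; field names
# and types are illustrative assumptions, not from the specification.
from dataclasses import dataclass, field


@dataclass
class UserCredentials:
    physical_statistics: dict = field(default_factory=dict)  # e.g. height, weight
    health_records: list = field(default_factory=list)       # free-form entries
    contact_numbers: list = field(default_factory=list)      # predefined contacts
```

Using `default_factory` ensures each device instance gets its own mutable containers rather than sharing one list across instances.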
[00043] Referring to the drawings, the invention will now be described in more detail.
[00044] FIG. 1 illustrates a wearable assistance device for notifying a deaf user in
accordance with an embodiment of the present disclosure.
[00045] In an aspect, the wearable assistance device 100 may comprise one or more
processor(s) 202. The one or more processor(s) 202 may be implemented as one or more
microprocessors, microcomputers, microcontrollers, digital signal processors, central
processing units, logic circuitries, and/or any devices that manipulate data based on
operational instructions. Among other capabilities, the one or more processor(s) 202 are
configured to fetch and execute computer-readable instructions stored in a memory of the
device 100. The memory may store one or more computer-readable instructions or routines,
which may be fetched and executed to create or share the data units over a network service.
The memory may include any non-transitory storage device including, for example, volatile
memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[00046] The device 100 may also comprise an interface(s). The interface(s) may
comprise a variety of interfaces, for example, interfaces for data input and output devices,
referred to as I/O devices, storage devices, and the like. The interface(s) may facilitate
communication of device. The interface(s) may also provide a communication pathway for
one or more components of the processing engine. Examples of such components include, but
are not limited to, processing engine(s) and database.
[00047] The processing engine(s) may be implemented as a combination of hardware
and programming (for example, programmable instructions) to implement one or more
functionalities of the processing engine(s). The processing engine(s) is stored on the memory
and runs on the processor(s). In examples described herein, such combinations of hardware
and programming may be implemented in several different ways. For example, the
programming for the processing engine(s) may be processor executable instructions stored on
a non-transitory machine-readable storage medium and the hardware for the processing
engine(s) may comprise a processing resource (for example, one or more processors), to
execute such instructions. In the present examples, the machine-readable storage medium
may store instructions that, when executed by the processing resource, implement the
processing engine(s). In such examples, the device 100 may comprise the machine-readable
storage medium storing the instructions and the processing resource to execute the
instructions, or the machine-readable storage medium may be separate but accessible to
device 100 and the processing resource. In other examples, the processing engine(s) may be
implemented by electronic circuitry.
[00048] The database may comprise data that is either stored or generated as a result of
functionalities implemented by any of the components of the processing engine(s) or the
device 100.
[00049] In an embodiment, the device 100 includes a display 104 as the interface, and an input interface such as a touch screen 106. Also included is a speaker 108 for outputting text entered by a deaf user in an audible form. The device 100 includes a vibrator motor 112. The vibrator motor 112 vibrates when the name called out by another speaker matches the name of the deaf user, so that the deaf user is notified by the vibration action of the vibrator motor 112. The device includes a microphone 116 for taking user inputs and providing them to the device 100. The voice recognition unit 114 recognizes the voice and compares it with a predefined set of names; upon a match, the deaf user is notified. Also, a speech-to-text unit and a text-to-speech unit are provided for the requisite functions for the deaf user.
[00050] In an embodiment, the device receives sounds in the vicinity of the deaf user using the microphone 116 and classifies the received sounds using the voice recognition unit 114. Upon classification of at least one of the received sounds into a particular predefined classification category, that sound is determined to be a warning sound. The particular predefined category is any of a name and a specific code associated with the deaf user.
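The classification step described above can be sketched as follows. This is a hedged, illustrative sketch only: the category entries ("mike", "alert-7"), function names, and exact-match logic are assumptions, whereas a real device would use an actual voice recognition model.

```python
# Illustrative sketch of the classification step: a recognized utterance is a
# warning sound when it falls into a predefined category (the user's name or
# a specific code). Entries below are hypothetical, not from the specification.
from typing import Optional

PREDEFINED_CATEGORIES = {
    "name": {"mike"},              # names associated with the deaf user
    "specific_code": {"alert-7"},  # an agreed code word (hypothetical)
}


def classify_sound(recognized_text: str) -> Optional[str]:
    """Return the matched predefined category for an utterance, else None."""
    token = recognized_text.strip().lower()
    for category, entries in PREDEFINED_CATEGORIES.items():
        if token in entries:
            return category
    return None


def is_warning_sound(recognized_text: str) -> bool:
    """A sound is a warning sound when it matches any predefined category."""
    return classify_sound(recognized_text) is not None
```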
[00051] The device activates a vibrator operatively coupled with the wearable assistance device to alert the deaf user to the warning sound. Further, the device captures the received sounds such that speech from a person talking to the deaf user, determined from the captured sounds, is translated, using a speech-to-text translator operatively coupled with the device, into text for the deaf user. The speech-to-text translator is equipped to operate in multiple languages.
[00052] In addition, the device may further include a text-to-speech translator for converting a typed message into a speech message. The device may further include a GPS receiver configured to obtain the location of the deaf user, and the device is configured to detect a fall of the deaf user. The device may include any or a combination of a keypad input device and a touch input device, and may be configured to distinguish between the speech and the warning sounds. In an embodiment, the device may be equipped to operate in multiple languages. In an embodiment, the device further includes an SOS button that may be used to communicate a help message to a set of predefined users upon detection of a fall of the deaf user.
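The fall-detection and SOS behaviour described above can be sketched as follows. The acceleration threshold, contact list, and `send_message` callback are illustrative assumptions; a real device would read an accelerometer and dispatch through an actual messaging service.

```python
# Minimal sketch of the fall-detection/SOS path; threshold and callback
# interface are hypothetical assumptions, not from the specification.

FALL_THRESHOLD_G = 2.5  # assumed acceleration spike (in g) indicating a fall


def detect_fall(acceleration_g: float) -> bool:
    """Treat any reading at or above the threshold as a fall."""
    return acceleration_g >= FALL_THRESHOLD_G


def send_sos(location, contacts, send_message) -> int:
    """Send a help message with the GPS location to each predefined user."""
    lat, lon = location
    text = f"SOS: fall detected at lat={lat}, lon={lon}"
    for contact in contacts:
        send_message(contact, text)
    return len(contacts)
```

Modelling the transport as a callback keeps the sketch testable without committing to any particular network stack.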
[00053] FIG. 2 illustrates exemplary functional components of the wearable assistance
device in accordance with an embodiment of the present disclosure.
[00054] According to an embodiment, the wearable assistance device may connect to a
network for communicating with other similar devices. The network can be implemented as
one of the different types of networks, such as intranet, local area network (LAN), wide area
network (WAN), the internet, Wi-Fi, LTE network, CDMA network, and the like. Further, the
network can either be a dedicated network or a shared network. The shared network
represents an association of the different types of networks that use a variety of protocols, for
example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet
Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with
one another. Further the network 118 can include a variety of network devices, including
routers, bridges, servers, computing devices, storage devices, and the like.
[00055] In an embodiment, at block 202, an input from a speaker is received by the device for the deaf user. At block 204, the input name from the speaker is compared with a set of predefined names. Upon determination of a match at block 204, a vibratory motor is activated at block 206, a speech-to-text translator is activated by the device, and the received message is displayed on the screen of the device. Also, at block 210, text inputted by the deaf user can be converted to speech.
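The block flow above (202 → 204 → 206, then display) can be sketched with the vibrator and display modelled as plain callables. The function and parameter names here are illustrative assumptions, not part of the disclosure.

```python
# Sketch of blocks 202-206: receive recognized speech, compare the leading
# name with a predefined set, then vibrate and display on a match. All
# names below are hypothetical.

def handle_speaker_input(recognized_text, predefined_names, vibrate, display):
    """Block 204: match the spoken name; block 206: vibrate, then show text."""
    words = recognized_text.lower().split()
    if words and words[0] in predefined_names:
        vibrate()                 # alert the deaf user via the vibratory motor
        display(recognized_text)  # speech-to-text output shown on screen
        return True
    return False
```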
[00056] FIG. 3 is a high-level flow diagram illustrating working of the wearable
assistance device in accordance with an embodiment of the present disclosure.
[00057] Embodiments of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of software and hardware that may all generally be referred to herein as a “circuit,” “module,” “component,” or “system.” Furthermore, aspects of the present
disclosure may take the form of a computer program product comprising one or more
computer readable media having computer readable program code embodied thereon.
[00058] In an embodiment, the wearable assistance device may have a voice recognition module 302 connected to a display unit 304. Further, the device also includes a vibrator 306. As illustrated at Step 1, upon a user calling a deaf user named 'Mike', the voice recognition module can capture and determine the name. At Step 2, if the input as determined by the voice recognition module 302 matches a predefined name, the vibrator motor 306 is set to 'ON'. Further, if the input does not match the predefined name, at step 4, the device is ready to obtain the next set of input.
[00059] In an embodiment, if the input matches the name at step 2, then at step 5 additional inputs are captured to determine what additional information is being said after the name, and the same is communicated to the deaf user named 'Mike'.
[00060] In an embodiment, the device takes input from the microphone. The input is processed in a voice recognition module. If the input matches the predefined set of names, the device enables the vibrators to vibrate. While the vibrators are busy informing the user, the microphone may take more inputs to capture what is said after the name. With the help of the speech-to-text translator, the deaf user will have that message displayed on the screen of the device. The deaf user can also manually enable the speech-to-text translator to enhance communication if required. They also have the option to use the on-screen keyboard to type a message, which will be converted to speech and output via a speaker. In an additional embodiment, the device may be made to interpret multiple languages.
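The typed-message path and the multi-language option described above can be sketched as a small registry of translators plus a speaker callback. The registry, language codes, and callbacks are illustrative assumptions, not an API from the specification.

```python
# Sketch of the typed-message-to-speech path with an optional translation
# step; all names and the registry scheme are hypothetical.

TRANSLATORS = {}  # language code -> translation function


def register_translator(lang_code, translate_fn):
    """Register a translation function for a language code."""
    TRANSLATORS[lang_code] = translate_fn


def speak_typed_message(typed_text, lang_code, speak):
    """Strip, translate (if a translator is registered), and speak a message."""
    translate = TRANSLATORS.get(lang_code, lambda text: text)
    spoken = translate(typed_text.strip())
    if spoken:
        speak(spoken)  # on a real device, a text-to-speech engine
    return spoken
```

Falling back to an identity function when no translator is registered lets the same path serve single-language and multi-language configurations.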
[00061] Thus, it will be appreciated by those of ordinary skill in the art that the
diagrams, schematics, illustrations, and the like represent conceptual views or processes
illustrating systems and methods embodying this invention. The functions of the various
elements shown in the figures may be provided through the use of dedicated hardware as well
as hardware capable of executing associated software. Similarly, any switches shown in the
figures are conceptual only. Their function may be carried out through the operation of
program logic, through dedicated logic, through the interaction of program control and
dedicated logic, or even manually, the particular technique being selectable by the entity
implementing this invention. Those of ordinary skill in the art further understand that the
exemplary hardware, software, processes, methods, and/or operating systems described
herein are for illustrative purposes and, thus, are not intended to be limited to any particular
named.
[00062] As used herein, and unless the context dictates otherwise, the term "coupled
to" is intended to include both direct coupling (in which two elements that are coupled to
each other contact each other) and indirect coupling (in which at least one additional element
is located between the two elements). Therefore, the terms "coupled to" and "coupled with"
are used synonymously. Within the context of this document terms "coupled to" and "coupled
with" are also used euphemistically to mean “communicatively coupled with” over a
network, where two or more devices are able to exchange data with each other over the
network, possibly via one or more intermediary device.
[00063] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts
herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the
appended claims. Moreover, in interpreting both the specification and the claims, all terms
should be interpreted in the broadest possible manner consistent with the context. In
particular, the terms “comprises” and “comprising” should be interpreted as referring to
elements, components, or steps in a non-exclusive manner, indicating that the referenced
elements, components, or steps may be present, or utilized, or combined with other elements,
components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C …. and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
[00064] While the foregoing describes various embodiments of the invention, other
and further embodiments of the invention may be devised without departing from the basic
scope thereof. The scope of the invention is determined by the claims that follow. The
invention is not limited to the described embodiments, versions or examples, which are
included to enable a person having ordinary skill in the art to make and use the invention
when combined with information and knowledge available to the person having ordinary skill
in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[00065] The present disclosure provides a device that facilitates notifying a deaf user when someone calls their name.
[00066] The present disclosure provides a device that facilitates converting captured sounds into a text format and displaying it for a deaf user.
[00067] The present disclosure provides a device that is wearable so as to facilitate providing notifications to a deaf user.
[00068] The present disclosure provides a device that facilitates a deaf user providing messages in a text format to be converted to speech for notifying other users.

We Claim:

1. A wearable assistance device for a deaf user, said device comprising:
a memory and a hardware processor configured to:
receive sounds in vicinity of the deaf user using a microphone and
provide a classification of the received sounds using a voice recognition
module,
upon the classification of at least one of the received sounds in a particular
predefined classification category, determine the at least one of the received
sound as a warning sound,
activate a vibrator operatively coupled with the wearable assistance device to
alert the deaf user to the warning sound, and
capture the received sounds such that speech from a person talking to the deaf
user and determined from the captured received sounds is translated, using a
speech to text translator operatively coupled with the device, into a text for the
deaf user.
2. The device as claimed in claim 1, wherein the device further comprises any or a
combination of a keypad input device and a touch input device.
3. The device as claimed in claim 1, wherein the device is configured to distinguish
between the speech and the warning sounds.
4. The device as claimed in claim 1, wherein the particular predefined category is any of
a name, and a specific code associated with the deaf user.
5. The device as claimed in claim 1, wherein the device further comprises a text to
speech translator for converting a typed message into a speech message.
6. The device as claimed in claim 1, wherein the speech to text translator is equipped to
execute for multiple languages.
7. The device as claimed in claim 1, wherein the device further comprises a GPS
receiver configured to obtain location of the deaf user.
8. The device as claimed in claim 1, wherein the device is configured to detect a fall of
the deaf user.
9. The device as claimed in claim 8, wherein the device further comprises a SOS button
that is used to communicate a help message to a set of predefined users upon the
detection of fall of the deaf user.
10. The device as claimed in claim 1, wherein the memory is configured to store a set of
predefined credentials associated with the deaf user, wherein the predefined
credentials comprises any or a combination of physical statistics, health records and
contact numbers.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 202011032976-IntimationOfGrant09-02-2024.pdf 2024-02-09
2 202011032976-STATEMENT OF UNDERTAKING (FORM 3) [31-07-2020(online)].pdf 2020-07-31
3 202011032976-FORM FOR STARTUP [31-07-2020(online)].pdf 2020-07-31
4 202011032976-PatentCertificate09-02-2024.pdf 2024-02-09
5 202011032976-FORM FOR SMALL ENTITY(FORM-28) [31-07-2020(online)].pdf 2020-07-31
6 202011032976-Annexure [03-02-2024(online)].pdf 2024-02-03
7 202011032976-Written submissions and relevant documents [03-02-2024(online)].pdf 2024-02-03
8 202011032976-FORM 1 [31-07-2020(online)].pdf 2020-07-31
9 202011032976-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [31-07-2020(online)].pdf 2020-07-31
10 202011032976-Correspondence to notify the Controller [16-01-2024(online)].pdf 2024-01-16
11 202011032976-FORM-26 [16-01-2024(online)].pdf 2024-01-16
12 202011032976-EVIDENCE FOR REGISTRATION UNDER SSI [31-07-2020(online)].pdf 2020-07-31
13 202011032976-US(14)-HearingNotice-(HearingDate-19-01-2024).pdf 2023-12-19
14 202011032976-DRAWINGS [31-07-2020(online)].pdf 2020-07-31
15 202011032976-DECLARATION OF INVENTORSHIP (FORM 5) [31-07-2020(online)].pdf 2020-07-31
16 202011032976-ABSTRACT [20-02-2023(online)].pdf 2023-02-20
17 202011032976-CLAIMS [20-02-2023(online)].pdf 2023-02-20
18 202011032976-COMPLETE SPECIFICATION [31-07-2020(online)].pdf 2020-07-31
19 202011032976-CORRESPONDENCE [20-02-2023(online)].pdf 2023-02-20
20 202011032976-FORM-26 [30-09-2020(online)].pdf 2020-09-30
21 202011032976-FER_SER_REPLY [20-02-2023(online)].pdf 2023-02-20
22 202011032976-Proof of Right [30-10-2020(online)].pdf 2020-10-30
23 202011032976-FORM 18 [14-03-2022(online)].pdf 2022-03-14
24 202011032976-FORM-26 [20-02-2023(online)].pdf 2023-02-20
25 202011032976-FER.pdf 2022-08-25

Search Strategy

1 Search_StrategyE_23-08-2022.pdf
2 Search_StrategyAE_29-08-2023.pdf

ERegister / Renewals