
Augmented Reality Based Training System

Abstract: An augmented reality based training system comprises a targeting unit embedded with at least one optical marker; an emulating headset comprising a set of sensors to sense one or more attributes of the at least one optical marker; and a processing unit configured to determine the position of the targeting unit with respect to the emulating headset by selecting a first set of attributes from the sensed one or more attributes, wherein the first set of attributes is associated with position parameters of the targeting unit, and to determine the orientation of the targeting unit with respect to the horizontal plane and the vertical plane by selecting a second set of attributes from the sensed one or more attributes, wherein the second set of attributes is associated with orientation parameters of the targeting unit. Responsive to triggering of the targeting unit at the determined position and orientation, the processing unit determines the position of the target that gets hit.


Patent Information

Application #
Filing Date
21 February 2020
Publication Number
10/2020
Publication Type
INA
Invention Field
PHYSICS
Status
Parent Application
Patent Number
Legal Status
Grant Date
2020-12-21
Renewal Date

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector -9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. KAUR, Amanpreet
Chitkara University, Chandigarh Patiala National Highway (NH-64), Village, Jansla, Rajpura, Punjab- 140401, India.
2. MANTRI, Archana
Chitkara University, Chandigarh Patiala National Highway (NH-64), Village, Jansla, Rajpura, Punjab- 140401, India.
3. SINGH, Narinder Pal
Chitkara University, Chandigarh Patiala National Highway (NH-64), Village, Jansla, Rajpura, Punjab- 140401, India.

Specification

TECHNICAL FIELD
[0001] The present disclosure relates generally to an augmented reality based training system. More specifically, it pertains to an augmented reality based training system which provides a simulator environment for providing training to defence personnel.
BACKGROUND OF THE INVENTION
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Training enhances the knowledge, practical skills, capability, capacity, productivity, and performance of an individual or group in a specific area. Moreover, the training required for a profession or trade can continue beyond initial competence, to update and upgrade skills in a particular field. Various colleges, technical institutes, and professional schools hire experts from the relevant field to provide training to interested individuals or groups. Training in specific areas such as military applications, sports, job instruction training, vestibule training, refresher training, apprenticeship training, induction training, and martial arts training is provided for skill development.
[0004] Few organizations can function without well-trained members. A training program strengthens the skills that each employee needs to improve, and it brings all employees to a common level so that they have similar skills and knowledge in their respective domains.
[0005] Military or police training is a process intended to establish and improve the capabilities of personnel in their respective roles. Many technical tasks in military training are dangerous, and many carry life-or-death risks when learned in the real world. Workplace training accidents are, unfortunately, common occurrences in defence, and they may even be fatal or cause serious injury. As novices are the most likely to make mistakes, they need safe and secure training methods to practice their skills and gain confidence.
[0006] Traditional training practices are expensive, since many resources such as guns, bullets, and targets are required for the training of military personnel. Many scenarios and real-time war-like situations are also inconvenient to practice in real life, for example, training troops for situations like world wars or for Line of Control (LOC) operations. Twenty-first-century military challenges are both conventional and asymmetric in nature, demanding the evolution of innovative digital technologies for training.
[0007] There is, therefore, a need in the art to provide a simple, convenient and cost-effective augmented reality based training system for safe and secure training.
[0008] As used in the description herein and throughout the claims that follow, the meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.
[0009] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. "such as") provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[0010] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.
OBJECTS OF THE INVENTION
[0011] A general object of the present disclosure is to provide an efficient and economical solution for providing training to defence personnel, trainees, or novices in war-like situations.
[0012] An object of the present disclosure is to provide an augmented reality based
training system for creating digital simulation environment for trainees.
[0013] Another object of the present disclosure is to provide an augmented reality based
training system with enhanced accuracy without any manual intervention.

[0014] Another object of the present disclosure is to provide a simple and cost effective
augmented reality based training system without any requirement of weapons or arms or
armament.
[0015] These and other objects of the present invention will become readily apparent
from the following detailed description taken in conjunction with the accompanying
drawings.
SUMMARY
[0016] Aspects of the present disclosure relate to an augmented reality based training
system. More specifically, it pertains to an augmented reality based training system which
provides a simulator environment for providing training to defence personnel.
[0017] In an aspect, the present disclosure provides an augmented reality based training system. The system can include a targeting unit configured to allow a user to aim at a target based on one or more attributes associated with a relative position between the user and the target, the targeting unit comprising at least one optical marker; and an emulating headset configured to be worn by the user, the emulating headset comprising a set of sensors to sense one or more attributes of the at least one optical marker.
[0018] In an aspect, a processing unit is operatively coupled with the set of sensors, the processing unit comprising a processor coupled to a memory, the memory storing instructions executable by the processor to determine the position of the targeting unit with respect to the emulating headset by selecting a first set of attributes from the sensed one or more attributes, wherein the first set of attributes is associated with position parameters of the targeting unit, and to determine the orientation of the targeting unit with respect to the horizontal plane and the vertical plane by selecting a second set of attributes from the sensed one or more attributes, wherein the second set of attributes is associated with orientation parameters of the targeting unit.
[0019] In an aspect, responsive to triggering of the targeting unit at the determined position and orientation of the targeting unit, the processing unit determines the position of the target that gets hit.
[0020] In an embodiment, the processing unit is configured to allocate a score to the user
based on the determined hit position of the target.
[0021] In an embodiment, the processing unit updates a scoring value based on hit
position of target by the user.
[0022] In an embodiment, the scoring value pertains to score, accuracy, robustness, and target.
[0023] In an embodiment, the system can include the emulating headset to display virtual
world and real world simultaneously to the user.

[0024] In an embodiment, the emulating headset is configured with camera for capturing
the optical marker, wherein the optical marker defines position of the targeting unit.
[0025] In an embodiment, the system can include Light Emitting Diode (LED), wherein
LED glows based on connection established between the targeting unit and the processing
unit.
[0026] In an embodiment, the set of sensors can include nine axis sensor, orientation
sensor, and absolute sensor.
[0027] In an embodiment, the targeting unit is any or combination of handheld device,
smartphone, cell phone, virtual gun, gun controller, and motion controller.
[0028] In an embodiment, the emulating headset is configured to emulate any or
combination of augmented reality headset, virtual reality headset, and mixed reality headset.
[0029] Various objects, features, aspects and advantages of the inventive subject matter
will become more apparent from the following detailed description of preferred
embodiments, along with the accompanying drawing figures in which like numerals represent
like components.
BRIEF DESCRIPTION OF THE DRAWINGS
[0030] The accompanying drawings are included to provide a further understanding of
the present disclosure, and are incorporated in and constitute a part of this specification. The
drawings illustrate exemplary embodiments of the present disclosure and, together with the
description, serve to explain the principles of the present disclosure.
[0031] FIG. 1 illustrates a block diagram for implementing an augmented reality based
training system, in accordance with embodiments of the present disclosure.
[0032] FIG. 2 illustrates an exemplary representation of proposed augmented reality
based training system, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0033] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such details as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.

[0034] If the specification states a component or feature "may", "can", "could", or "might" be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0035] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the disclosure to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).
[0036] Various terms are used herein. To the extent a term used in a claim is not defined below, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0037] Aspects of the present disclosure relate to an augmented reality based training system. More specifically, it pertains to an augmented reality based training system which provides a simulator environment for providing training to defence personnel.
[0038] In an aspect, the present disclosure provides an augmented reality based training system. The system can include a targeting unit configured to allow a user to aim at a target based on one or more attributes associated with a relative position between the user and the target, the targeting unit comprising at least one optical marker; and an emulating headset configured to be worn by the user, the emulating headset comprising a set of sensors to sense one or more attributes of the at least one optical marker.
[0039] In an aspect, a processing unit is operatively coupled with the set of sensors, the processing unit comprising a processor coupled to a memory, the memory storing instructions executable by the processor to determine the position of the targeting unit with respect to the emulating headset by selecting a first set of attributes from the sensed one or more attributes, wherein the first set of attributes is associated with position parameters of the targeting unit, and to determine the orientation of the targeting unit with respect to the horizontal plane and the vertical plane by selecting a second set of attributes from the sensed one or more attributes, wherein the second set of attributes is associated with orientation parameters of the targeting unit.

[0040] In an aspect, responsive to triggering of the targeting unit at the determined position and orientation of the targeting unit, the processing unit determines the position of the target that gets hit.
[0041] In an embodiment, the processing unit is configured to allocate a score to the user
based on the determined hit position of the target.
[0042] In an embodiment, the processing unit updates a scoring value based on hit
position of target by the user.
[0043] In an embodiment, the scoring value pertains to score, accuracy, robustness and
target.
[0044] In an embodiment, the system can include the emulating headset to display virtual
world and real world simultaneously to the user.
[0045] In an embodiment, the emulating headset is configured with camera for capturing
the optical marker, wherein the optical marker defines position of the targeting unit.
[0046] In an embodiment, the system can include Light Emitting Diode (LED), wherein
LED glows based on connection established between the targeting unit and the processing
unit.
[0047] In an embodiment, the set of sensors can include nine axis sensor, orientation
sensor, and absolute sensor.
[0048] In an embodiment, the targeting unit is any or combination of handheld device,
smart phone, cell phone, virtual gun, gun controller, and motion controller.
[0049] In an embodiment, the emulating headset is configured to emulate any or
combination of augmented reality headset, virtual reality headset, and mixed reality headset.
[0050] FIG. 1 illustrates a block diagram for implementing an augmented reality based
training system 100, in accordance with embodiments of the present disclosure.
[0051] In an embodiment, an augmented reality based training system 100 can be implemented in shooting training centres, firing training centres, and gun target training centres. The system can include an emulating headset (interchangeably referred to as an augmented reality (AR)/virtual reality (VR) headset with computer graphic ray cast, herein) and a targeting unit 106, wherein the AR/VR headset can be worn by a user training for shooting in the shooting training centre. The user with the targeting unit 106 can manipulate the hit position on the target, such that the position and orientation of the targeting unit 106 can be determined and a hit target score can be allocated to the user based on the position and orientation of the targeting unit 106.
[0052] In an embodiment, the augmented reality training system 100 can include the targeting unit 106. The targeting unit 106 can be configured to allow a user (mobile camera) to aim at a target (fiducial marker of gun) based on one or more attributes associated with a relative position between the user and the target. The targeting unit 106 can include a handheld device, smartphone, cell phone, virtual gun, gun controller, motion controller, and the like. The targeting unit 106 can be configured with an optical marker 114 (interchangeably referred to as fiducial marker 114, herein); said fiducial marker 114 can define the position of the targeting unit 106.
[0053] In an embodiment, the augmented reality training system 100 can include an augmented reality/virtual reality (AR/VR) headset with computer graphic ray cast, which can be worn by the user to collaborate between the real world and the virtual world. The AR headset can be configured with an image capturing device, such as camera 102, to capture the fiducial marker 114 which is configured at the targeting unit 106. In another embodiment, the AR/VR headset with computer graphic ray cast can be used to augment real-world images with computer generated images. In another embodiment, with the AR/VR headset the user can easily collaborate between the virtual world and the real world.
[0054] In an embodiment, the AR/VR headset can be configured with camera 102 for capturing the fiducial marker 114. The camera 102 can be operatively coupled with the processing unit 108 to determine the position 110 of the targeting unit 106 with respect to the AR/VR headset by selecting position parameters of the fiducial marker 114 configured at the targeting unit 106.
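The position determination from the fiducial marker can be sketched in a few lines. The following is a minimal illustration only, assuming an ideal pinhole-camera model for the headset camera; the function names, focal length, and marker dimensions are illustrative assumptions, not values from the specification:

```python
# Minimal sketch (assumed pinhole camera; all names and numbers are
# illustrative) of recovering the targeting unit's position from the
# apparent size and location of its fiducial marker in the camera image.

def marker_distance(marker_size_m, focal_length_px, marker_width_px):
    """Camera-to-marker distance via similar triangles."""
    return marker_size_m * focal_length_px / marker_width_px

def marker_position(center_px, principal_point_px, focal_length_px, distance_m):
    """Back-project the marker's image centre to a 3-D point in the
    camera (headset) frame."""
    u, v = center_px
    cu, cv = principal_point_px
    x = (u - cu) * distance_m / focal_length_px
    y = (v - cv) * distance_m / focal_length_px
    return (x, y, distance_m)

# A 5 cm marker appearing 40 px wide to an 800 px focal-length camera
# is 1 m away; its off-centre image position maps to a lateral offset.
d = marker_distance(0.05, 800.0, 40.0)
pos = marker_position((420.0, 300.0), (320.0, 240.0), 800.0, d)
```

In practice a marker-tracking library would return a full six-degree-of-freedom pose rather than this size-based estimate, but the geometry is the same.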
[0055] In an embodiment, the augmented reality based training system 100 can include sensor 104 configured at the targeting unit 106. The sensor 104 can operatively be coupled with the processing unit 108 to sense orientation of the targeting unit 106, wherein the processing unit 108 to determine orientation 112 of the targeting unit 106 with respect to horizontal plane and vertical plane by selecting orientation parameters of the targeting unit 106. The sensor 104 can include nine axis sensor, orientation sensor, absolute sensor, BNO055, 9 axis orientation sensor, and the like.
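The orientation determination with respect to the horizontal and vertical planes can be derived from the 9-axis sensor's gravity reading. A hedged sketch follows; the axis convention and function name are assumptions (a fused sensor such as the BNO055 can also report Euler angles directly):

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Pitch and roll (degrees) of the targeting unit relative to the
    horizontal plane, from an accelerometer gravity vector. Assumes
    x forward, y left, z up when the unit is held level."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Unit lying level: gravity is entirely on z, so pitch = roll = 0.
pitch, roll = tilt_from_gravity(0.0, 0.0, 9.81)
```

Yaw (heading in the horizontal plane) cannot come from gravity alone; the magnetometer or gyroscope channels of the 9-axis sensor supply it.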
[0056] In an embodiment, communication between the sensor 104 and the processing unit 108 may be established via a wired or wireless connection as is known in the art. The processing unit 108 may include a single processor unit or multiple processor units in communication with each other. Each processor unit may include, or be communicatively coupled to, memory having computer executable storage instructions. The processor may execute the computer executable storage instructions, causing the processor unit(s) to perform their function.

[0057] In an embodiment, the augmented reality based training system 100 can include memory. The memory may be or employ random access memory (RAM), read-only memory (ROM), optical storage, magnetic storage, removable storage, erasable programmable read only memory and variations thereof, content addressable memory and variations thereof, flash memory, disk drive storage, removable storage, any other memory type feasible in the context of the present invention, any combination thereof, or the like.
[0058] In an embodiment, the processing unit 108 can be responsive to triggering of the targeting unit 106 at the determined position and orientation of the targeting unit 106, whereupon the processing unit 108 determines the position of the target that gets hit. In another embodiment, the processing unit 108 can be configured to allocate a scoring value 116 to the user based on the determined hit position of the target. In yet another embodiment, the processing unit 108 can update the scoring value 116 of the user, wherein the scoring value 116 can be a score, target hit bar, accuracy bar, or robustness bar.
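The scoring step above can be sketched as a lookup from the determined hit position to a target ring, followed by an update of the running scoring value. The ring radii, point values, and dictionary keys below are illustrative assumptions, not figures from the disclosure:

```python
def ring_score(hit_x, hit_y,
               radii=(0.05, 0.15, 0.30),   # innermost, middle, outermost (m)
               points=(10, 5, 2)):
    """Points for a hit at (hit_x, hit_y), measured from the target
    centre; 0 if the hit lands outside every ring."""
    r = (hit_x ** 2 + hit_y ** 2) ** 0.5
    for ring_radius, ring_points in zip(radii, points):
        if r <= ring_radius:
            return ring_points
    return 0

def update_scoring_value(scoring_value, hit_x, hit_y):
    """Accumulate score, hit count, and accuracy, roughly as the
    processing unit might maintain the scoring value 116."""
    gained = ring_score(hit_x, hit_y)
    scoring_value["score"] += gained
    scoring_value["hits"] += 1 if gained else 0
    scoring_value["shots"] += 1
    scoring_value["accuracy"] = scoring_value["hits"] / scoring_value["shots"]
    return scoring_value
```

A fresh scoring value would start as `{"score": 0, "hits": 0, "shots": 0, "accuracy": 0.0}` and be updated once per trigger pull.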
[0059] In an embodiment, the targeting unit comprises Light Emitting Diode (LED), wherein LED can glow based on connection established between the targeting unit and the processing unit.
[0060] FIG. 2 illustrates an exemplary representation of the proposed augmented reality based training system, in accordance with an embodiment of the present disclosure.
[0061] In the exemplary embodiment, an application built on the Android operating system can be configured with the AR/VR headset 202, wherein the AR/VR headset 202 can be configured with the camera. The user 204 can hold the targeting unit 206 in hand 212, wherein the user 204 wearing the AR/VR headset 202 can view the virtual environment as well as the real world environment. The targeting unit 206 can be embedded with a fiducial marker 208, such that the position of the targeting unit 206 can be tracked using the fiducial marker 208 based on vision tracking. The BNO055 nine degree of freedom absolute sensor can be configured at the targeting unit 206, such that the rotation of the targeting unit 206 can be tracked by the absolute sensor. The user 204 can shoot the target 210 with the help of a push switch positioned at the targeting unit 206, such that the accuracy on the target 210 can be determined based on the position and the orientation of the targeting unit 206 with respect to the mobile camera (user eye).
[0062] In an embodiment, when the user 204 starts to play, the main menu of the Android application will pop up on the AR headset screen, wherein the main menu comprises a connect option and a reject option. The user 204 can focus the eyes on the connect option such that a Bluetooth (wireless) connection can be established between the Arduino and the Android application. In the Android application, the camera can detect the fiducial marker 208 of

the targeting unit 206, such that the position of the targeting unit 206 can be tracked with the fiducial marker 208. The rotation of the targeting unit 206 can be tracked by the BNO055 absolute sensor. When the user 204 presses the button of the targeting unit 206, a signal will pass to the Arduino, such that the Arduino can send a signal to draw a ray from the tip of the targeting unit 206 to the target 210. The ray can hit any or a combination of the innermost circle, middle circle, and outermost circle of the target. If the ray collides with any of said circles, score parameters such as target hit, accuracy bar and robustness bar can be updated in the application.
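The ray drawn from the tip of the targeting unit to the target amounts to a ray-plane intersection followed by a ring test. A minimal sketch follows, under the assumption that the target is a flat disc lying on the plane z = target_z in the headset frame; the coordinates, radii, and function names are illustrative, not taken from the disclosure:

```python
def ray_hit_on_target(origin, direction, target_z):
    """Intersect a ray (from the targeting unit's tip, along its aiming
    direction) with the target plane z = target_z. Returns the (x, y)
    hit point on that plane, or None if the ray cannot reach it."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:                      # ray parallel to the target plane
        return None
    t = (target_z - oz) / dz
    if t < 0:                        # target is behind the shooter
        return None
    return (ox + t * dx, oy + t * dy)

def ring_hit(hit, radii=(0.05, 0.15, 0.30)):
    """Index of the ring hit (0 = innermost circle), or None for a miss."""
    if hit is None:
        return None
    r = (hit[0] ** 2 + hit[1] ** 2) ** 0.5
    for i, ring_radius in enumerate(radii):
        if r <= ring_radius:
            return i
    return None

# Aiming straight ahead from the origin at a target plane 5 m away
# lands in the centre of the innermost circle.
hit = ray_hit_on_target((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 5.0)
```

In a game engine the same test would typically be a built-in physics ray cast against colliders for the three circles rather than hand-rolled geometry.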
[0063] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept, and, therefore, such adaptations and modifications should be and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.
[0064] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable

a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE INVENTION
[0065] The present disclosure provides an efficient and economical solution for providing training to defence personnel or novices in war-like situations.
[0066] The present disclosure provides an augmented reality based training system for
creating digital simulation environment for trainees.
[0067] The present disclosure provides an augmented reality based training system with
enhanced accuracy without any manual intervention.
[0068] The present disclosure provides a simple and cost effective augmented reality
based training system without any requirement of weapons or arms or armament.


We Claim:
1) An augmented reality based training system, said system comprises:
a targeting unit configured to allow a user to aim a target based on one or more attributes associated with a relative position between the user and the target, the targeting unit comprises at least one optical marker;
an emulating headset configured to be worn by the user, the emulating headset comprising a set of sensors to sense one or more attributes of the at least one optical marker;
a processing unit operatively coupled with the set of sensors, the processing unit comprising a processor coupled to a memory, the memory storing instructions executable by the processor to:
determine position of the targeting unit with respect to the emulating headset by selecting a first set of attributes from the sensed one or more attributes, wherein the first set of attributes is associated with position parameters of the targeting unit,
determine orientation of the targeting unit with respect to horizontal plane and vertical plane by selecting a second set of attributes from the sensed one or more attributes, wherein the second set of attributes is associated with orientation parameters of the targeting unit,
responsive to triggering of the targeting unit at the determined position and orientation of the targeting unit, determine position of the target that gets hit.
2) The augmented reality based training system as claimed in claim 1, wherein the
processing unit is configured to allocate a score to the user based on the determined
hit position of the target.
3) The augmented reality based training system as claimed in claim 1, wherein the processing unit updates a scoring value based on hit position of target by the user.
4) The system as claimed in claim 1, wherein the emulating headset is configured with camera for capturing the optical marker, wherein the optical marker defines position of the targeting unit.
5) The augmented reality based training system as claimed in claim 1, wherein the scoring value pertains to score, accuracy, robustness and target.

6) The augmented reality based training system as claimed in claim 1, wherein the
system comprises Light Emitting Diode (LED), wherein LED glows based on
connection established between the targeting unit and the processing unit.
7) The augmented reality based training system as claimed in claim 1, wherein the targeting unit is any or combination of handheld device, smartphone, cell phone, virtual gun, gun controller, and motion controller.
8) The augmented reality based training system as claimed in claim 1, wherein the set of sensors can include nine axis sensor, orientation sensor, and absolute sensor.
9) The augmented reality based training system as claimed in claim 1, wherein the system comprises the emulating headset to display virtual world and real world simultaneously to the user.
10) The augmented reality based training system as claimed in claim 1, wherein the emulating headset is configured to emulate any or combination of augmented reality headset, virtual reality headset, and mixed reality headset.

Documents

Orders

Section Controller Decision Date
15 Raghava Rao Sripathi 2020-11-18
15 Raghava Rao Sripathi 2020-12-21

Application Documents

# Name Date
1 202011007492-STATEMENT OF UNDERTAKING (FORM 3) [21-02-2020(online)].pdf 2020-02-21
2 202011007492-FORM FOR STARTUP [21-02-2020(online)].pdf 2020-02-21
3 202011007492-FORM FOR SMALL ENTITY(FORM-28) [21-02-2020(online)].pdf 2020-02-21
4 202011007492-FORM 1 [21-02-2020(online)].pdf 2020-02-21
5 202011007492-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [21-02-2020(online)].pdf 2020-02-21
6 202011007492-EVIDENCE FOR REGISTRATION UNDER SSI [21-02-2020(online)].pdf 2020-02-21
7 202011007492-DRAWINGS [21-02-2020(online)].pdf 2020-02-21
8 202011007492-DECLARATION OF INVENTORSHIP (FORM 5) [21-02-2020(online)].pdf 2020-02-21
9 202011007492-COMPLETE SPECIFICATION [21-02-2020(online)].pdf 2020-02-21
10 202011007492-FORM-9 [25-02-2020(online)].pdf 2020-02-25
11 202011007492-STARTUP [26-02-2020(online)].pdf 2020-02-26
12 202011007492-FORM28 [26-02-2020(online)].pdf 2020-02-26
13 202011007492-FORM-26 [26-02-2020(online)].pdf 2020-02-26
14 202011007492-FORM 18A [26-02-2020(online)].pdf 2020-02-26
15 202011007492-Proof of Right [03-03-2020(online)].pdf 2020-03-03
16 202011007492-ABSTRACT [10-09-2020(online)].pdf 2020-09-10
17 202011007492-CLAIMS [10-09-2020(online)].pdf 2020-09-10
18 202011007492-CORRESPONDENCE [10-09-2020(online)].pdf 2020-09-10
19 202011007492-FER_SER_REPLY [10-09-2020(online)].pdf 2020-09-10
20 202011007492-Correspondence to notify the Controller [13-10-2020(online)].pdf 2020-10-13
21 202011007492-FORM-26 [14-10-2020(online)].pdf 2020-10-14
22 202011007492-Annexure [06-11-2020(online)].pdf 2020-11-06
23 202011007492-Written submissions and relevant documents [06-11-2020(online)].pdf 2020-11-06
24 202011007492-Correspondence to notify the Controller [26-11-2020(online)].pdf 2020-11-26
25 202011007492-Annexure [16-12-2020(online)].pdf 2020-12-16
26 202011007492-Written submissions and relevant documents [16-12-2020(online)].pdf 2020-12-16
27 202011007492-IntimationOfGrant21-12-2020.pdf 2020-12-21
28 202011007492-PatentCertificate21-12-2020.pdf 2020-12-21
29 202011007492-FER.pdf 2021-10-18
30 202011007492-US(14)-HearingNotice-(HearingDate-22-10-2020).pdf 2021-10-18
31 202011007492-US(14)-HearingNotice-(HearingDate-01-12-2020).pdf 2021-10-18
32 abstract.jpg 2021-10-18
33 202011007492-RELEVANT DOCUMENTS [16-08-2022(online)].pdf 2022-08-16

Search Strategy

1 SearchStretegy-202011007492E_15-05-2020.pdf

ERegister / Renewals

3rd: 24 Dec 2020

From 21/02/2022 - To 21/02/2023

4th: 24 Dec 2020

From 21/02/2023 - To 21/02/2024

5th: 24 Dec 2020

From 21/02/2024 - To 21/02/2025

6th: 24 Dec 2020

From 21/02/2025 - To 21/02/2026