Abstract: The present disclosure provides a zone simulation apparatus. The disclosed apparatus comprises an Augmented Reality (AR) headset comprising: a processor operatively coupled with a memory, the memory storing computer-implemented instructions which, when executed by the processor, enable a user wearing the AR headset to observe an augmented reality simulation of an area in the vicinity of the AR headset; and a set of sensors operatively coupled with the processor, the set of sensors configured to track the position and orientation of the AR headset to assist in strategically manoeuvring through the area in the vicinity of the AR headset.
TECHNICAL FIELD
[001] The present disclosure relates to the field of Augmented Reality (AR) simulation. More particularly, the present disclosure relates to AR based zone simulation apparatus.
BACKGROUND
[002] Soldiers are an essential part of any nation’s security system. During wars and search operations, soldiers get injured, many become lost, and it can be impossible to locate injured soldiers or address their distress calls under intense conditions. Soldiers often face difficulties understanding unfamiliar terrain, cannot keep track of each other, and end up getting separated. During covert missions, soldiers cannot obtain each other’s accurate locations without exposing their own co-ordinates. Every soldier’s life is important, as soldiers protect the people and the nation from enemy attacks, terrorist activities, and other hostile acts.
[003] Augmented Reality (AR) is a technology that computes, in real time, the position and angle of a camera image and overlays appropriate virtual imagery on it, with the goal of making the virtual world and the real world interact on a screen. AR technology attaches closely to real-life scenes and has wide scope of use in research, health care, education, design, consumer products, entertainment, social networking, and similar fields, where overlaying virtual imagery provides a more intuitive visual experience in real-life scenarios.
[004] AR glasses commonly transmit sound to a user through bone conduction, but bone conduction per se has significant limitations, such as insufficient noise reduction and tone colour, resulting in a poor user experience. A few AR glasses instead use in-ear headphones to deliver sound to the user's ear, but some users' ears are inherently unsuited to in-ear headphones, and wearing them for a long time causes discomfort.
[005] In view of these problems with existing bone-conduction AR glasses and in-ear headphones, including the discomfort caused by prolonged wear, existing solutions need to be improved upon so as to enhance their practical value.
[006] There is therefore a need in the art to provide augmented reality-based zone simulation apparatus that overcome the above-mentioned and other limitations of the existing solutions and utilize techniques, which are robust, accurate, fast, efficient, cost effective and simple.
OBJECTS OF THE PRESENT DISCLOSURE
[007] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[008] It is an object of the present disclosure to provide augmented reality-based zone simulation apparatus.
[009] It is another object of the present disclosure to provide augmented reality-based zone simulation apparatus that displays the real-time position of one or more second users nearby and their distance from the first user.
[0010] It is another object of the present disclosure to provide augmented reality-based zone simulation apparatus that enables specific targets, locations, or vehicles to be marked and viewed, through the head mounted display, only by one or more second users.
[0011] It is another object of the present disclosure to provide augmented reality-based zone simulation apparatus that enables dangerous areas such as minefields or traps to be detected, marked, and displayed to avoid loss of life, and enables base camps or army bunkers to be located efficiently.
[0012] It is another object of the present disclosure to provide augmented reality-based zone simulation apparatus that enables a user to send out an alert signal to other soldiers and the main control room. Using this, the soldier survival rate can be drastically improved during war.
SUMMARY
[0013] The present disclosure relates to the field of Augmented Reality (AR) simulation. More particularly, the present disclosure relates to AR based zone simulation apparatus.
[0014] A zone simulation apparatus is disclosed. The disclosed apparatus can include an Augmented Reality (AR) headset. The AR headset can include: a processor operatively coupled with a memory, the memory storing computer-implemented instructions which, when executed by the processor, enable a user wearing the AR headset to observe an augmented reality simulation of an area in the vicinity of the AR headset; and a set of sensors operatively coupled with the processor, the set of sensors configured to track the position and orientation of the AR headset to assist in strategically manoeuvring through the area in the vicinity of the AR headset.
[0015] In an embodiment, the set of sensors comprises a head tracker to track the rotation and orientation of the user's head.
[0016] In an embodiment, the apparatus comprises a microphone operatively coupled with the AR headset to enable receiving of audio signals from the user.
[0017] In an embodiment, the apparatus comprises a speaker unit operatively coupled with the AR headset to receive external audio signals and convert the received audio signals to audible sound.
[0018] In an embodiment, the apparatus comprises a GPS device operatively coupled with the processor to enable the processor to monitor real-time position of the AR headset.
[0019] In an embodiment, the processor enables simulation of real-time location of one or more second users, wearing their respective AR headset, in vicinity of a first user wearing the AR headset.
[0020] In an embodiment, the processor is configured to simulate using any or a combination of: the real-time position of the one or more second users wearing their respective AR headsets that are in proximity of the first user wearing the AR headset; and the distance of each of the one or more second users wearing their respective AR headsets that are in proximity of the first user wearing the AR headset.
[0021] In an embodiment, the processor is configured to generate an alert signal if the distance between at least two adjacent AR headsets is more than a pre-defined distance.
[0022] In an embodiment, the apparatus comprises an alert unit operatively coupled with the processor to generate any or a combination of an audio alert and a video alert based on the generated alert signal.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0024] FIGs. 1A and 1B illustrate exemplary representations of an AR based zone simulation apparatus in accordance with an embodiment of the present disclosure.
[0025] FIG. 2 illustrates an exemplary block diagram representation of AR based zone simulation apparatus in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0026] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0027] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0028] Embodiments of the present invention include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, and firmware and/or by human operators.
[0029] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
[0030] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0031] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0032] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. These exemplary embodiments are provided only for illustrative purposes and so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. The invention disclosed may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Various modifications will be readily apparent to persons skilled in the art. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Moreover, all statements herein reciting embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Also, the terminology and phraseology used is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For purpose of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention.
[0033] Thus, for example, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named element.
[0034] Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The term “machine-readable storage medium” or “computer-readable storage medium” includes, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, and semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other types of media/machine-readable media suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware). A machine-readable medium may include a non-transitory medium in which data may be stored and that does not include carrier waves and/or transitory electronic signals propagating wirelessly or over wired connections. Examples of a non-transitory medium may include, but are not limited to, a magnetic disk or tape, optical storage media such as compact disk (CD) or digital versatile disk (DVD), flash memory, or memory devices. A computer-program product may include code and/or machine-executable instructions that may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
[0035] Furthermore, embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks (e.g., a computer-program product) may be stored in a machine-readable medium. A processor(s) may perform the necessary tasks.
[0036] Systems depicted in some of the figures may be provided in various configurations. In some embodiments, the systems may be configured as a distributed system where one or more components of the system are distributed across one or more networks in a cloud computing system.
[0037] Each of the appended claims defines a separate invention, which for infringement purposes is recognized as including equivalents to the various elements or limitations specified in the claims. Depending on the context, all references below to the "invention" may in some cases refer to certain specific embodiments only. In other cases, it will be recognized that references to the "invention" will refer to subject matter recited in one or more, but not necessarily all, of the claims.
[0038] All methods described herein may be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[0039] Various terms as used herein are shown below. To the extent a term used in a claim is not defined below, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0040] The present disclosure relates to the field of Augmented Reality (AR) simulation. More particularly, the present disclosure relates to AR based zone simulation apparatus.
[0041] A zone simulation apparatus is disclosed. The disclosed apparatus can include: An Augmented Reality (AR) headset. The AR headset can include: a processor operatively coupled with a memory, the memory storing computer-implemented instructions which when executed by the processor enables a user that is wearing the AR headset to observe augmented reality simulation of an area in vicinity of the AR headset; and a set of sensors operatively coupled with the processor, the set of sensors configured to track position and orientation of the AR headset to assist strategically manoeuvring through the area in vicinity of the AR headset.
[0042] In an embodiment, the set of sensors comprises a head tracker to track rotation and orientation of head of the user.
[0043] In an embodiment, the apparatus comprises a microphone operatively coupled with the AR headset to enable receiving of audio signals from the user.
[0044] In an embodiment, the apparatus comprises a speaker unit operatively coupled with the AR headset to receive external audio signals and convert the received audio signals to audible sound.
[0045] In an embodiment, the apparatus comprises a GPS device operatively coupled with the processor to enable the processor to monitor real-time position of the AR headset.
[0046] In an embodiment, the processor enables simulation of real-time location of one or more second users, wearing their respective AR headset, in vicinity of a first user wearing the AR headset.
[0047] In an embodiment, the processor is configured to simulate using any or a combination of: the real-time position of the one or more second users wearing their respective AR headsets that are in proximity of the first user wearing the AR headset; and the distance of each of the one or more second users wearing their respective AR headsets that are in proximity of the first user wearing the AR headset.
[0048] In an embodiment, the processor is configured to generate an alert signal if the distance between at least two adjacent AR headsets is more than a pre-defined distance.
[0049] In an embodiment, the apparatus comprises an alert unit operatively coupled with the processor to generate any or a combination of an audio alert and a video alert based on the generated alert signal.
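The separation-alert logic above can be sketched as follows. This is a minimal illustrative sketch: the 50-metre threshold and the function names are assumptions, since the disclosure only specifies "a pre-defined distance".

```python
# Hypothetical pre-defined separation threshold in metres; an
# implementation would tune this per mission profile.
MAX_SEPARATION_M = 50.0

def separation_alert(distance_m: float, threshold_m: float = MAX_SEPARATION_M) -> bool:
    """Return True when two adjacent AR headsets are farther apart than
    the pre-defined distance, signalling the alert unit to fire an
    audio and/or video alert."""
    return distance_m > threshold_m
```

The boolean result would drive the alert unit; whether the alert is audio, video, or both is left to the embodiment.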
[0050] FIGs. 1A and 1B illustrate exemplary representations of an AR based zone simulation apparatus in accordance with an embodiment of the present disclosure.
[0051] In an embodiment, an Augmented Reality (AR) based simulation apparatus 100 can include an AR headset 102 that can be worn by a user. The apparatus 100 can include a head tracker 104. The head tracker 104 can be coupled with the AR headset 102 to enable tracking of the position and orientation of the head of the user wearing the AR headset 102.
[0052] In an embodiment, the AR headset 102 can include a display unit 106 to display the AR simulation to the user. The display unit 106 can include high-brightness video tubes, a semi-transparent LED/LCD screen, and the like to enable the user to visually observe the AR simulation.
[0053] In an embodiment, the AR headset 102 can include a processor operatively coupled with a memory, the memory storing computer-implemented instructions which when executed by the processor enables the user that is wearing the AR headset 102 to observe augmented reality simulation of an area in vicinity of the AR headset 102.
[0054] In an embodiment, the apparatus 100 comprises a microphone 108 operatively coupled with the AR headset 102 to enable receiving of audio signals from the user. The microphone can be used for audio communication amongst one or more users operatively coupled through wireless, network, or internetwork links and the like.
[0055] In an embodiment, the apparatus 100 comprises a GPS device (not shown) operatively coupled with the processor to enable the processor to monitor the real-time position of the user wearing the AR headset 102. Further, the processor can be configured to display the real-time locations of the one or more users wearing their respective headsets to one another. Also, the processor can be configured to calculate the distance of one or more second users wearing their respective AR headsets 102 in the vicinity of the first user wearing his/her AR headset 102.
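One conventional way the processor could compute the distance between two GPS fixes is the haversine formula. The sketch below is an illustrative assumption (a spherical-Earth approximation), not a method stated in the disclosure.

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two GPS fixes, using a
    mean Earth radius of 6,371 km. Adequate for the short soldier-to-
    soldier ranges this apparatus deals with."""
    r = 6_371_000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

The resulting distance could then feed both the on-display range readout and the separation-alert check described elsewhere in this disclosure.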
[0056] In an embodiment, the apparatus 100 can be configured to display a space stabilized virtual image 110 to the user wearing the headset 102 such that the AR simulation can be observed by the user.
[0057] In an embodiment, the apparatus 100 can include a gesture sensor 112. The gesture sensor can be used to track position and orientation of finger of the user.
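As an illustration of how the tracked fingertip data might be consumed, a simple pinch check is sketched below. The 3 cm threshold and the pinch-to-action mapping are assumptions; the disclosure only states that the gesture sensor 112 tracks the position and orientation of the user's finger.

```python
def is_pinch(thumb_xyz: tuple, index_xyz: tuple, threshold_m: float = 0.03) -> bool:
    """Detect a pinch from two tracked fingertip positions in metres.
    A pinch could, for example, trigger marking a target in the AR view
    (an illustrative use, not one specified in the disclosure)."""
    dist = sum((a - b) ** 2 for a, b in zip(thumb_xyz, index_xyz)) ** 0.5
    return dist < threshold_m
```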
[0058] In an embodiment, the zone can include various areas and sectors where the AR headset 102 can be used for AR simulation.
[0059] In an embodiment, the processor can include one or more processors or controllers. Examples of controllers include, but are not limited to, the PIC® 16F877A microcontroller, AVR® ATmega8 and ATmega16, Renesas® microcontrollers, and the like. Examples of processors include, but are not limited to, Intel® Itanium® or Itanium 2 processor(s), AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™ system-on-a-chip processors, or other future processors.
[0060] It would be appreciated by a person skilled in the art that the apparatus 100 can be used by soldiers for strategically manoeuvring through a combat zone without revealing their positions to an enemy, and further enables combat operations to be strategized effectively.
[0061] FIG. 2 illustrates an exemplary block diagram representation of AR based zone simulation apparatus in accordance with an embodiment of the present disclosure.
[0062] In an exemplary embodiment, a first microprocessor 208-1 can be operatively coupled with a first Global System for Mobile communications (GSM) module 202-1. The GSM module 202-1 provides an internet connection through which soldiers can transmit data to the main control center 210.
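The data transmitted to the main control center 210 could be a compact position report; one possible serialisation is sketched below. The JSON format and all field names are illustrative assumptions, as the disclosure does not specify a message format.

```python
import json

def position_report(soldier_id: str, lat: float, lon: float,
                    heading_deg: float) -> str:
    """Serialise one headset's GPS fix and heading as a JSON payload
    for the GSM data link to the main control center 210. Field names
    are hypothetical."""
    return json.dumps(
        {"id": soldier_id, "lat": lat, "lon": lon, "heading_deg": heading_deg},
        sort_keys=True,
    )
```

At the control center, the same payload could be parsed with `json.loads` to update the stored position of each soldier.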
[0063] Further, a first Inertial Measurement Unit (IMU) 204-1 can be operatively coupled with the first microprocessor 208-1. The IMU 204-1 measures and reports a body's specific force, angular rate, and sometimes the magnetic field surrounding the body, using a combination of accelerometers and gyroscopes, and sometimes magnetometers. IMUs are typically used to manoeuvre aircraft, including unmanned aerial vehicles (UAVs), among many others, and spacecraft, including satellites and landers.
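A common way to fuse the IMU's gyroscope and accelerometer readings into a stable head-orientation estimate is a complementary filter. The sketch below shows one update step for a single pitch axis; it is one illustrative approach, not a method specified in the disclosure.

```python
def complementary_step(pitch_deg: float, gyro_rate_dps: float,
                       accel_pitch_deg: float, dt_s: float,
                       alpha: float = 0.98) -> float:
    """One filter update: weight the integrated gyroscope rate (smooth
    but drifting) by alpha, and the accelerometer-derived pitch (noisy
    but drift-free) by 1 - alpha, so gyro drift is continually
    corrected toward the accelerometer reference."""
    return alpha * (pitch_deg + gyro_rate_dps * dt_s) + (1.0 - alpha) * accel_pitch_deg
```

With a stationary headset (zero gyro rate), repeated updates converge the estimate toward the accelerometer pitch, which is the drift-correction behaviour the filter is chosen for.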
[0064] Further, a first projection mechanism 208 can be operatively coupled with the first microprocessor 208-1 to project the data on the AR headset mounted on the soldier's helmet. Further, the main control center 210 is the master of all the soldiers' AR headsets, where all the data is sent and stored and can be used in case of emergency.
[0065] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named element.
[0066] While embodiments of the present invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art without departing from the spirit and scope of the invention, as described in the claims.
[0067] In the foregoing description, numerous details are set forth. It will be apparent, however, to one of ordinary skill in the art having the benefit of this disclosure, that the present invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, to avoid obscuring the present invention.
[0068] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document, the terms "coupled to" and "coupled with" are also used to mean "communicatively coupled with" over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices.
[0069] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C …. and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
[0070] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0071] The present disclosure provides augmented reality-based zone simulation apparatus.
[0072] The present disclosure provides augmented reality-based zone simulation apparatus that displays the real-time position of one or more second users nearby and their distance from the first user.
[0073] The present disclosure provides augmented reality-based zone simulation apparatus that enables specific targets, locations, or vehicles to be marked and viewed, through the head mounted display, only by one or more second users.
[0074] The present disclosure provides augmented reality-based zone simulation apparatus that enables dangerous areas such as minefields or traps to be detected, marked, and displayed to avoid loss of life, and enables base camps or army bunkers to be located efficiently.
[0075] The present disclosure provides augmented reality-based zone simulation apparatus that enables a user to send out an alert signal to other soldiers and the main control room. Using this, the soldier survival rate can be drastically improved during war.
We Claim:
1. A zone simulation apparatus, said apparatus comprising:
An Augmented Reality (AR) headset comprising:
a processor operatively coupled with a memory, said memory storing computer-implemented instructions which when executed by said processor enables a user that is wearing the AR headset to observe augmented reality simulation of an area in vicinity of said AR headset; and
a set of sensors operatively coupled with said processor, said set of sensors configured to track position and orientation of said AR headset to assist strategically manoeuvring through the area in vicinity of the AR headset.
2. The apparatus as claimed in claim 1, wherein the set of sensors comprises a head tracker to track rotation and orientation of head of the user.
3. The apparatus as claimed in claim 1, wherein the apparatus comprises a microphone operatively coupled with the AR headset to enable receiving of audio signals from the user.
4. The apparatus as claimed in claim 1, wherein the apparatus comprises a speaker unit operatively coupled with the AR headset to enable receiving external audio signals and converting said received audio signals to audible voice.
5. The apparatus as claimed in claim 1, wherein the apparatus comprises a GPS device operatively coupled with the processor to enable the processor to monitor real-time position of the AR headset.
6. The apparatus as claimed in claim 1, wherein the processor enables simulation of real-time location of one or more second users, wearing their respective AR headset, in vicinity of a first user wearing the AR headset.
7. The apparatus as claimed in claim 6, wherein the processor is configured to simulate using any or a combination of: the real-time position of the one or more second users wearing their respective AR headsets that are in proximity of the first user wearing the AR headset; and the distance of each of the one or more second users wearing their respective AR headsets that are in proximity of the first user wearing the AR headset.
8. The apparatus as claimed in claim 1, wherein the processor is configured to generate an alert signal if the distance between at least two adjacent AR headsets is more than a pre-defined distance.
9. The apparatus as claimed in claim 8, wherein the apparatus comprises an alert unit operatively coupled with the processor to generate any or a combination of an audio and video alert based on said generated alert signal.