Abstract: The hand gestures of the user are captured by a gesture controller device connected to a computer and, upon being recognized by the device drivers, the virtual scene and objects are displayed on the virtual reality (VR) platform. Depending on the options selected in the virtual scenario, the context is displayed on the VR platform along with the objects picked from the object database. The rules engine subsystem enables the picking and display of the right objects on the VR platform based on the option selected. The user-selected options are displayed, recognized and routed to the rules engine subsystem by the iReU engine subsystem, which acts as the content engine. When an authorized user - device - license combination is granted access for use by the license manager, the user is able to perform operations (a sequence of steps, as in an experiment) virtually using gesture actions.
[0001] The present invention generally relates to a gesture based virtual reality learning system and, more particularly, relates to a hand gesture based immersive and interactive virtual reality learning environment.
BACKGROUND:
[0002] Virtual environments are increasingly gaining the interest of users in numerous fields, including the field of learning. Learning by doing practical exercises/experiments has always been the norm for understanding the practical effect of any concept or idea. Performing experiments is the widely used methodology in the process of learning and understanding science. Be it the natural sciences or the applied sciences like medical science or aerospace, this mode of learning is considered very effective. Traditionally, experiments/exercises were conducted or performed at learning centres to impart practical knowledge to the seekers. However, due to a variety of reasons, viz. cost factor, availability and space constraints among others, knowledge seekers did not fully benefit, and the teaching methods remained non-immersive and non-interactive. Moreover, a large number of knowledge seekers were made to watch how a particular exercise is performed instead of performing it themselves. Also, performing experiments can be very expensive and hazardous at times. Some experiments also pose health risks and sustainability considerations warranting regulated waste disposal. Trends in the learning space have identified experiential learning as providing huge impetus to the learning outcomes of knowledge seekers.
[0003] While there is a huge cost factor involved in the capital expenditure incurred in setting up experiment facilities and equipment, there are also considerable running costs due to maintenance of equipment, replacement of breakages and replenishment of consumables. Performing experiments repetitively for better understanding increases these disadvantages proportionately. Above all, the ability of these physical setups to reach the learner is of considerable importance when it comes to geographically remote and economically disadvantaged communities.
[0004] The solution to these challenges is rendered by the use of augmented reality and virtual reality technologies in the learning process.
[0005] Virtual laboratories provide an immersive, interactive experience in performing experiments and understanding concepts. In three-dimensional virtual environments, the users have access to and interact with three-dimensional virtual objects, mostly with the help of enabling tools, viz. VR headsets, VR glasses, Hololens, Leap Motion products, gloves, and tracking sensors/devices. In reality, users access and interact with real objects without the support of the tools mentioned above.
[0006] Although there are interactive virtual reality learning environments in the prior art, none of them are able to provide the required interactive interface, non-wearable gesture recognition, experiential learning and choice of content.
[0007] Accordingly, there is a need for an interactive virtual reality learning environment adapted to conduct learning without the use of any assistive tools, i.e. through the use of gestures, and with a wide choice of content adapted to various relevant scenarios.
SUMMARY:
[0008] To address these drawbacks in the conventional 'learning through experiments' process, and also to extend the domain of experiments to training and rehearsals, the embodiments of the present invention completely circumvent the disadvantages of physical objects and physical environments and adopt an interactive virtual reality system that works on the concept of dematerialized experiential learning.
[0009] Embodiments of the present invention provide an interactive virtual reality system which allows users to conduct learning operations using hand gestures recognized by a gesture controller device, without the use of any wearables. The interactive virtual reality system engine is capable of providing the method and content for the options in respect of various operations performed during the conduct of the experiments selected by the user from a wide range of available options. While the user performs the operations for learning experiments, the virtual scene as well as the experiments conducted virtually on the Virtual Reality (VR) platform can be witnessed by a wider audience or an individual, depending on the display devices connected to the computer.
[0010] In a first embodiment, a method for performing learning operations in an interactive virtual reality system (iReU, Interactive Reality Enabled Universe) is described, the method comprising: validating the credentials of the user upon the user logging into the iReU system; providing access to a Virtual Reality (VR) platform upon successful validation of the user; providing, to the user, various options of learning operations available with the VR platform; selecting, by the user, the option to a specific learning operation; displaying, on the VR platform, the relevant objects of the chosen learning operation; performing, by the user, hand based gestures in focus of the gesture controller device, for handling the objects required for performing the learning operation; and providing, to the user, information about the performed learning operation.
[0011] In a second embodiment, an interactive virtual reality system (iReU) for performing learning operations is described, the system comprising: a gesture controller device; and a computing system configured to: validate the credentials of the user upon the user logging into the iReU system; provide access to a Virtual Reality (VR) platform upon successful validation of the user; provide, to the user, various options of learning operations available with the VR platform; select, by the user, the option to a specific learning operation; display, on the VR platform, the relevant objects of the chosen learning operation; perform, by the user, hand based gestures in focus of the gesture controller device, for handling the objects required for performing the learning operation; and provide, to the user, information about the performed learning operation.
[0012] The above and other preferred features, including various novel details of implementation and combination of elements, will now be more particularly described with reference to the accompanying drawings and pointed out in the claims. It will be understood that the particular methods and apparatus are shown by way of illustration only and not as limitations. As will be understood by those skilled in the art, the principles and features explained herein may be employed in various and numerous embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS:
[0013] The accompanying drawings, which are included as part of the present specification, illustrate the presently preferred embodiment of the present invention and, together with the general description given above and the detailed description of the preferred embodiment given below, serve to explain and teach the principles of the present invention.
[0014] FIG. 1 is a flow diagram of a processing system that is configured to capture the hand gestures of the user and allow the user to interact with the interactive virtual reality system to perform certain learning operations in a virtual scene in an interactive manner.
[0015] FIG. 2 shows a schematic block diagram of an interactive virtual reality system for performing certain learning operations in a virtual scene in an interactive manner.
[0016] FIG. 3 shows a perspective schematic representation of an embodiment of a user using the interactive virtual reality system from FIG. 2.
[0017] FIG. 4 shows a perspective view of an example of a virtual scene, which is used to perform certain interactive learning operations according to one of the Figs. 1 to 2, during a certain learning situation.
[0018] FIG. 5, FIG. 6 and FIG. 7 show three perspective views of an example of a set of objects that are used for the chosen interactive virtual learning operation, with the user performing certain specific actions like holding and pinching, using hand gestures, according to some embodiments.
[0019] FIG. 8 and FIG. 9 show two perspective views of an example of a virtual scene used for the chosen interactive virtual learning operation, with the hand gestures of the user performing certain specific navigating actions, according to some embodiments.
[0020] The foregoing and other objects, features and advantages of the invention will be apparent from the following detailed description in conjunction with the drawings described hereinafter. It is to be appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope.
DETAILED DESCRIPTION:
[0021] The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
[0022] The present invention provides for an interactive virtual reality system to conduct learning operations virtually using hand gestures recognized by a gesture controller device, without the use of any wearable.
[0023] To this end, the present invention provides a user with the experience of performing learning operations as close to the physical world as possible, while an iReU engine provides the method and content for the options and conduct of the said experiments. The dematerialized experiential learning system provides a platform to perform experiments that are pre-configured, pre-contextualized, repeatable and devoid of any physical experimental objects by using interactive virtual reality. There is no device attached to the human body while performing the operations for experiments. It is intended for use by a wide spectrum of learners irrespective of their age, affiliation to any institution or organization, or geography. It dematerializes hazardous situations and environments, protects valuable human life and provides high precision access to experimental objects, such as in health care applications. It provides the platform for learning by observation, as in the case of an instructor led experiment. When physical environment actions are emulated, the system need not encompass all of them in the virtual environment. For example, the system also allows automation of non-value-added sequences while performing the operations during an experiment, as in the case of showing the filled container instead of showing the liquid flowing into the container while it is being poured. Thereby, the system not only automates the non-value-added actions, but also reduces the learning time, making learning more effective.
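The skipping of non-value-added sequences described above can be sketched as a simple state transition. This is an illustrative sketch only; the function and field names are hypothetical and not part of the iReU implementation.

```python
# Illustrative sketch: instead of animating the liquid flow, the pour
# action transitions the container directly to its post-pour state,
# as in the filled-container example in paragraph [0023].

def pour(container, volume_ml, animate=False):
    """Move the container to its post-pour state. When animate is False
    the intermediate flow frames are skipped entirely (non-value-added)."""
    if animate:
        # A full implementation would render intermediate fill levels
        # frame by frame here; omitted in this sketch.
        pass
    container["level_ml"] += volume_ml  # final state shown immediately
    if container["level_ml"] >= container["capacity_ml"]:
        container["state"] = "filled"
    else:
        container["state"] = "partially filled"
    return container

beaker = {"capacity_ml": 100, "level_ml": 0, "state": "empty"}
pour(beaker, 100)
```

The `animate` flag captures the design choice in the text: the same action can either be emulated in full or collapsed to its end state to reduce learning time.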
[0024] FIG. 1 is a flowchart illustrating an exemplary embodiment of a method for conducting learning operations virtually in an interactive virtual reality system that is configured to infer the hand gestures of a user 100 and allow the user 100 to interact with the interactive virtual reality
system, iReU system 99 to perform certain learning operations in a virtual scene in an interactive manner.
[0025] With reference to FIG. 1, with user login enabled, at step 10, the user 100 logs into the iReU system 99 to gain access. Based on the validity of the license and the gesture controller device 200 - computer 300 combination, which is discussed in the description below, the user 100 gains access to the VR platform 500 at step 20. Depending on the nature of the license the user 100 has procured, the user 100 will be provided with various options of operations possible with the VR platform 500 for learning operations. At step 25, the user 100 makes the choice of the application he/she wants to work with and navigates through the options to a specific operation for learning. Based on the options selected by the user, at step 30, the iReU engine 800 interacts with the rules engine 700 to fetch the objects relevant to the experiment selected. The rules engine fetches the right objects from the objects database 600 and returns the same to the iReU engine 800 for display on the VR platform 500. At step 35, the user 100 performs hand gestures to start handling the objects required for conducting the operation virtually, very similar to the way he/she would perform an experiment in the physical world. Depending on the user gestures and steps followed while conducting the operation, at step 40, the iReU engine 800 responds to user interactions, working with the rules engine 700. At step 45, the rules engine 700 in turn coordinates with the exception handler 1000, analytics engine 1100 and learner performance manager 1200 modules, based on user interactions, to provide inputs to the iReU engine 800 for display on the VR platform 500. While the user 100 performs the operations for learning, the virtual scene as well as the experiments conducted virtually on the VR platform 500 can be witnessed by an audience 1400, depending on the display devices 1300 connected to the computer 300.
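The interaction between the iReU engine and the rules engine at step 30 can be sketched as follows. All class names, method names and the sample database contents are illustrative assumptions, not actual iReU identifiers.

```python
# Illustrative sketch of step 30 of FIG. 1: the engine asks the rules
# engine for the objects relevant to the selected experiment, and the
# rules engine picks them from the objects database (600).

class RulesEngine:
    def __init__(self, objects_db):
        self.objects_db = objects_db  # stands in for objects database 600

    def fetch_objects(self, experiment):
        # Return the virtual objects relevant to the selected experiment.
        return self.objects_db.get(experiment, [])

class IReUEngine:
    def __init__(self, rules_engine):
        self.rules_engine = rules_engine

    def start_experiment(self, experiment):
        # Fetch the relevant objects via the rules engine; the caller
        # would hand them to the VR platform (500) for display.
        return self.rules_engine.fetch_objects(experiment)

objects_db = {"titration": ["test tube", "beaker", "pH chart"]}
engine = IReUEngine(RulesEngine(objects_db))
scene_objects = engine.start_experiment("titration")
```

The sketch only covers object retrieval; steps 35 onward (gesture handling and exception routing) involve the other modules described below.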
[0026] FIG. 2 shows a schematic block diagram illustrating an exemplary embodiment of the interactive virtual reality system 99 for performing learning operations in a virtual scene in an interactive manner. The iReU system 99 typically encompasses the following components. The core of the iReU system 99 is the iReU engine 800, which provides the method and content for the options in respect of various operations using the interactive virtual reality system and the conduct of the experiments selected by the user 100. The iReU engine 800 coordinates with all the components of the iReU system 99 to provide the user 100 with virtual experiential learning of experiments, as close to the physical world experience as possible. The gesture controller device 200 is a plug and play device connected to the computer 300. When the gesture controller device 200 is connected to the computer 300 for the first time, during the installation process, it installs the relevant device drivers 400 on the computer 300 it is connected to. The gesture controller device 200 is a sensor based device, packaged along with the iReU system 99, that can recognize the hand gestures of the user 100 and communicate the same to the VR platform 500 through the device drivers 400.
[0027] To use the iReU system 99, the first activity the user 100 needs to perform is to log into the iReU system 99. As the user 100 provides the login credentials, the license manager 900 validates the same. The license manager 900 contains the most up-to-date details of all users and their license profiles. The license manager 900 ensures the validity of the license in terms of its duration as well as the type of experiments the user 100 has access to; in addition, the gesture controller device 200 - computer 300 combination for which the license has been issued is matched. Upon a successful outcome of the validation, the user 100 gains access to the VR platform 500. In case of any mismatch, the user 100 will not be allowed to log in and appropriate messages will be posted to the user 100 for further action.
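The license check described in paragraph [0027] amounts to matching every element of the user - device - computer combination plus the license validity period. A minimal sketch, assuming hypothetical field names for the license record:

```python
# Illustrative sketch of the license manager (900) validation: access is
# granted only when the user, gesture controller device, computer and
# license validity all match. Field names are assumptions.
from datetime import date

def validate_license(record, user_id, device_id, computer_id, today):
    """Return True only if the full licensed combination matches and the
    license has not expired; any single mismatch denies access."""
    return (
        record["user"] == user_id
        and record["device"] == device_id
        and record["computer"] == computer_id
        and today <= record["expires"]
    )

record = {"user": "u1", "device": "gc-200", "computer": "pc-300",
          "expires": date(2030, 1, 1)}
granted = validate_license(record, "u1", "gc-200", "pc-300", date(2024, 6, 1))
denied = validate_license(record, "u1", "gc-999", "pc-300", date(2024, 6, 1))
```

On a `False` outcome, the system would post an appropriate message to the user instead of opening the VR platform, as described above.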
[0028] Depending on the nature of the license the user 100 has procured, access will be provided to the user 100 to the menu items of experiments that can be performed. The license may be a trial license, or a license for learning specific experiments in an individual licensed mode or in an organizational licensed mode, allowing the user 100 to perform the experiments in the mode that the user 100 wants, be it for learning, practicing or evaluating the understanding of the user 100; accordingly, the user 100 will be provided with various options of operations possible with the VR platform 500 for learning experiments. The user 100 makes the choice of the application he/she wants to work with and navigates through the options to a specific experiment for learning. The license manager 900, the rules engine 700 and the VR platform 500 are tightly integrated through the iReU engine 800, and hence this selection of application and experiment goes smoothly once the user 100 logs in.
[0029] Once the specific experiment is selected by the user 100, the iReU engine 800 works with the rules engine 700 to get the relevant objects displayed on the VR platform 500. The rules engine fetches from the objects database 600 the objects required to perform the operation, including any additional objects involved in the experiment that may be required during subsequent steps depending on the conduct of the experiment, and interacts with the iReU engine to return the objects. The objects database 600 is populated with all relevant virtual objects such as test tubes, beakers, flasks, a pH chart etc., as in the case of a chemistry experiment, and with accessories and equipment like a prism, pendulum, light source, heat source, sonometer etc., as in the case of a physics experiment. In an aspect, the objects database 600 also contains virtual objects such as the heart, liver, kidney, flowers, roots etc. in whole and dissected condition, relevant to understanding and performing biological experiments. In another aspect, the objects database 600 also contains virtual objects such as a jet engine, fuselage, wings, rudder etc. in whole (assembled) and in part (sub-assemblies and un-assembled) condition, relevant to understanding and performing aircraft maintenance. The objects database 600 is a complete and expanding repository of all virtual objects required for any experiment that can be performed virtually using the iReU system 99. All the objects required for all the experiments possible with the iReU system are made available and continuously updated in the objects database 600, with corresponding rules built into the rules engine 700. The user 100 performs
hand gestures in the focus of the gesture controller device 200 while handling the objects required for conducting the experiment virtually, very similar to the way he/she would perform an experiment in the physical world. This involves capturing the user 100 hand gestures by the gesture controller device 200 and conveying them from the VR platform 500 to the iReU engine 800 to progress through the subsequent steps and complete the experiment, depending on the mode selected by the user 100.
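The organization of the objects database sketched in paragraph [0029] can be illustrated as a per-discipline catalog from which the rules engine picks the required set. The dictionary keys and lookup function below are hypothetical; only the example object names come from the text.

```python
# Illustrative sketch of the objects database (600): virtual objects
# grouped by discipline, mirroring the examples given in [0029].

OBJECTS_DB = {
    "chemistry": ["test tube", "beaker", "flask", "pH chart"],
    "physics": ["prism", "pendulum", "light source", "heat source", "sonometer"],
    "biology": ["heart", "liver", "kidney", "flower", "root"],
    "aircraft maintenance": ["jet engine", "fuselage", "wings", "rudder"],
}

def fetch_objects(discipline, required):
    """Return the requested objects for an experiment; a missing object
    is reported rather than silently dropped."""
    available = OBJECTS_DB.get(discipline, [])
    missing = [obj for obj in required if obj not in available]
    if missing:
        raise KeyError(f"objects not in database: {missing}")
    return required

scene = fetch_objects("chemistry", ["test tube", "pH chart"])
```

A real repository would also carry geometry, state (whole/dissected, assembled/un-assembled) and the rules linking each object to its experiments.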
[0030] The rules engine 700 is a repository of all the rules required for the conduct of the experiment. It coordinates between the iReU engine 800 and all other components - namely the objects database 600, license manager 900, exception handler 1000, analytics engine 1100 and learner performance manager 1200 - required either for directly performing the experiments or for gathering additional information, based on the performance of the experiment, for processing and future use.
[0031] According to certain aspects, the exception handler 1000 works with the iReU engine 800 through the rules engine 700 and serves as the repository and hub for handling any exception that arises while the user 100 performs an experiment. It provides appropriate messages to the user 100, depending on the user's hand gestures, to enable the user to take any corrective action and progress with the operations of the experiment. This involves continuous interaction between the iReU engine 800 and the exception handler 1000: non-conforming activities are trapped by the VR platform 500 and subsequently by the iReU engine 800, relayed to the exception handler 1000, and appropriate messages are displayed back on the VR platform 500 in real time. The exception handler 1000 also captures any unforeseen errors encountered while performing the experiments and is responsible for providing the right message to be displayed on the VR platform 500 when the user 100 commits an error or an unforeseen error occurs. The iReU engine 800 validates the actions performed by the user 100 against the rules defined in the rules engine 700 and raises an exception when rules are violated. This exception is captured by the exception handler 1000 and feedback is provided to the user 100. The user 100 hand gestures, captured by the gesture controller device 200 through the VR platform 500, are continuously monitored and transmitted to the iReU engine 800, so that the iReU engine 800 can work with the rules engine 700 to provide a complete virtual learning experience to the user 100 in performing and completing the experiment.
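The validate-raise-trap-message loop of paragraph [0031] maps naturally onto exception handling. The rule set, exception class and message text below are hypothetical illustrations of that loop, not the actual rules engine contents.

```python
# Illustrative sketch of [0031]: the engine validates each gesture-driven
# action against the rules (700); a violation raises an exception, which
# the exception handler (1000) turns into a corrective on-screen message.

class RuleViolation(Exception):
    pass

# Hypothetical rule: pouring requires the target vessel to be held first.
RULES = {"pour": lambda step: step["target_held"]}

def validate_action(action, step):
    rule = RULES.get(action)
    if rule and not rule(step):
        raise RuleViolation(f"'{action}' requires the target to be held first")

def handle_action(action, step):
    """Return the message the VR platform should display for this action."""
    try:
        validate_action(action, step)
        return "ok"
    except RuleViolation as exc:  # role of exception handler 1000
        return f"Corrective action needed: {exc}"

msg = handle_action("pour", {"target_held": False})
```

In the described system this loop runs continuously against the gesture stream, so each corrective message appears in real time on the VR platform.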
[0032] According to certain aspects, the analytics engine 1100 captures all details of the user 100 while he/she performs the experiments and is able to analyze and provide inputs and suggestions in real time as well as based on the history of the user 100 in performing experiments. In another aspect of the invention, apart from the user perspective, the analytics engine 1100 also has the ability to provide inputs based on the user 100 and the organizational or institutional usage of the invention. The analytics engine 1100 provides the user 100 with insights based on his/her performance during past virtual learning sessions and suggests additional experiments
to make the learning more oriented towards a specific established learning objective. The analytics engine 1100 allows the organization to benefit as a group learning entity, based on past learning data and patterns obtained from individual learner usage and performance, as well as insights obtained from user behavior during the conduct of past virtual experiments. It provides the ability to slice and dice learner usage data for organizational decision making. Effectively, through the interaction of the user 100 (individual or organization) with the interactive virtual reality system, the user 100 gets a seamless virtual learning experience with benefits beyond the ability of the physical learning environment and a physical human instructor, exchanging learning data and information for the benefit of the user 100 (individual, organization or enterprise).
[0033] According to certain aspects, the learner performance manager 1200 captures, stores and provides all data pertaining to the mode and conduct of the experiments that a user 100 performs. It also serves as the data repository for preparing reports out of the iReU system 99 for the user 100 as well as the organization. The learner performance manager 1200 provides the user 100 with information (a dashboard) on his/her performance during past virtual learning sessions, to make the learning more useful and to ensure coverage of all experiments. It also provides information about the modes taken up by the user 100 for each of the experiments available for learning, as well as the user's evaluation outcomes from the experiments, derived from the stored history information. This makes the user's interaction with the interactive virtual reality system a seamless virtual learning experience simulating the physical learning environment, as if the user 100 were assisted by a physical human instructor providing feedback on the user's performance in past learning sessions, thereby enlarging the scope of learning.
[0034] FIG. 3 shows a perspective schematic representation of an embodiment of a user 100 using the iReU system 99. In this specific representation, the user 100 is seen navigating through the options of various operations possible in the iReU system 99, after logging in. The illustration also shows the gesture controller device 200 connected to the computer 300.
[0035] FIG. 4 shows a perspective view of an example of a virtual scene on the VR platform 500, which is used to perform certain interactive learning operations. In this specific representation, the virtual scene illustrated is that of a setup of a chemistry laboratory environment. This virtual scene has been brought in by the iReU engine 800 through the rules engine 700 and objects database 600 based on the options chosen by the user 100 to perform certain chemistry based experiment.
[0036] FIG. 5 shows a perspective view of the VR platform 500 with an example of a virtual scene containing a set of test tubes that are used for the chosen interactive virtual learning operation, a specific experiment in chemistry, subsequent to the chemistry laboratory environment illustrated in FIG. 4. The user 100 is seen performing a specific action, such as stretching his/her hand to approach the test tubes in order to hold a test tube
virtually for performing an experiment, with the VR platform 500 displaying the virtual hand simulating the user 100 action. In the same scenario, FIG. 6 depicts an example of the virtual hand getting even closer to the test tube, just before the virtual object is grabbed. Similarly, in the same chemistry laboratory scenario, FIG. 7 shows an example of a specific action of the user 100, holding and operating a filler to pour a liquid into one of the test tubes. In all these illustrations, it can be seen that the virtual hand simulates the physical hand gestures and carries out the intended operation virtually while conducting the experiments.
[0037] FIG. 8 shows a perspective view of an example of a virtual scene of performing chemistry experiments on the VR platform 500, using the iReU system 99, again an extension of the examples illustrated in FIG. 4 through FIG. 7. Here it can be seen that the user 100 is using certain specific hand gestures to navigate the virtual scene. The user 100 can be seen using his/her left index finger to zoom out of the virtual scene, in this example that of a chemistry laboratory, after performing the experiments described in FIG. 5 through FIG. 7. Similarly, in FIG. 9, the user 100 can be seen using his/her inverted left hand with the thumb pointing to his/her left, indicating a panning action for the example virtual scene, a chemistry laboratory. This way, the user 100 can move the focus of the virtual scene around to reach objects in various places of the virtual chemistry laboratory, simulating the physical world operation. These navigating actions can be configured to specific hand gestures pre-set by the user 100 in the iReU engine 800 through the VR platform 500.
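The user-configurable gesture bindings mentioned at the end of paragraph [0037] can be sketched as a simple gesture-to-action map. The gesture names, action names and default bindings below are assumptions for illustration; the defaults merely echo the two example gestures in the text.

```python
# Illustrative sketch of pre-set navigation bindings ([0037]): hand
# gestures mapped to navigation actions, re-mappable by the user.

DEFAULT_BINDINGS = {
    "left_index_point": "zoom_out",   # FIG. 8 example
    "left_thumb_left": "pan_left",    # FIG. 9 example
}

def bind_gesture(bindings, gesture, action):
    """Return a new binding table with one gesture re-mapped, leaving
    the original defaults untouched."""
    updated = dict(bindings)
    updated[gesture] = action
    return updated

def resolve(bindings, gesture):
    # Unbound gestures produce no navigation action.
    return bindings.get(gesture, "no_action")

custom = bind_gesture(DEFAULT_BINDINGS, "right_fist", "zoom_in")
action = resolve(custom, "right_fist")
```

Returning a copy from `bind_gesture` keeps the shipped defaults intact, so a user's pre-sets can be reset without reinstalling.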
[0038] Note that not all of the activities or elements described above in the general description are required, that a portion of a specific activity or device may not be required, and that one or more further activities may be performed, or elements included, in addition to those described. Still further, the order in which activities are listed is not necessarily the order in which they are performed. Also, the concepts have been described with reference to specific embodiments. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present disclosure.
[0039] The steps of a method or algorithm described in connection with the present disclosure may be embodied directly in hardware, in a software module executed by a processor, in firmware, or in any combination thereof. A software module may reside in any form of storage medium that is known in the art. Some examples of storage media that may be used include random access memory (RAM), read only memory (ROM), flash memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM and so forth. A storage medium may be coupled to a processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor.
[0040] The methods disclosed herein comprise one or more steps or actions for achieving the described method. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is specified, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.
[0041] It is to be understood that the claims are not limited to the precise configuration and components illustrated above. Various modifications, changes and variations may be made in the arrangement, operation and details of the methods and devices described above without departing from the scope of the claims, as defined below.
A method for performing learning operations in an interactive virtual reality system (iReU), the method comprising:
validating the credentials of a user upon the user logging into the iReU system;
providing access to a Virtual Reality (VR) platform upon successful validation of the user;
providing, to the user, various options of learning operations available with the VR
platform;
selecting, by the user, the option to a specific learning operation;
displaying, on the VR platform, the relevant objects of the chosen learning operation;
performing, by the user, hand based gestures in focus of the gesture controller device, for
handling the objects required for performing the learning operation;
providing, to the user, information about the performed learning operation.
The method as claimed in claim 1, wherein the validating the credentials of the user comprises, matching the validity of the license in terms of the duration as well as the type of application the user has access to, and the gesture controller device identity - computer identity combination for which the license has been issued.
The method as claimed in claim 1, comprising gaining access, by the user, to the VR platform upon a successful outcome of the validation.
The method as claimed in claim 1, wherein the user is provided with various options of learning operations available with the VR platform for learning, depending on the nature of the license the user has procured.
The method as claimed in claim 1, comprising selecting the options of learning operations by the user.
The method as claimed in claim 1, comprising enabling, based on the user selected option, interaction of the interactive virtual reality engine (iReUE) with a rules engine (RE) to fetch the objects relevant to the learning operation selected.
The method as claimed in claim 1, comprising: picking the right objects from an Objects Database by the rules engine and returning the same to the iReU engine (iReUE) for displaying on the VR platform.
The method as claimed in claim 1, comprising: providing appropriate messages to the user depending on the user's hand gestures, to enable the user to take any corrective action to progress with the learning operation.
The method as claimed in claim 1, comprising interacting between the iReU Engine and an Exception Handler, wherein non-conforming activities are trapped by the VR platform and subsequently by the iReU Engine, relayed to the Exception Handler, and appropriate messages are displayed back on the VR platform in real time.
The method as claimed in claim 9, wherein the Exception Handler serves as the repository and hub for handling any exception, providing appropriate messages to the user depending on the user's hand gestures to enable the user to take corrective action and progress with the learning operation; this involves continuous interaction between the iReU Engine and the Exception Handler, wherein non-conforming activities are trapped by the VR platform and subsequently by the iReU Engine, relayed to the Exception Handler, and the appropriate messages are displayed back on the VR platform in real time.
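The exception-handler interaction of claims 9 and 10, trapping non-conforming activities and mapping each to a corrective message displayed on the VR platform, can be sketched as follows. The event names, message table, and function name are illustrative assumptions.

```python
# Hedged sketch of the Exception Handler as a repository/hub:
# every trapped non-conforming activity yields a user-facing message
# so the learner can take corrective action in real time.
MESSAGES = {
    "object_out_of_reach": "Move your hand closer to the object.",
    "wrong_sequence": "Complete the previous step before this one.",
}

def exception_handler(event):
    # Unknown events still produce a message, so the VR platform
    # always has feedback to display.
    return MESSAGES.get(event, "Unrecognized gesture; please retry.")

# Events relayed by the iReU Engine; messages go back to the VR platform.
displayed = [exception_handler(e) for e in ("wrong_sequence", "teleport")]
```

Centralizing the mapping in one handler matches the claim's "repository and hub" description: the VR platform and iReU Engine only trap and relay events, while the handler alone decides the corrective message.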
The method as claimed in claim 1, wherein the VR platform responds in real time to the user's hand gestures captured by the gesture controller device.
An interactive virtual reality system (iReU) for performing learning operations, the system
comprising:
a gesture controller device;
a computing system configured to:
validate the credentials of a user upon the user logging into the iReU system;
provide access to a Virtual Reality (VR) platform upon successful validation of the user;
provide, to the user, various options of learning operations available with the VR platform;
select, by the user, the option corresponding to a specific learning operation;
display, on the VR platform, the relevant objects of the chosen learning operation;
perform, by the user, hand-based gestures within the focus of the gesture controller device, for
handling the objects required for performing the learning operation; and
provide, to the user, information about the performed learning operation.
| # | Name | Date |
|---|---|---|
| 1 | 201941021093-STATEMENT OF UNDERTAKING (FORM 3) [28-05-2019(online)].pdf | 2019-05-28 |
| 2 | 201941021093-FORM FOR SMALL ENTITY(FORM-28) [28-05-2019(online)].pdf | 2019-05-28 |
| 3 | 201941021093-FORM 1 [28-05-2019(online)].pdf | 2019-05-28 |
| 4 | 201941021093-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [28-05-2019(online)].pdf | 2019-05-28 |
| 5 | 201941021093-DRAWINGS [28-05-2019(online)].pdf | 2019-05-28 |
| 6 | 201941021093-DECLARATION OF INVENTORSHIP (FORM 5) [28-05-2019(online)].pdf | 2019-05-28 |
| 7 | 201941021093-COMPLETE SPECIFICATION [28-05-2019(online)].pdf | 2019-05-28 |
| 8 | 201941021093-Proof of Right (MANDATORY) [20-06-2019(online)].pdf | 2019-06-20 |
| 9 | 201941021093-FORM-26 [20-06-2019(online)].pdf | 2019-06-20 |
| 10 | Correspondence by Agent_Power of Attorney_21-06-2019.pdf | 2019-06-21 |
| 11 | Correspondence by Agent_Form-1_21-06-2019.pdf | 2019-06-21 |
| 12 | 201941021093-STARTUP [09-10-2019(online)].pdf | 2019-10-09 |
| 13 | 201941021093-FORM28 [09-10-2019(online)].pdf | 2019-10-09 |
| 14 | 201941021093-FORM-9 [09-10-2019(online)].pdf | 2019-10-09 |
| 15 | 201941021093-FORM 18A [09-10-2019(online)].pdf | 2019-10-09 |
| 16 | 201941021093-FER.pdf | 2019-12-03 |
| 17 | 201941021093_search_03-12-2019.pdf | |
| 18 | 201941021093-ABSTRACT [14-05-2020(online)].pdf | 2020-05-14 |
| 19 | 201941021093-CLAIMS [14-05-2020(online)].pdf | 2020-05-14 |
| 20 | 201941021093-COMPLETE SPECIFICATION [14-05-2020(online)].pdf | 2020-05-14 |
| 21 | 201941021093-CORRESPONDENCE [14-05-2020(online)].pdf | 2020-05-14 |
| 22 | 201941021093-DRAWING [14-05-2020(online)].pdf | 2020-05-14 |
| 23 | 201941021093-FER_SER_REPLY [14-05-2020(online)].pdf | 2020-05-14 |
| 24 | 201941021093-OTHERS [14-05-2020(online)].pdf | 2020-05-14 |
| 25 | 201941021093-Retyped Pages under Rule 14(1) [03-06-2020(online)].pdf | 2020-06-03 |
| 26 | 201941021093-2. Marked Copy under Rule 14(2) [03-06-2020(online)].pdf | 2020-06-03 |
| 27 | 201941021093-Correspondence to notify the Controller [08-12-2020(online)].pdf | 2020-12-08 |
| 28 | 201941021093-Written submissions and relevant documents [24-12-2020(online)].pdf | 2020-12-24 |
| 29 | 201941021093-PETITION UNDER RULE 137 [24-12-2020(online)].pdf | 2020-12-24 |
| 30 | 201941021093-US(14)-HearingNotice-(HearingDate-11-12-2020).pdf | 2021-10-17 |