Abstract: The present disclosure pertains to a system and method for augmented reality assisted learning. The system (100) includes an image recognition unit (102), an augmented reality (AR) engine (104), and a display unit (106). The image recognition unit (102) is configured to scan one or more circuits and correspondingly generate a first set of signals. The AR engine (104) is configured to extract a second set of signals, where the second set of signals pertain to the scanned one or more components (108) of the one or more circuits, identify the scanned one or more components (108) from at least one of the one or more circuits, and match the one or more components (108) with a dataset and correspondingly generate a third set of signals. The display unit (106) is configured to display an AR view of the function of the one or more components based on the third set of signals.
[0001] The present disclosure relates generally to the field of digital systems. More particularly, the present disclosure provides a system and method for augmented reality (AR) assisted learning.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Development of stand-alone product development boards can be expeditious. Such a rate of development makes it challenging for learners to understand the basic functionality of the different components embedded on the development boards. Further, the internal operations occurring inside an integrated circuit are difficult for learners to visualize, which makes it challenging for them to understand the concepts behind those operations. Hence, there is a need for a system that can solve these issues and make learning more interactive and useful.
[0004] Existing solutions include simulators that can work upon the Arduino platform. However, a platform capable of visualization and real-time interfacing of simulators with real hardware is absent. Another solution is the integrated development environment (IDE). However, IDEs can explain the role and function of the different components embedded in development boards only with certain limitations.
[0005] There is a need to overcome the above-mentioned problems of the prior art by providing a solution that enables visualizing the internal operations and functions of the different components inside a development board. Learners can interact with real-time hardware using the solution and can thereby improve their learning.
OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are listed herein below.
[0007] It is an object of the present disclosure to provide a system and a method that facilitate increasing the self-learning capability of students and other interested entities.
[0008] It is an object of the present disclosure to provide a system that offers an interactive environment for learning.
[0009] It is an object of the present disclosure to provide a system with a three-dimensional generated visualization of different components, such as a resistor, a sensor, a light emitting diode, and the like, embedded on a development board.
[0010] It is an object of the present disclosure to provide a system that enables self-assessment for students using quick knowledge check quizzes.
[0011] It is an object of the present disclosure to provide a system where students can analyze things in a better way.
[0012] It is an object of the present disclosure to provide a system that fosters the learning process of students.
SUMMARY
[0013] The present disclosure relates generally to the field of digital systems. More particularly, the present disclosure provides a system and method for augmented reality (AR) assisted learning.
[0014] An aspect of the present disclosure pertains to an augmented reality assisted learning system including an image recognition unit, an augmented reality (AR) engine and a display unit. The image recognition unit may be configured to scan one or more circuits, where the one or more circuits may be configured to accommodate one or more components, and correspondingly generate a first set of signals. The augmented reality (AR) engine may be operatively coupled to the image recognition unit, where the AR engine may include one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors. The AR engine may be configured to extract a second set of signals from the first set of signals, where the second set of signals may pertain to the scanned one or more components of the one or more circuits. The AR engine may be configured to identify the scanned one or more components from at least one of the one or more circuits. The AR engine may be configured to match the one or more components of the at least one of the one or more circuits with a dataset, where the dataset may include predetermined one or more circuits, and correspondingly generate a third set of signals. The display unit may be operatively coupled with the AR engine and configured to display an AR view of the function of the one or more components based on the third set of signals.
[0015] In an aspect, the image recognition unit may include any or a combination of camera, and scanner.
[0016] In an aspect, the one or more components may include any or a combination of resistor, light emitting diode, thermistor, and sensor.
[0017] In an aspect, the display unit may be associated with a mobile computing unit, where the mobile computing unit may be related to a first set of entities, where the first set of entities may include any or a combination of teacher, instructor, and professor.
[0018] In an aspect, the AR engine may be communicatively coupled with the one or more mobile computing devices through a communication module, and where the displayed function of the one or more components of at least one of the one or more circuits may be transmitted to the one or more mobile computing devices.
[0019] In an aspect, the one or more mobile computing devices may be associated with a second set of entities, and where the second set of entities may include any or a combination of student, learner, pupil, scholar and observer.
[0020] In an aspect, the one or more circuits may include any or a combination of embedded circuit, and integrated circuit.
[0021] In an aspect, the system may include a power source operatively coupled with the display unit and the image recognition unit, where the power source may be configured to supply electric power to the system.
[0022] In an aspect, the power source may include any or a combination of battery, cell, inductor, and power line.
[0023] Another aspect of the present disclosure pertains to an augmented reality assisted learning method including steps of: scanning, by an image recognition unit, one or more circuits, where the one or more circuits may be configured to accommodate one or more components, and the image recognition unit correspondingly generates a first set of signals; extracting, by an augmented reality (AR) engine, a second set of signals from the first set of signals, where the second set of signals may pertain to the scanned one or more components of the one or more circuits; identifying, by the AR engine, the scanned one or more components from the one or more circuits; matching, by the AR engine, the one or more components of the one or more circuits with a dataset, where the dataset may include predetermined one or more circuits, and correspondingly generating a third set of signals; and displaying, by a display unit, an AR view of the function of the one or more components based on the third set of signals.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0025] The diagrams are for illustration only and thus do not limit the present disclosure, wherein:
[0026] FIG. 1 illustrates a block diagram of proposed augmented reality (AR) assisted learning system, in accordance with an embodiment of the present disclosure.
[0027] FIG. 2 illustrates exemplary functional components of an augmented reality (AR) engine of the proposed augmented reality (AR) assisted learning system, in accordance with an embodiment of the present disclosure.
[0028] FIG. 3 illustrates an exemplary view of the proposed augmented reality (AR) assisted learning system, in accordance with an embodiment of the present disclosure.
[0029] FIG. 4 illustrates a flow diagram illustrating a proposed augmented reality (AR) assisted learning method, in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[0030] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0031] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0032] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0033] The present disclosure relates generally to the field of digital systems. More particularly, the present disclosure provides a system and method for augmented reality (AR) assisted learning.
[0034] FIG. 1 illustrates a block diagram of proposed augmented reality (AR) assisted learning system, in accordance with an embodiment of the present disclosure.
[0035] As illustrated in FIG. 1, the proposed system 100 (also referred to as the system 100, herein) can include an image recognition unit 102, an augmented reality (AR) engine 104 and a display unit 106. The AR engine 104 can be operatively coupled with the image recognition unit 102 and the display unit 106. The system 100 can facilitate AR assisted learning for a second set of entities through a first set of entities, where the first set of entities can include any or a combination of teacher, professor, instructor, and the like, and the second set of entities can include any or a combination of student, scholar, learner, observer, and the like.
[0036] In an embodiment, the image recognition unit 102 can be configured to scan one or more circuits, where the one or more circuits are configured to accommodate one or more components 108, and correspondingly generate a first set of signals. In an illustrative embodiment, the image recognition unit 102 can include any or a combination of scanner, camera, and the like. The first set of signals generated by the image recognition unit 102 can be in electrical form. The first set of signals can be transmitted to the AR engine 104 in electrical form. In another illustrative embodiment, the one or more components 108 can include any or a combination of resistor, light emitting diode, thermistor, sensor, and the like.
[0037] In an embodiment, the AR engine 104 can include one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors. The AR engine 104 can be configured to extract a second set of signals from the first set of signals, where the second set of signals can pertain to the scanned one or more components 108 of the one or more circuits. In another embodiment, the AR engine 104 can be configured to identify the scanned one or more components 108 from the one or more circuits. In yet another embodiment, the AR engine 104 can be configured to match the one or more components 108 of the one or more circuits with a dataset, where the dataset can include predetermined circuits, and correspondingly generate a third set of signals.
[0038] In an illustrative embodiment, the AR engine 104 can be configured to receive the first set of signals in electrical form and convert the first set of signals into machine-readable or binary form. In another illustrative embodiment, the AR engine 104 can include any or a combination of a microprocessor, a microcontroller, an Arduino Uno, an ATmega328, and other similar processing units, but is not limited thereto. In yet another illustrative embodiment, the third set of signals generated by the AR engine 104 can be transmitted to the display unit 106 in machine-readable form.
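Purely for illustration, and not as part of the claimed subject matter, the extraction of the second set of signals described above can be sketched in Python. All names and data shapes below (the detection dictionary, ComponentSignal) are invented for this sketch and are not prescribed by the present disclosure:

```python
# Illustrative sketch only: model the 'first set of signals' from the image
# recognition unit as a dictionary of detections, and the 'second set of
# signals' as per-component records in machine-readable form.

from dataclasses import dataclass

@dataclass
class ComponentSignal:
    """One entry of the 'second set of signals': a detected component."""
    label: str        # e.g. "resistor", "led" (hypothetical labels)
    position: tuple   # (x, y) location on the scanned circuit

def extract_component_signals(first_set):
    """Convert the scanned 'first set of signals' into per-component
    signals (the 'second set'), here modeled as labeled detections."""
    return [ComponentSignal(label=d["type"], position=d["at"])
            for d in first_set.get("detections", [])]

# Example 'first set of signals' as the image recognition unit might emit it.
first_set = {"detections": [{"type": "resistor", "at": (12, 40)},
                            {"type": "led", "at": (55, 10)}]}
second_set = extract_component_signals(first_set)
print([s.label for s in second_set])  # ['resistor', 'led']
```

The sketch assumes the scanner emits structured detections; an actual embodiment could instead perform marker or template recognition on raw image frames.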
[0039] In an embodiment, the display unit 106 can be configured to display an AR view of the function of the one or more components 108 based on the third set of signals. In an illustrative embodiment, the display unit 106 can be associated with a mobile computing unit, where the mobile computing unit can be related to the first set of entities. In another illustrative embodiment, the first set of entities can include any or a combination of teacher, instructor, professor, and the like.
[0040] In an embodiment, the AR engine 104 can be communicatively coupled with the one or more mobile computing devices through a communication module, and where the displayed function of the one or more components of at least one of the one or more circuits can be transmitted to the one or more mobile computing devices. In another embodiment, the one or more mobile computing devices can be associated with the second set of entities. In an illustrative embodiment, the one or more circuits can include any or a combination of embedded circuit, integrated circuit, and the like.
[0041] FIG. 2 illustrates exemplary functional components of an augmented reality (AR) engine of the proposed augmented reality (AR) assisted learning system, in accordance with an embodiment of the present disclosure.
[0042] As illustrated in FIG. 2, in an embodiment, the AR engine 104 can include one or more processor(s) 202. The one or more processor(s) 202 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 204 of the AR engine 104. The memory 204 can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 204 can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0043] In an embodiment, the AR engine 104 can also include an interface(s) 206. The interface(s) 206 may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may facilitate communication of the AR engine 104 with various devices coupled to the AR engine 104. The interface(s) 206 may also provide a communication pathway for one or more components of AR engine 104. Examples of such components include, but are not limited to, processing engine(s) 208 and data 210.
[0044] In an embodiment, the processing engine(s) 208 can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 208. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 208 may be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 208 may include a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 208. In such examples, the AR engine 104 can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to AR engine 104 and the processing resource. In other examples, the processing engine(s) 208 may be implemented by electronic circuitry. A database 210 can include data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 208.
[0045] In an embodiment, the processing engine(s) 208 can include an extraction unit 212, an identification unit 214, a matching unit 216, and other unit (s) 218. The other unit(s) 218 can implement functionalities that supplement applications or functions performed by the system 100 or the processing engine(s) 208.
[0046] It would be appreciated that the units being described are only exemplary units and any other unit or sub-unit may be included as part of the system 100. These units too may be merged or divided into super-units or sub-units as may be configured.
[0047] As illustrated in FIG. 2, the AR engine 104 can be configured to receive a first set of signals from an image recognition unit 102 in electrical form. The AR engine 104 can be configured to extract a second set of signals from the first set of signals with the help of the extraction unit 212, where the second set of signals can pertain to the scanned one or more components 108 of the one or more circuits. In an embodiment, the AR engine 104 can be configured to identify the scanned one or more components from at least one of the one or more circuits with the help of the identification unit 214. In another embodiment, the AR engine 104 can be configured to match the one or more components 108 of the at least one of the one or more circuits with a dataset with the help of the matching unit 216, where the dataset can include predetermined one or more circuits, and correspondingly generate a third set of signals.
[0048] In an embodiment, the extraction unit 212 can be configured to receive the first set of signals in electrical form, convert the first set of signals into machine-readable form, and extract the second set of signals in machine-readable form, where the second set of signals can pertain to the scanned one or more components 108 of the one or more circuits, and where the one or more components 108 of the one or more circuits can be scanned through an image recognition unit 102. In an illustrative embodiment, the extraction unit 212 can be configured to transmit the extracted second set of signals to the identification unit 214.
[0049] In an embodiment, the identification unit 214 can receive the second set of signals in machine-readable form and identify the scanned one or more components from at least one of the one or more circuits. The identification unit 214, after identifying the one or more components 108 from the at least one of the one or more circuits, can transmit the identified one or more components 108 to the matching unit 216 in machine-readable form.
[0050] In an embodiment, the identified one or more components 108 from the at least one of the one or more circuits can be received by the matching unit 216 in machine-readable form. The matching unit 216 can be configured to match the one or more components 108 of the at least one of the one or more circuits with a dataset, where the dataset includes predetermined one or more circuits, and correspondingly generate a third set of signals. The predetermined one or more circuits can be stored in the database 210. In another embodiment, the matching unit 216 can be configured to generate the third set of signals in machine-readable form and transmit the third set of signals to a display unit 106.
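As an illustrative, non-limiting sketch of the matching step described above, the matching unit's behavior can be modeled as checking the identified components against a small dataset of predetermined circuits and emitting, as the third set of signals, the AR view the display unit should render. The dataset contents and field names below are invented for this example:

```python
# Hypothetical dataset of predetermined circuits, as might be stored in the
# database 210. The circuit names and AR view identifiers are invented.
PREDETERMINED_CIRCUITS = {
    "blink_demo": {"components": {"resistor", "led"},
                   "ar_view": "led_current_flow_animation"},
    "temp_probe": {"components": {"thermistor", "resistor"},
                   "ar_view": "thermistor_response_curve"},
}

def match_components(identified, dataset=PREDETERMINED_CIRCUITS):
    """Return the 'third set of signals': for each predetermined circuit whose
    component set is fully covered by the identified components, the AR view
    that the display unit should render."""
    found = set(identified)
    return [{"circuit": name, "ar_view": entry["ar_view"]}
            for name, entry in dataset.items()
            if entry["components"] <= found]

third_set = match_components(["led", "resistor"])
print(third_set)  # [{'circuit': 'blink_demo', 'ar_view': 'led_current_flow_animation'}]
```

A production embodiment might instead match against feature descriptors or AR marker identifiers rather than plain labels; the set-containment check here merely illustrates the "match, then generate a third set of signals" flow.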
[0052] FIG. 3 illustrates an exemplary view of the proposed augmented reality (AR) assisted learning system, in accordance with an embodiment of the present disclosure.
[0053] As illustrated in FIG. 3, the system 100 can include a display unit 106, an image recognition unit 102, such as a camera 102-1 and an augmented reality (AR) marker 102-2, and a first set of entities, such as a user 302. In an embodiment, the image recognition unit 102 can be configured to scan one or more circuits, where the one or more circuits can be configured to accommodate one or more components, and correspondingly generate a first set of signals. The image recognition unit 102 can be operatively coupled with an AR engine 104, where the AR engine can be configured to extract a second set of signals from the first set of signals, identify the scanned one or more components 108 from at least one of the one or more circuits, match the one or more components of the at least one of the one or more circuits with a dataset, and correspondingly generate a third set of signals. The third set of signals can be transmitted to the display unit 106.
[0054] In an embodiment, the display unit 106 can be operatively coupled with the AR engine 104 and configured to display an AR view of the function of the one or more components 108 based on the third set of signals. In an illustrative embodiment, the one or more components 108 can include any or a combination of resistor, light emitting diode, thermistor, sensor, and the like. In another illustrative embodiment, the one or more circuits can include any or a combination of embedded circuit, integrated circuit, and the like.
[0055] In an embodiment, the display unit 106 can be associated with a mobile computing unit, where the mobile computing unit can be related to a first set of entities, where the first set of entities can include any or a combination of teacher, instructor, professor, and the like. In another embodiment, the AR engine 104 can be communicatively coupled with the one or more mobile computing devices through a communication module, and where the displayed function of the one or more components 108 of at least one of the one or more circuits can be transmitted to the one or more mobile computing devices.
[0056] In an illustrative embodiment, the one or more mobile computing devices can be associated with a second set of entities, and where the second set of entities can include any or a combination of student, learner, pupil, scholar, observer, and the like. In another illustrative embodiment, the system 100 can include a power source operatively coupled with the display unit 106 and the image recognition unit 102, where the power source can be configured to supply electric power to the system 100. In yet another illustrative embodiment, the power source can include any or a combination of battery, cell, inductor, power line, and the like.
[0057] In an illustrative embodiment, the system 100 can facilitate AR assisted learning for the second set of entities through the first set of entities. The second set of entities can receive the displayed function of the one or more components 108 from the at least one of the one or more circuits, with the help of the one or more mobile computing devices through the communication module. The communication module can include any or a combination of Wireless Fidelity (Wi-Fi) module, Bluetooth module, Li-Fi module, optical fiber, Wireless Local Area Network (WLAN), ZigBee module, and the like.
[0058] In an illustrative embodiment, the system 100 can help the second set of entities in understanding the one or more components of the one or more circuits efficiently. The displayed function of the one or more components 108 of the at least one of the one or more circuits through the one or more mobile computing devices can enable virtual learning. In another illustrative embodiment, the system 100 can facilitate visualizing the functions of the one or more components 108 and the internal operations of the one or more circuits.
[0059] FIG. 4 illustrates a flow diagram illustrating a proposed augmented reality (AR) assisted learning method, in accordance with embodiments of the present disclosure.
[0060] As illustrated in FIG. 4, the method can include a step 402 of scanning, by an image recognition unit, one or more circuits, wherein the one or more circuits are configured to accommodate one or more components, and the image recognition unit correspondingly generates a first set of signals.
[0061] In an embodiment, the method can include a step 404 of extracting, by an augmented reality (AR) engine, a second set of signals from the first set of signals , wherein the second set of signals pertain to scanned one or more components of the one or more circuits.
[0062] In an embodiment, the method can include a step 406 of identifying, by the AR engine, the scanned one or more components from the one or more circuits.
[0063] In an embodiment, the method can include a step 408 of matching, by the AR engine, the one or more components of the one or more circuits with a dataset, wherein the dataset includes predetermined circuits, and correspondingly generating a third set of signals.
[0064] In an embodiment, the method can include a step 410 of displaying, by the display unit, an AR view of function of the one or more components based on the third set of signals at a display unit.
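The method steps 402 to 410 above can, purely for illustration, be strung together as a single pipeline. Each function below stands in for hardware or logic (camera, component dataset, AR renderer) that the present disclosure leaves unspecified, and all names are invented:

```python
# Illustrative end-to-end sketch of the claimed method (steps 402-410).
# Every function body is a stand-in; real embodiments would drive a camera,
# an image-recognition model, and an AR display.

def scan_circuit():                      # step 402: image recognition unit
    return {"raw": [("resistor", 1), ("led", 2)]}

def extract(first_set):                  # step 404: extraction of second set
    return [name for name, _ in first_set["raw"]]

def identify(second_set):                # step 406: identification
    known = {"resistor", "led", "thermistor", "sensor"}
    return [c for c in second_set if c in known]

def match(components):                   # step 408: matching against dataset
    dataset = {"blink_demo": {"resistor", "led"}}  # predetermined circuits
    return [name for name, parts in dataset.items()
            if parts <= set(components)]

def display(third_set):                  # step 410: display unit renders AR view
    return [f"AR view: {circuit}" for circuit in third_set]

result = display(match(identify(extract(scan_circuit()))))
print(result)  # ['AR view: blink_demo']
```

The chained call mirrors the flow diagram of FIG. 4: each step consumes the signals produced by the previous one.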
[0065] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular name.
[0066] While embodiments of the present invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the invention, as described in the claim.
[0067] In the foregoing description, numerous details are set forth. It will be apparent, however, to one of ordinary skill in the art having the benefit of this disclosure, that the present invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, to avoid obscuring the present invention.
[0068] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document terms "coupled to" and "coupled with" are also used euphemistically to mean “communicatively coupled with” over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary device.
[0069] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced.
[0070] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0071] The present disclosure provides a system that facilitates increasing the self-learning capability of students and other interested entities.
[0072] The present disclosure provides a system that helps in giving an interactive environment for learning.
[0073] The present disclosure provides a system with a three-dimensional generated visualization of different components, such as a resistor, a sensor, a light emitting diode, and the like, embedded on a development board.
[0074] The present disclosure provides a system that enables self-assessment for students using quick knowledge check quizzes.
[0075] The present disclosure provides a system where students can analyze things in a better way.
[0076] The present disclosure provides a system that fosters the learning process of students.
Claims:
1. An augmented reality assisted learning system (100) comprising:
an image recognition unit (102) configured to scan one or more circuits, wherein the one or more circuits are configured to accommodate one or more components (108), and correspondingly generate a first set of signals;
an augmented reality (AR) engine (104) operatively coupled to the image recognition unit (102), wherein the AR engine (104) includes one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors to:
extract a second set of signals from the first set of signals, wherein the second set of signals pertain to scanned one or more components (108) of the one or more circuits;
identify the scanned one or more components (108) from at least one of the one or more circuits;
match the one or more components (108) of the at least one of the one or more circuits with a dataset, wherein the dataset includes predetermined one or more circuits, and correspondingly generate a third set of signals; and
a display unit (106) operatively coupled with the AR engine (104) and configured to display an AR view of the function of the one or more components (108) based on the third set of signals.
2. The system (100) as claimed in claim 1, wherein the image recognition unit (102) includes any or a combination of camera, and scanner.
3. The system (100) as claimed in claim 1, wherein the one or more components (108) include any or a combination of resistor, light emitting diode, thermistor, and sensor.
4. The system (100) as claimed in claim 1, wherein the display unit (106) is associated with a mobile computing unit, wherein the mobile computing unit is related to a first set of entities, wherein the first set of entities include any or a combination of teacher, instructor, and professor.
5. The system (100) as claimed in claim 1, wherein the AR engine (104) is communicatively coupled with the one or more mobile computing devices through a communication module, and wherein the displayed function of the one or more components (108) of at least one of the one or more circuits are transmitted to the one or more mobile computing devices.
6. The system (100) as claimed in claim 5, wherein the one or more mobile computing devices are associated with a second set of entities, and wherein the second set of entities include any or a combination of student, learner, pupil, scholar and observer.
7. The system (100) as claimed in claim 1, wherein the one or more circuits include any or a combination of embedded circuit, and integrated circuit.
8. The system (100) as claimed in claim 1, wherein the system (100) includes a power source operatively coupled with the display unit (106) and the image recognition unit (102), wherein the power source is configured to supply electric power to the system (100).
9. The system (100) as claimed in claim 8, wherein the power source includes any or a combination of battery, cell, inductor, and power line.
10. An augmented reality assisted learning method comprising steps of:
scanning, by an image recognition unit (102), one or more circuits, wherein the one or more circuits are configured to accommodate one or more components (108), and the image recognition unit correspondingly generates a first set of signals;
extracting, by an augmented reality (AR) engine (104), a second set of signals from the first set of signals, wherein the second set of signals pertain to scanned one or more components (108) of the one or more circuits;
identifying, by the AR engine (104), the scanned one or more components (108) from the one or more circuits;
matching, by the AR engine (104), the one or more components (108) of the one or more circuits with a dataset, wherein the dataset includes predetermined circuits, and correspondingly generating a third set of signals; and
displaying, by the display unit (106), an AR view of the function of the one or more components (108) based on the third set of signals.
| # | Name | Date |
|---|---|---|
| 1 | 202011040829-Proof of Right [08-10-2020(online)].pdf | 2020-10-08 |
| 2 | 202011040829-FORM-26 [08-10-2020(online)].pdf | 2020-10-08 |
| 3 | 202011040829-FORM 18 [21-06-2022(online)].pdf | 2022-06-21 |
| 4 | 202011040829-Response to office action [26-09-2022(online)].pdf | 2022-09-26 |
| 5 | 202011040829-FER.pdf | 2022-10-06 |
| 6 | 202011040829-FER_SER_REPLY [16-03-2023(online)].pdf | 2023-03-16 |
| 7 | 202011040829-CORRESPONDENCE [16-03-2023(online)].pdf | 2023-03-16 |
| 8 | 202011040829-CLAIMS [16-03-2023(online)].pdf | 2023-03-16 |
| 1 | SearchHistoryE_28-09-2022.pdf | |