
Augmented Reality System For Stability Analysis Of Control Systems

Abstract: An augmented reality system (100) for stability analysis of a control system, comprising: a user device (102) used by learners to select an input; a graph marker (110) representing an s-plane having movable markers (112); a camera (114) to capture images of the movable markers (112); a processing unit (128) configured to: draw a plot corresponding to time domain stability analysis or frequency domain stability analysis on the graph marker (110) based on the selected input; receive the captured images from the camera (114); compare the captured images with image markers to identify a location of the movable markers (112); generate virtual content to be overlaid on real content; and display augmented content representing the stability analysis of the control system on the user device (102); and sensors (124) adapted to capture a motion of the movable markers (112) within an area of interest when the camera (114) fails to capture vision-based inputs.


Patent Information

Application #: 202211000763
Filing Date: 06 January 2022
Publication Number: 27/2023
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

Chitkara Innovation Incubator Foundation
SCO 160-161, Sector 9C, Madhya Marg, Chandigarh - 160009, India. Email: sachin.ahuja@chitkara.edu.in. Mobile: 9217730035

Inventors

1. Deepti Prit Kaur
Chitkara University, Punjab
2. Archana Mantri
Chitkara University, Punjab
3. Narinder Pal Singh
Chitkara University, Punjab
4. Priyanka Malhotra
Chitkara University, Punjab
5. Harsimranjit Kaur
Chitkara University, Punjab

Specification

Embodiments of the present invention generally relate to an augmented reality system and particularly to an augmented reality system for stability analysis of control systems.
Description of Related Art
[002] Augmented Reality (AR) is a computer graphics technique that blends virtual objects or information into a real-world environment, giving the impression that the virtual content is present in the actual world. AR has been used extensively to help students learn difficult concepts through interactive visualization. In the engineering domain specifically, analysis of control systems is a foremost requirement. Existing systems let students model, simulate, and perform computations for linear/non-linear complex dynamic systems. However, none of these systems provides interactive visualization in real time. Moreover, all the existing systems require licensed software with dedicated hardware and consume considerable hard-disk memory.
[003] To overcome the aforementioned issues, various marker-based Augmented Reality Learning Environments (ARLEs) are available that rely on vision tracking methods, such as cameras that capture a real scene. The real scene comprises stationary and/or movable markers, which are identified through computer vision algorithms. However, identifying the correct markers can become difficult in the case of identical/similar markers or due to occlusions, making it hard to recognize the actual feature on which virtual data was intended to be overlaid. Further, various marker-less techniques are also available for object/scene identification in AR-based systems; however, these require complex algorithms for feature detection and are not suitable for application in the education field.
[004] There is thus a need for an advanced and more effective augmented reality system that can perform stability analysis of the control systems in a more efficient manner.
SUMMARY
[005] Embodiments in accordance with the present invention provide an augmented reality system for stability analysis of a control system. The system comprises: a user device used by learners to provide an input selected from a time domain stability analysis, a frequency domain stability analysis, or a combination thereof through an Augmented Reality (AR) application. The system further comprises: a graph marker mounted on a board, such that the graph marker represents an s-plane having movable markers such that the movable markers are poles of the control system. The system further comprises: a camera connected to a camera stand of a pre-defined height, to capture images of the movable markers located on the graph marker. The system further comprises: a processing unit configured to: draw a plot corresponding to the time domain stability analysis or the frequency domain stability analysis on the graph marker based on the provided input; receive the captured images of the movable markers located on the drawn plot of the control system from the camera; compare the captured images of the movable markers with image markers stored in a database to identify a location of the movable markers in the control system; generate a virtual content to be overlaid on a real content as captured by the camera based on the identified location of the movable markers; and display augmented content on the user device such that the augmented content represents the stability analysis of the control system.
[006] Embodiments of the present invention may provide a number of advantages depending on the particular configuration. First, embodiments of the present application may provide an augmented reality system that relies on sensor inputs to ensure accurate, uninterrupted working of the system even when vision-based inputs fail due to identical multiple markers or occlusions.
[007] Next, embodiments of the present invention may provide an augmented reality system that is capable of identifying features accurately from real-world inputs for analyzing a control system.
[008] Next, embodiments of the present invention may provide an augmented reality system that enables a learner to achieve interactive visualization for understanding complicated system concepts at low cost.
[009] Next, embodiments of the present invention may provide an augmented reality system that enables a learner to use a hybrid AR-based application, without requiring any dedicated system hardware set-up.
[0010] These and other advantages will be apparent from the present application of the embodiments described herein.
[0011] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0013] FIG. 1 illustrates an augmented reality system for stability analysis of a control system, according to an embodiment of the present invention;
[0014] FIG. 2A illustrates components of an Augmented Reality (AR) platform of the augmented reality system for stability analysis of a control system, according to an embodiment of the present invention;
[0015] FIG. 2B illustrates components of a control unit of the augmented reality system for stability analysis of a control system, according to an embodiment of the present invention; and
[0016] FIG. 3 depicts a flowchart of a method for displaying real physical data with augmented information by using the augmented reality system, according to an embodiment of the present invention.
[0017] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words "include", "including", and "includes" mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0019] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0020] As used herein, the singular forms "a", "an", and "the" designate both the singular and the plural, unless expressly stated to designate the singular only.
[0021] FIG. 1 illustrates an augmented reality system 100 (hereinafter referred to as the system 100) for stability analysis of a control system, according to an embodiment of the present invention. In an embodiment of the present invention, the control system may be an open loop control system, a closed loop control system, and so forth. In an embodiment of the present invention, the system 100 may be configured to enable learners to achieve interactive visualization for understanding complicated concepts of electronics engineering at low cost. In an embodiment of the present invention, the system 100 may also be configured to enhance the learning experience of the learners, as the system 100 influences the spatial ability of the learners for real-time visualization.

[0022] According to embodiments of the present invention, the system 100 may comprise a user device 102 that may be used by the learners to select an input for performing the stability analysis of the control system. In an embodiment of the present invention, the input may be selected from a time domain stability analysis, a frequency domain stability analysis, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of input selected by the user for performing the stability analysis. As used herein, the term "time domain stability analysis" refers to an analysis of the control system that involves defining the input, output, and other variables of the control system as a function of time. In an embodiment of the present invention, the time domain stability analysis may be performed by various time domain analysis techniques such as, but not limited to, the Routh-Hurwitz criterion, a root locus plot, and so forth. In a preferred embodiment of the present invention, the time domain analysis technique may be a pole-zero plot. Embodiments of the present invention are intended to include or otherwise cover any type of time domain analysis technique, including known related art and/or later developed technologies.
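The time domain check described above reduces to inspecting where the system's poles sit in the s-plane. A minimal sketch, not taken from the specification, of classifying a transfer function from its characteristic-polynomial coefficients into the three cases the system distinguishes:

```python
import numpy as np

def classify_stability(den_coeffs, tol=1e-9):
    """Classify a control system from its characteristic-polynomial
    coefficients (highest power of s first). The labels mirror the
    three cases the system distinguishes: stable, unstable, neutral."""
    poles = np.roots(den_coeffs)        # pole locations in the s-plane
    if np.all(poles.real < -tol):
        return "stable"                 # every pole in the left half-plane
    if np.any(poles.real > tol):
        return "unstable"               # at least one right-half-plane pole
    return "neutral"                    # poles on the imaginary axis

# G(s) = 1/(s^2 + 3s + 2): poles at s = -1 and s = -2
print(classify_stability([1, 3, 2]))   # -> stable
# G(s) = 1/(s^2 + 4): poles at s = +/- 2j
print(classify_stability([1, 0, 4]))   # -> neutral
# G(s) = 1/(s - 1): pole at s = +1
print(classify_stability([1, -1]))     # -> unstable
```

The tolerance `tol` guards against floating-point noise when deciding whether a pole lies exactly on the imaginary axis.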
[0023] Further, as used herein, the term "frequency domain stability analysis" refers to an analysis of the control system where a sinusoidal signal is given as an input to the control system with different frequencies and response of an output is determined at different frequencies. In an embodiment of the present invention, the frequency domain stability analysis may be performed by various frequency domain analysis techniques such as, but not limited to, a Bode plot, a Polar Plot, and so forth. In a preferred embodiment of the present invention, the frequency domain analysis techniques may be a Nyquist plot. Embodiments of the present invention are intended to include or otherwise cover any type of the frequency domain analysis techniques including known related art and/or later developed technologies.
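The Nyquist test sketched below illustrates the frequency domain criterion used later in the specification: sweep G(jω) and count encirclements of the critical point −1 + j0. The plant G(s) = K/(s+1)³ and the sweep range are illustrative assumptions, not taken from the specification:

```python
import numpy as np

def encirclements_of_critical_point(K, w_max=100.0, n=400_001):
    """Count how often the Nyquist plot of G(s) = K/(s+1)^3 encircles the
    critical point -1 + j0, by tracking the unwrapped angle of the vector
    from -1 to G(jw) as w sweeps from -w_max to +w_max."""
    w = np.linspace(-w_max, w_max, n)
    G = K / (1j * w + 1) ** 3                  # open-loop frequency response
    angles = np.unwrap(np.angle(G - (-1.0)))   # angle seen from -1 + j0
    return abs(round((angles[-1] - angles[0]) / (2 * np.pi)))

# The closed loop of K/(s+1)^3 is stable for K < 8 (the Nyquist curve
# crosses the negative real axis at -K/8), so -1 is not encircled:
print(encirclements_of_critical_point(2))    # -> 0
# For K = 10 the curve crosses at -1.25 and encircles -1 twice:
print(encirclements_of_critical_point(10))   # -> 2
```

With no open-loop right-half-plane poles, a nonzero encirclement count signals closed-loop instability, matching the criterion stated in paragraph [0028].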

[0024] Further, in an embodiment of the present invention, the user device 102 may also enable the learners to provide topics associated with the control system. The user device 102 may also enable the learners to view real physical data with augmented information on a display of the user device 102. In an embodiment of the present invention, the user device 102 may be, but not limited to, a laptop, a mobile phone, a smart phone, a tablet, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the user device 102 including known related art and/or later developed technologies.
[0025] According to an embodiment of the present invention, the user device 102 may comprise software applications such as, but not limited to, a navigation application, a camera application, a media player application, a social networking application, and the like. In a preferred embodiment of the present invention, the user device 102 may comprise an Augmented Reality (AR) application 104 that may be a computer readable program installed on the user device 102 for enabling the learners to select the input for performing the stability analysis of the control system.
[0026] Further, the user device 102 may comprise a user interface 106 configured to enable the learners to interact with the AR application 104 installed within the user device 102, according to an embodiment of the present invention. The user interface 106 may be, but not limited to, a digital display, a touch screen display, a graphical user interface, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the user interface 106 including known, related art, and/or later developed technologies.
[0027] Further, the system 100 may comprise a board 108 that may be adapted to provide support to components of the system 100, in an embodiment of the present invention. The components may be, but are not limited to, a graph marker 110, movable markers 112, and so forth. In a preferred embodiment of the present invention, the board 108 may be made of a material such as wood. Embodiments of the present invention are intended to include or otherwise cover any type of material of the board 108, including known, related art, and/or later developed technologies.
[0028] Further, in an embodiment of the present invention, the graph marker 110 may represent an s-plane on which poles may be located based on a transfer function provided for the control system under consideration. In an embodiment of the present invention, the s-plane may be a plane with the x-axis as the real axis and the y-axis as the imaginary axis. In an embodiment of the present invention, the pole-zero plot may be drawn on the s-plane when the selection for the time domain stability analysis is made by the learner. In another embodiment of the present invention, the Nyquist plot may be drawn from pole locations obtained from the transfer function, when the selection for the frequency domain stability analysis is made by the learner. In such an embodiment of the present invention, the Nyquist plot may be drawn to find the stability of the closed loop control system using the Nyquist stability criterion. In an exemplary scenario, if the critical point (-1 + j0) lies outside an encirclement, then the closed loop control system is absolutely stable.
[0029] Further, the movable markers 112 may be X-shaped movable targets that may represent the poles of the control system, in an embodiment of the present invention. In an embodiment of the present invention, the poles of the control system may determine the stability and other properties of the control system based on a location on the graph marker 110.
[0030] Further, in an embodiment of the present invention, the system 100 may further comprise a camera 114 that may be connected to a camera stand 116. The camera stand 116 may be affixed to the board 108, in an embodiment of the present invention. In an embodiment of the present invention, the camera 114 may be installed at a pre-defined height and at a pre-defined angle on the camera stand 116, so that the camera 114 may be able to capture images of the movable markers 112 located on the graph marker 110. In an embodiment of the present invention, the camera stand 116 may be provided with an adjustable knob 118 to enable a user to adjust an angle of the camera 114. Further, the camera stand 116 may be provided with an adjustment guide 120 to enable the user to adjust a height of the camera stand 116, which in turn adjusts a height of the camera 114. Further, in an embodiment of the present invention, the camera stand 116 may be provided with an adjustable camera holder 122 to hold the camera 114 at a required angle.
[0031] According to embodiments of the present invention, the camera 114 may be, but is not limited to, a Closed-Circuit Television (CCTV) camera, a still camera, a video camera, a color balancer camera, a thermal camera, an infrared camera, a telephoto camera, a wide-angle camera, a macro camera, and so forth. In a preferred embodiment of the present invention, the camera 114 may be a Universal Serial Bus (USB) camera. Embodiments of the present invention are intended to include or otherwise cover any type of camera 114, including known, related art, and/or later developed technologies. Further, in an embodiment of the present invention, the camera 114 may be configured to transmit the captured images of the movable markers 112 to the AR application 104 installed within the user device 102.
[0032] Further, the system 100 may comprise sensors 124 that may be mounted on the board 108 to capture a motion of the movable markers 112 within an area of interest, in an embodiment of the present invention. The sensors 124 may be configured to capture the motion of the movable markers 112 when the camera 114 fails to capture vision inputs due to various issues such as, but not limited to, identical multiple markers, occlusions, and so forth. The sensors 124 may be configured to transmit the captured motion of the movable markers 112 to a control unit 130, in an embodiment of the present invention. In a preferred embodiment of the present invention, the sensors 124 may be Light Dependent Resistors (LDRs). Embodiments of the present invention are intended to include or otherwise cover any type of sensors 124, including known, related art, and/or later developed technologies.
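One way the LDR fallback could work is that a marker placed on the board shades the sensor beneath it, dropping its reading. The sketch below is a hypothetical illustration; the grid layout, threshold, and reading scale are assumptions, not taken from the specification:

```python
def locate_covered_cells(ldr_readings, threshold=300):
    """Hypothetical sensor-based fallback: an X-shaped movable marker
    placed on the board shades the LDR beneath it, dropping that sensor's
    reading below `threshold`. `ldr_readings` maps (real, imaginary) grid
    cells of the s-plane to raw readings; returns the cells judged to be
    covered by markers."""
    return sorted(cell for cell, value in ldr_readings.items()
                  if value < threshold)

# A marker over the cell at s = -2 + 1j shades that sensor:
readings = {(-2, 1): 120, (0, 0): 810, (1, 3): 795}
print(locate_covered_cells(readings))   # -> [(-2, 1)]
```

Because the decision is a simple per-cell threshold, this path needs no camera and keeps working under the occlusion and identical-marker conditions described above.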
[0033] In an embodiment of the present invention, the system 100 may comprise an Augmented Reality (AR) platform 126 that may be connected to the user device 102 through a communication network (not shown). According to an embodiment of the present invention, the communication network may be a data network such as, but not limited to, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the data network, including known, related art, and/or later developed technologies. In some embodiments of the present invention, the communication network may be a wireless network, such as, but not limited to, a cellular network and may employ various technologies including an Enhanced Data Rates for Global Evolution (EDGE), a General Packet Radio Service (GPRS), and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the wireless network, including known, related art, and/or later developed technologies.
[0034] In an embodiment of the present invention, the AR platform 126 may be computer readable instructions of the AR application 104 that may be stored in a memory (not shown) and configured to control operations of the system 100. In an embodiment of the present invention, the computer readable instructions may be an image processing algorithm that performs vision-based tracking. The image processing algorithm may be executed by a processing unit 128 of the AR platform 126 for identifying the location of the movable markers 112 in the control system. Further, the working of the AR platform 126 will be explained in detail in conjunction with FIG. 2A.
[0035] Further, the processing unit 128 may be configured to execute computer executable instructions stored in the memory to identify the location of the movable markers 112 in the control system. The processing unit 128 may also be configured to execute the computer executable instructions for performing the stability analysis of the control system based on the identified location of the movable markers 112. The processing unit 128 may be, but not limited to, a Programmable Logic Control unit (PLC), a microprocessor, a computing device, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 128 including known, related art, and/or later developed technologies.
[0036] Further, in an embodiment of the present invention, the control unit 130 may be embedded within the board 108. In another embodiment of the present invention, the control unit 130 may be connected to the sensors 124 mounted on the board 108. The control unit 130 may be actuated based on a signal received from the sensors 124, in an embodiment of the present invention. The signal may be the captured motion of the movable markers 112. The control unit 130 may be configured to execute computer readable instructions stored in the memory to identify the location of the movable markers 112, in an embodiment of the present invention. In an embodiment of the present invention, the computer readable instructions may be a computer vision algorithm that may be a sensor-based tracking.
[0037] The control unit 130 may be, but is not limited to, a microprocessor, a development board, a digital signal processor, and the like. In a preferred embodiment of the present invention, the control unit 130 may be a microcontroller. Embodiments of the present invention are intended to include or otherwise cover any type of control unit 130, including known, related art, and/or later developed technologies. Further, the working of the control unit 130 will be explained in detail in conjunction with FIG. 2B.
[0038] FIG. 2A illustrates components of the AR platform 126 of the system 100, according to an embodiment of the present invention. The components may be, but not limited to, an input receiving module 200, a plotting module 202, an image receiving module 204, a comparison module 206, and a rendering module 208.
[0039] The input receiving module 200 may be configured to receive the input selected by the learners through the AR application 104 from the user device 102. The input receiving module 200 may also be configured to receive the topics associated with the control system from the user device 102. Further, the input receiving module 200 may be configured to transmit the received input and the topics associated with the control system to the plotting module 202.
[0040] The plotting module 202 may be configured to draw a plot based on the input selected by the learners, in an embodiment of the present invention. In an embodiment of the present invention, the plotting module 202 may be configured to draw the Nyquist plot from the pole locations when the frequency domain stability analysis is selected as the input by the learners. In such an embodiment of the present invention, the Nyquist plot may serve as the real scene. In another embodiment of the present invention, the plotting module 202 may be configured to draw the pole-zero plot when the time domain stability analysis is selected as the input by the learners. In such an embodiment of the present invention, the pole-zero plot may serve as the real scene.
[0041] Further, in an embodiment of the present invention, the image receiving module 204 may be configured to receive the captured images of the movable markers 112 located on the drawn plot from the camera 114. In an embodiment of the present invention, the image receiving module 204 may be configured to transmit the received captured images of the movable markers 112 to the comparison module 206.
[0042] The comparison module 206 may be configured to compare the captured images of the movable markers 112 with image targets and/or markers stored as three cases in a database. The three cases may be stable, unstable and neutral. The comparison module 206 may be configured to compare the captured images of the movable markers 112 with the image targets and/or markers to identify the location of the movable markers 112 in the control system. Further, the comparison module 206 may be configured to transmit the identified location of the movable markers 112 to the rendering module 208.
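The comparison step above can be sketched as scoring the captured marker image against each stored image target and keeping the best-scoring case. This is a hedged illustration: the specification does not name a matching algorithm, so normalised cross-correlation is used here as a stand-in, and the tiny 2x2 "targets" are placeholders for the real image targets in the database:

```python
import numpy as np

def best_matching_case(captured, targets):
    """Score the captured marker image against each stored image target
    by normalised cross-correlation and return the best case label.
    Assumes all images are equal-sized grayscale arrays."""
    def ncc(a, b):
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float((a * b).mean())    # 1.0 = identical pattern
    return max(targets, key=lambda label: ncc(captured, targets[label]))

# Placeholder image targets for the three stored cases:
targets = {
    "stable":   np.array([[1.0, 0.0], [0.0, 1.0]]),
    "unstable": np.array([[0.0, 1.0], [1.0, 0.0]]),
    "neutral":  np.array([[1.0, 1.0], [0.0, 0.0]]),
}
captured = np.array([[0.9, 0.1], [0.1, 0.9]])   # noisy view of 'stable'
print(best_matching_case(captured, targets))    # -> stable
```

A production implementation would more likely use a library matcher such as OpenCV's template matching, but the scoring idea is the same.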
[0043] The rendering module 208 may be configured to generate a virtual content to be overlaid on a real content as captured by the camera 114 based on the identified location of the movable markers 112, in an embodiment of the present invention. The virtual content may be a control system stability that may be governed by the location of the movable markers 112. The rendering module 208 may be configured to display the augmented content on the user device 102, in an embodiment of the present invention. In an embodiment of the present invention, the augmented content may represent the stability analysis of the control system.
[0044] FIG. 2B illustrates components of the control unit 130 of the system 100, according to an embodiment of the present invention. The components may be, but not limited to, a data receiving module 210, a processing module 212, and an output module 214.
[0045] In an embodiment of the present invention, the data receiving module 210 may be configured to receive the captured motion of the movable markers 112 from the sensors 124. In an embodiment of the present invention, the data receiving module 210 may be configured to transmit the received captured motion of the movable markers 112 to the processing module 212.
[0046] The processing module 212 may be configured to identify the location of the movable markers 112 based on the captured motion of the movable markers 112 by using a computer vision technique. Further, the processing module 212 may be configured to transmit the identified location of the movable markers 112 to the output module 214, in an embodiment of the present invention.
[0047] The output module 214 may be configured to generate a virtual content to be overlaid on a real content as captured by the sensors 124 based on the identified location of the movable markers 112, in an embodiment of the present invention. The virtual content may be the control system stability that may be governed by the location of the movable markers 112. The output module 214 may be configured to display the augmented content on the user device 102. In an embodiment of the present invention, the augmented content may represent the stability analysis of the control system.
[0048] FIG. 3 depicts a flowchart of a method 300 for displaying the real physical data with augmented information by using the system 100, according to an embodiment of the present invention.
[0049] At step 302, the system 100 may enable the learners to select the topics associated with the control system through the AR application 104.
[0050] At step 304, the system 100 may actuate the camera 114 to capture the images of the drawn plot from the board 108.

[0051] At step 306, the system 100 may check if the graph marker 110 is detected by the camera 114. The method 300 may proceed to a step 308, if the graph marker 110 is not found. Otherwise, the method 300 may proceed to a step 310.
[0052] At the step 308, the system 100 may display an error message "target not found".
[0053] At the step 310, the system 100 may augment the graph marker 110 with the input selected by the learner. The input may be, the time domain stability analysis or the frequency domain stability analysis.
[0054] At step 312, the system 100 may check if the movable markers 112 are detected by the camera 114. The method 300 may proceed to a step 314, when the movable markers 112 are not detected by the camera 114. Otherwise, the method 300 may proceed to a step 316.
[0055] At the step 314, the system 100 may check if the motion of the movable markers 112 is captured by the sensors 124. The method 300 may return to the step 308, when the motion of the movable markers 112 is not captured by the sensors 124. Otherwise, the method 300 may proceed to the step 316.
[0056] At the step 316, the system 100 may check if the location of the movable markers 112 is identified or not. The method 300 may proceed to a step 318, when the location of the movable markers 112 is not identified by the system 100. Otherwise, the method 300 may proceed to a step 320.
[0057] At the step 318, the system 100 may enable the user to readjust the movable markers 112 on the plot of the control system.
[0058] At the step 320, the system 100 may overlay the virtual content on the real content as captured by the camera 114 and/or the sensors 124 based on the identified location of the movable markers 112.

[0059] At step 322, the system 100 may ask the learner if the learner wants to analyze more cases of the control system. The method 300 may proceed to a step 324, when the learner wants to analyze the more cases of the control system. Otherwise, the method 300 may conclude.
[0060] At the step 324, the system 100 may redirect the learner to a main menu to select a next topic associated with the control system.
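The camera-first, sensor-fallback decision at steps 312 and 314 of method 300 can be summarised in a minimal sketch; the boolean arguments stand in for the real detection calls, which are not specified here:

```python
def tracking_outcome(camera_sees_markers, sensors_capture_motion):
    """Sketch of steps 312-314: vision-based tracking is preferred,
    sensor-based tracking takes over when the camera fails, and the
    'target not found' error of step 308 is shown when both fail."""
    if camera_sees_markers:
        return "vision-based tracking"       # step 312 succeeds
    if sensors_capture_motion:
        return "sensor-based tracking"       # step 314 fallback
    return "target not found"                # return to step 308

print(tracking_outcome(True, False))    # camera detects the markers
print(tracking_outcome(False, True))    # occlusion: LDRs take over
print(tracking_outcome(False, False))   # both fail -> step 308 error
```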
[0061] Embodiments of the invention are described above with reference to block diagrams and schematic illustrations of methods and systems according to embodiments of the invention. While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
[0062] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

CLAIMS
I/We Claim:

1. An augmented reality system (100) for stability analysis of a control system, the system (100) comprising:
a user device (102) used by learners to provide an input selected from a time domain stability analysis, a frequency domain stability analysis, or a combination thereof through an Augmented Reality (AR) application (104);
a graph marker (110) mounted on a board (108), such that the graph marker (110) represents an s-plane having movable markers (112) such that the movable markers (112) are poles of the control system;
a camera (114) connected to a camera stand (116) of a pre-defined height, to capture images of the movable markers (112) located on the graph marker (110); and
a processing unit (128) configured to:
draw a plot corresponding to the time domain stability analysis or the frequency domain stability analysis on the graph marker (110) based on the provided input;
receive the captured images of the movable markers (112) located on the drawn plot of the control system from the camera (114);
compare the captured images of the movable markers (112) with image markers stored in a database to identify a location of the movable markers (112) in the control system;
generate a virtual content to be overlaid on a real content as captured by the camera (114) based on the identified location of the movable markers (112); and
display augmented content on the user device (102) such that the augmented content represents the stability analysis of the control system.
2. The system (100) as claimed in claim 1, wherein the time domain stability analysis is performed by a pole-zero plot technique.
3. The system (100) as claimed in claim 1, wherein the frequency domain stability analysis is performed by a Nyquist plot technique.
4. The system (100) as claimed in claim 1, wherein the user device (102) comprises a user interface (106) configured to enable the learners to interact with the Augmented Reality (AR) application (104) installed within the user device (102).
5. The system (100) as claimed in claim 1, wherein the camera stand (116) is provided with an adjustable knob (118) to enable a user to adjust an angle of the camera (114).
6. The system (100) as claimed in claim 1, wherein the camera stand (116) is provided with an adjustment guide (120) to enable a user to adjust a height of the camera stand (116).
7. The system (100) as claimed in claim 1, wherein the camera stand (116) is provided with an adjustable camera holder (122) to hold the camera (114) at a required angle.
8. The system (100) as claimed in claim 1, wherein the camera (114) is a Universal Serial Bus (USB) camera.
9. The system (100) as claimed in claim 1, further comprising sensors (124) mounted on the board (108), wherein the sensors (124) are

adapted to capture a motion of the movable markers (112) within an area of interest when the camera (114) fails to capture vision-based inputs.
10. The system (100) as claimed in claim 9, wherein the sensors (124) are Light Dependent Resistors (LDRs).

Documents

Application Documents

# Name Date
1 202211000763-STATEMENT OF UNDERTAKING (FORM 3) [06-01-2022(online)].pdf 2022-01-06
2 202211000763-FORM FOR STARTUP [06-01-2022(online)].pdf 2022-01-06
3 202211000763-FORM FOR SMALL ENTITY(FORM-28) [06-01-2022(online)].pdf 2022-01-06
4 202211000763-FORM 1 [06-01-2022(online)].pdf 2022-01-06
5 202211000763-FIGURE OF ABSTRACT [06-01-2022(online)].jpg 2022-01-06
6 202211000763-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [06-01-2022(online)].pdf 2022-01-06
7 202211000763-EVIDENCE FOR REGISTRATION UNDER SSI [06-01-2022(online)].pdf 2022-01-06
8 202211000763-DRAWINGS [06-01-2022(online)].pdf 2022-01-06
9 202211000763-DECLARATION OF INVENTORSHIP (FORM 5) [06-01-2022(online)].pdf 2022-01-06
10 202211000763-COMPLETE SPECIFICATION [06-01-2022(online)].pdf 2022-01-06
11 202211000763-Proof of Right [17-02-2022(online)].pdf 2022-02-17