
Wireless Stylus System For Teaching And Interacting With Virtual Interactive Screen On A Surface

Abstract: The present invention relates to a wireless stylus system designed for teaching and interacting with virtual interactive screens on a surface. The system comprises a stylus device, a sensor, a processor, and a projecting device. The stylus device is ergonomically designed and equipped with wireless capabilities for seamless communication with the virtual screen. A sensor within the stylus detects and captures the user's actions, including drawing, writing, and gesturing, on the virtual screen. A processor analyzes the movement patterns of the user's hand based on the detected actions, employing advanced algorithms for accurate interpretation. By mapping the movement patterns to a pre-defined list of contents, the processor determines the appropriate content to be projected onto the surface. A projecting device integrated into the stylus continuously projects the selected content, ensuring synchronized and immersive teaching and interaction experiences. The system enables intuitive and responsive experiences by accurately detecting user actions, analyzing movement patterns, selecting relevant content, and providing continuous projection.


Patent Information

Application #
202311048596
Filing Date
19 July 2023
Publication Number
47/2023
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application

Applicants

Schoolnet India Limited
D-114, Okhla Industrial Area, Phase-I, New Delhi - 110020, India.

Inventors

1. Neeraj Kapoor
D-114, Okhla Industrial Area, Phase-I, New Delhi - 110020, India

Specification

Description: Wireless Stylus System for Teaching and Interacting with Virtual Interactive Screen on a Surface
Field of the Invention
[0001]. The present invention relates to the field of interactive content projection and control. More specifically, it pertains to a system and method for projecting and controlling educational contents on a surface using a projecting device.

Background of the Invention
[0002]. Conventional techniques for teaching and interacting with virtual interactive screens on surfaces typically involve the use of physical input devices, such as touchscreens or computer mice. These devices require direct contact or proximity to the screen, limiting the user's freedom of movement and interaction. Additionally, conventional systems often lack intuitive and efficient methods for generating and projecting content onto the surface.
[0003]. Further, in the field of wireless interaction with virtual interactive screens on surfaces, conventional techniques often involve the use of physical input devices, such as touchscreens, keyboards, or computer mice. These devices require direct contact or proximity to the screen, limiting the user's flexibility and hindering natural interaction. Additionally, conventional systems may lack advanced features and functionality, leading to inefficiencies and limitations in the user experience.

Problems with Conventional Methods:
[0004]. Limited Interaction: Conventional touchscreens or computer mice restrict users to a limited range of interactions, such as tapping or dragging. This limited interaction capability can hinder the user's ability to effectively engage with the virtual interactive screen.
[0005]. Lack of Mobility: Users must physically touch or be close to the screen to interact, which limits their mobility and flexibility. This restriction can be particularly inconvenient in teaching environments where instructors and learners need freedom of movement.

[0006]. Content Generation Challenges: Creating and projecting content onto the surface in real-time can be cumbersome and time-consuming with conventional methods. Users often have to switch between different software applications or devices, leading to a disjointed and inefficient workflow.

[0007]. Inefficient Handwriting Recognition: Conventional systems struggle to accurately recognize and interpret free handwritten patterns. This can result in errors and make the process of digital text writing and annotation less efficient and accurate.

[0008]. Power Dependency: Many existing systems rely on battery-powered devices, leading to limitations in terms of battery life and dependence on external power sources. This can be inconvenient in situations where a continuous and uninterrupted teaching or presentation session is desired.

[0009]. Given the limitations and problems associated with conventional methods, there is a need for a new system that overcomes these challenges. The new system should provide enhanced interaction capabilities, improved mobility, seamless content generation, efficient handwriting recognition, and reduced power dependency. By addressing these issues, the new system will offer a more intuitive, flexible, and efficient way of teaching and interacting with virtual interactive screens on surfaces.

Summary of the Invention

[00010]. The present invention relates to a system and method for wirelessly interacting with a virtual interactive screen on a surface, offering an enhanced user experience and greater flexibility. The objective of the present invention is to overcome the limitations and problems associated with conventional techniques, such as a limited range of interaction, lack of mobility, complex setup, cumbersome content generation, and limited gesture recognition.

[00011]. The system comprises a stylus (104), a processor (103), and a computer-readable medium storing processor-executable instructions. The stylus (104) is equipped with a sensor (101) for detecting user actions on the virtual interactive screen. The processor (103) analyzes the movement patterns of the user's hand based on these actions, enabling a dynamic and intuitive interaction experience. By mapping the movement patterns to a pre-defined list of contents, the processor (103) determines the appropriate content to be projected on the surface.
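By way of illustration only, and not as part of the claimed subject matter, the following minimal Python sketch shows one way the sensor-to-processor-to-projection pipeline described above could be organized. The specification does not prescribe data formats or interfaces, so all class, field, and function names here are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class StylusEvent:
    """One sensed stylus action on the virtual screen (hypothetical format)."""
    x: float          # surface coordinates
    y: float
    pressure: float   # contact pressure, 0.0 to 1.0
    timestamp: float  # seconds

@dataclass
class MovementPattern:
    """Summary of the user's hand movement derived from a run of events."""
    mean_speed: float
    heading_degrees: float
    straightness: float  # chord length / path length; near 1.0 = straight

# The pre-defined list of contents, each guarded by a matching rule.
ContentRule = Tuple[Callable[[MovementPattern], bool], str]

def select_contents(pattern: MovementPattern, rules: List[ContentRule]) -> List[str]:
    """Map an analysed movement pattern onto the pre-defined content list."""
    return [content for matches, content in rules if matches(pattern)]
```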

[00012]. The present invention aspires to be scalable and adaptable to various screen sizes and surfaces. Whether it is a small interactive whiteboard or a large-scale projection on a wall, the system can adjust and optimize the projection and interaction capabilities accordingly, ensuring versatility and flexibility in different environments.

[00013]. The present invention aims to facilitate collaborative work by providing a system that enables multiple users to interact with the virtual interactive screen simultaneously. This promotes teamwork and facilitates real-time collaboration in various settings, such as classrooms, boardrooms, and design studios.

[00014]. An objective of the present invention is to allow continuous projection of one or more contents on the surface, enabling users to engage with the virtual interactive screen in real-time. The surface can be a wall in a room, providing a large canvas for interaction and collaboration. Importantly, the stylus (104) operates without a battery and can be integrated with a supercapacitor, ensuring uninterrupted usage and eliminating the need for frequent battery replacements.

[00015]. In addition to detecting actions, the system recognizes gestures made by the user based on movement patterns. This recognition enables digital text writing on the surface, promoting efficient content creation and manipulation. Moreover, the system can analyze free handwritten patterns and automatically correct them if necessary, generating auto-corrected content for projection on the surface.

[00016]. To enhance the user experience, the system can generate audio output based on the projected contents, providing additional context or feedback during interaction. The system's synchronization with the controller ensures seamless processing, preventing content from being skipped and ensuring complete processing.

[00017]. Overall, the present invention revolutionizes the field of wireless interaction with virtual interactive screens by offering a system that addresses the limitations of conventional techniques. It provides an intuitive, flexible, and efficient interaction experience, empowering users to engage with virtual content in a natural and seamless manner.
[00018]. Other objects, advantages, and novel features of the invention will become apparent from the following detailed description of the invention when considered in conjunction with the accompanying drawings.


BRIEF DESCRIPTION OF DRAWINGS
[00019]. To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the disclosure and are therefore not to be considered limiting of its scope. The disclosure will be described and explained with additional specificity and detail with the accompanying drawings.
[00020]. The subject matter that is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other aspects, features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
[00021]. Figure 1 illustrates the system structure and components, according to the present invention;
[00022]. Figure 2 illustrates the system environment and surface interaction, according to the present invention;
[00023]. Figure 3 illustrates a step-by-step process flow diagram, in accordance with an embodiment of the present invention;
[00024]. Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.


Detailed Description of the Invention
[00025]. For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, would be contemplated as would normally occur to one skilled in the art to which the invention relates. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. The system, methods, and examples provided herein are illustrative only and are not intended to be limiting.
[00026]. The term “some” as used herein is to be understood as “none or one or more than one or all.” Accordingly, the terms “none,” “one,” “more than one,” “more than one, but not all” or “all” would all fall under the definition of “some.” The term “some embodiments” may refer to no embodiments or to one embodiment or to several embodiments or to all embodiments, without departing from the scope of the present disclosure.
[00027]. The terminology and structure employed herein is for describing, teaching, and illuminating some embodiments and their specific features. It does not in any way limit, restrict or reduce the spirit and scope of the claims or their equivalents.
[00028]. More specifically, any terms used herein such as but not limited to “includes,” “comprises,” “has,” “consists,” and grammatical variants thereof do not specify an exact limitation or restriction and certainly do not exclude the possible addition of one or more features or elements, unless otherwise stated, and furthermore must not be taken to exclude the possible removal of one or more of the listed features and elements, unless otherwise stated with the limiting language “must comprise” or “needs to include.”

[00029]. Whether or not a certain feature or element was limited to being used only once, either way, it may still be referred to as “one or more features” or “one or more elements” or “at least one feature” or “at least one element.” Furthermore, the use of the terms “one or more” or “at least one” feature or element do not preclude there being none of that feature or element, unless otherwise specified by limiting language such as “there needs to be one or more . . . ” or “one or more element is required.”
[00030]. Unless otherwise defined, all terms, and especially any technical and/or scientific terms, used herein may be taken to have the same meaning as commonly understood by one having ordinary skills in the art.
[00031]. Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements presented in the attached claims. Some embodiments have been described for the purpose of illuminating one or more of the potential ways in which the specific features and/or elements of the attached claims fulfill the requirements of uniqueness, utility and non-obviousness.
[00032]. Use of the phrases and/or terms including, but not limited to, “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “further embodiment”, “furthermore embodiment”, “additional embodiment” or variants thereof do not necessarily refer to the same embodiments. Unless otherwise specified, one or more particular features and/or elements described in connection with one or more embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although one or more features and/or elements may be described herein in the context of only a single embodiment, or alternatively in the context of more than one embodiment, or further alternatively in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.
[00033]. Any particular and all details set forth herein are used in the context of some embodiments and therefore should not be necessarily taken as limiting factors to the attached claims. The attached claims and their legal equivalents can be realized in the context of embodiments other than the ones used as illustrative examples in the description below. Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.
[00034]. The present invention is a revolutionary system that enables wireless interaction with virtual interactive screens, revolutionizing the way users engage with digital content. The system comprises several key components that work together seamlessly. Firstly, a stylus (104) equipped with a sensor (101) detects actions on the virtual interactive screen, such as gestures, movements, and actions for teaching purposes. These actions are then analysed by a processor (103) integrated into the stylus (104), which determines the movement patterns of the user's hand. Based on this analysis, the processor (103) maps the movement patterns to a predefined list of contents, selecting one or more appropriate contents to be projected on the surface. The stylus (104) also includes a projecting device (105) that continuously projects the selected contents onto the surface.
[00035]. To ensure a smooth and efficient user experience, the present invention incorporates various technological advancements. The stylus (104) is designed to operate without a battery, utilizing a supercapacitor for power storage. This eliminates the need for frequent battery replacements or recharging, enhancing the convenience and usability of the system. Additionally, the system incorporates gesture recognition capabilities, enabling users to perform gestures that trigger specific actions, such as enabling digital text writing on the surface. The processor (103) can also recognize free handwritten patterns and determine whether auto-correction is required. If necessary, the processor (103) generates auto-corrected content to be projected on the surface, ensuring accuracy and legibility.
[00036]. In Figure 1, we can see an illustration of the structure and components of the stylus (104) system for teaching and interacting with a virtual interactive screen on a surface. The main component of the system is the stylus (104) device, which is depicted in the figure. The stylus (104) device is designed with a sleek and ergonomic form factor, ensuring comfortable handling during use. It is equipped with wireless capabilities, allowing seamless communication with the virtual interactive screen.

[00037]. The figure also highlights the presence of a sensor (101) within the stylus device (104). The sensor (101) plays a crucial role in detecting the user's actions on the virtual screen. It can capture various gestures, movements, and interactions performed by the user with the stylus (104) on the surface. These actions are then transmitted to the processor (103) for further analysis.
[00038]. The processor (103), another important component depicted in the figure, is responsible for analyzing the movement patterns of the user's hand based on the detected actions. It employs sophisticated algorithms to interpret the user's input accurately. By mapping the movement patterns to a pre-defined list of contents, the processor (103) determines the appropriate contents to be projected onto the surface.

[00039]. Furthermore, the figure showcases the projecting device (105) integrated into the stylus (104). This projecting device (105) is responsible for continuously projecting the selected contents onto the surface. It ensures that the projected contents remain synchronized with the user's actions, providing a seamless and interactive teaching experience.

[00040]. Overall, Figure 1 provides a comprehensive visual representation of the structure and components of the stylus (104) system. It emphasizes the compact and portable nature of the stylus (104) device and highlights the integration of essential components like the sensor (101), processor (103), and projecting device (105). This figure helps in understanding the overall architecture of the system and how the components work together to enable efficient teaching and interaction with the virtual interactive screen on a surface.

[00041]. In one embodiment of the present invention, a stylus system is provided for teaching and interacting with a virtual interactive screen on a surface. The system comprises a stylus device (104), a sensor (101), a processor (103), and a projecting device (105). The stylus device is designed with wireless capabilities, allowing seamless interaction with the virtual screen. The sensor (101), integrated into the stylus (104), detects the user's actions on the virtual screen, such as drawing, writing, and gesturing. The processor (103) analyzes the movement patterns of the user's hand based on the detected actions, enabling precise recognition and interpretation. Using a pre-defined list of contents, the processor (103) determines the appropriate contents to be projected onto the surface, enhancing the teaching experience. The projecting device (105), also integrated into the stylus (104), continuously projects the selected contents onto the surface, creating an immersive and interactive learning environment.

[00042]. In Figure 2, we can see an illustration of the system environment and the interaction between the user and the virtual interactive screen on a surface. The figure depicts a room with a wall acting as the surface for projection. This setup provides a large and easily accessible interactive screen for teaching and learning purposes.

[00043]. The figure highlights the wireless connection between the stylus device (104) and the virtual screen. It signifies that the stylus (104) communicates seamlessly with the virtual screen, allowing the user to interact with the projected contents without the need for physical wires or cables. This wireless capability enhances mobility and flexibility, enabling the user to move around freely while engaging with the virtual screen.

[00044]. Moreover, the figure showcases a user actively using the stylus (104) to perform various actions on the virtual screen. These actions include drawing, writing, and gesturing. The stylus (104) captures the user's movements and transmits them to the system for analysis. The system, as described earlier, maps the movement patterns to specific contents and projects them onto the surface in real-time. This intuitive and natural interaction between the user and the virtual screen enhances the teaching and learning experience, fostering creativity and engagement.
[00046]. Overall, Figure 2 provides an overview of the system environment and the surface interaction in the context of teaching and interacting with a virtual interactive screen. It emphasizes the wireless connectivity, the use of the stylus (104) as the primary input device, and the intuitive nature of the interaction. This figure helps visualize how the system operates in a real-world setting and how users can effectively engage with the virtual screen to facilitate teaching and learning activities.
[00047]. In another embodiment, the stylus system (104) is designed to operate in a room-based environment. The system utilizes the surface of a wall as the canvas for projection. The user can easily interact with the virtual screen by simply using the stylus (104) on the surface. The wireless connection between the stylus device (104) and the virtual screen enables freedom of movement and flexibility in teaching scenarios. The intuitive and natural interaction between the user and the virtual screen enhances the learning experience, allowing for fluid drawing, writing, and gesturing. The stylus system brings the virtual screen to life in the physical environment, creating a dynamic and engaging teaching platform.
[00048]. The step-by-step process flow depicted in Figure 3 illustrates the detailed sequence of operations involved in the stylus (104) system for teaching and interacting with the virtual interactive screen. It provides a comprehensive overview of the actions and functions performed by the system at each stage of its operation.
[00049]. Step 1: User Action Detection:
[00051]. The first step in the process flow involves the detection of the user's action on the virtual interactive screen. The stylus device (104) is equipped with a sensor (101) that is specifically designed to capture and identify the actions performed by the user. The sensor (101) can utilize various technologies such as touch, pressure, or motion sensing to detect the user's inputs.
[00053]. When the user interacts with the virtual interactive screen using the stylus (104), the sensor (101) within the stylus device registers the corresponding movements and inputs. For example, if the user is drawing a line on the screen, the sensor (101) detects the movement of the stylus (104) and captures the trajectory of the line being drawn.
[00055]. The primary function of the sensor (101) is to accurately detect and capture the user's actions, regardless of whether they involve drawing, writing, or gesturing. By effectively capturing these inputs, the sensor (101) provides the necessary data for the subsequent stages of the process flow, enabling the system to analyze and respond to the user's actions appropriately.
[00057]. Overall, the user action detection step is crucial for the stylus system as it establishes the foundation for the system's interaction with the virtual interactive screen. By accurately capturing and identifying the user's actions, the system can proceed to analyze the movement patterns, select appropriate content, and enable seamless interaction between the user and the virtual screen.
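As a rough illustration of this detection step, the sketch below groups raw pressure samples into discrete strokes. Because the specification leaves the sensing technology open (touch, pressure, or motion), the pressure-based trigger, threshold value, and all names are assumptions.

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Sample:
    x: float
    y: float
    pressure: float
    timestamp: float

@dataclass
class Stroke:
    samples: List[Sample] = field(default_factory=list)

class ActionDetector:
    """Groups sensor samples into strokes: a stroke begins when pressure
    rises above a threshold and ends when it falls back below it."""
    PRESSURE_THRESHOLD = 0.05  # assumed value; tuned per sensor in practice

    def __init__(self) -> None:
        self.current: Optional[Stroke] = None
        self.completed: List[Stroke] = []

    def on_sample(self, x: float, y: float, pressure: float) -> None:
        sample = Sample(x, y, pressure, time.monotonic())
        if pressure >= self.PRESSURE_THRESHOLD:
            if self.current is None:
                self.current = Stroke()          # stylus touched down
            self.current.samples.append(sample)
        elif self.current is not None:
            self.completed.append(self.current)  # stylus lifted: stroke done
            self.current = None
```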
[00058]. Step 2: Movement Pattern Analysis:
[00060]. After the user's action is detected in step 1, the stylus system's processor (103) initiates the analysis of the movement patterns of the user's hand. This step involves examining various aspects of the hand movements, such as direction, speed, acceleration, and other relevant characteristics.
[00062]. The processor (103) utilizes advanced algorithms and pattern recognition techniques to analyze the captured movement data. By analyzing the movement patterns, the processor (103) gains valuable insights into the user's intended action and the purpose behind the input. For example, it can determine whether the user is drawing a straight line, making a curved stroke, or performing a specific gesture.
[00064]. The analysis of movement patterns enables the system to understand and interpret the user's inputs more accurately. It helps in distinguishing between different actions and provides a basis for mapping these patterns to specific contents or functionalities of the system.
[00066]. By gaining insight into the movement patterns of the user's hand, the processor (103) enhances the system's ability to provide an intuitive and responsive user experience. It allows the system to respond appropriately to the user's inputs, such as projecting relevant content or enabling specific features based on the analyzed patterns.
[00068]. Overall, the movement pattern analysis step plays a crucial role in enhancing the accuracy and effectiveness of the stylus system. By understanding the user's intended actions through the analysis of movement patterns, the system can provide a more tailored and seamless interaction with the virtual interactive screen.
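A minimal sketch of such an analysis is given below, deriving speed, direction, and a straightness measure from (x, y, t) samples; the specific feature set is an illustrative assumption rather than the patented algorithm.

```python
import math
from typing import Dict, List, Tuple

def analyze_movement(points: List[Tuple[float, float, float]]) -> Dict[str, float]:
    """Derive speed, heading, and straightness from (x, y, t) samples."""
    speeds = []
    for (x0, y0, t0), (x1, y1, t1) in zip(points, points[1:]):
        dt = t1 - t0
        if dt > 0:
            speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    x_start, y_start, _ = points[0]
    x_end, y_end, _ = points[-1]
    path = sum(math.hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(points, points[1:]))
    chord = math.hypot(x_end - x_start, y_end - y_start)
    return {
        "mean_speed": sum(speeds) / len(speeds) if speeds else 0.0,
        "heading_degrees": math.degrees(
            math.atan2(y_end - y_start, x_end - x_start)),
        # Near 1.0 suggests a straight line; lower values suggest a curve.
        "straightness": chord / path if path > 0 else 1.0,
    }

# Example: a fast, perfectly straight horizontal stroke.
features = analyze_movement([(0, 0, 0.0), (50, 0, 0.1), (100, 0, 0.2)])
print(features["straightness"])  # -> 1.0
```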
[00069]. Step 3: Content Selection:
[00071]. Following the analysis of the movement patterns in step 2, the stylus system's processor (103) proceeds to determine the appropriate content to be projected onto the surface. This step involves mapping the analyzed movement patterns to a pre-defined list of contents or functionalities within the system.
[00073]. The processor (103) utilizes a mapping algorithm or rule-based system to match the user's detected actions with the most relevant content. By considering the characteristics of the movement patterns and comparing them to the predefined mappings, the system intelligently identifies the content that aligns with the user's intended action.
[00075]. For example, if the user's movement patterns indicate a drawing gesture, the system may select a drawing tool or display a blank canvas for the user to draw on. Similarly, if the movement patterns resemble handwriting, the system may activate a text input mode or recognize the user's handwriting for digital text writing.
[00077]. By selecting the appropriate content based on the analyzed movement patterns, the system enhances the teaching and interacting experience for the user. It ensures that the projected content is relevant and aligned with the user's intended actions, providing a seamless and intuitive interaction with the virtual interactive screen.
[00079]. The content selection step is crucial in personalizing the system's response to the user's inputs. It enables the system to adapt to different teaching scenarios and cater to the specific needs of the user, enhancing engagement and effectiveness in the teaching process.
[00081]. Overall, the content selection step plays a vital role in tailoring the system's response based on the user's actions. By intelligently matching the movement patterns to relevant content, the system ensures a more meaningful and context-aware interaction with the virtual interactive screen.
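Since the specification leaves the mapping algorithm open, a toy rule-based mapping over features like those in the previous sketch might look as follows; the rules, thresholds, and content names are all hypothetical.

```python
from typing import Callable, Dict

# Hypothetical pre-defined list of contents, each keyed by a matching rule.
CONTENT_RULES: Dict[str, Callable[[Dict[str, float]], bool]] = {
    "straight_line_tool": lambda f: f["straightness"] >= 0.95,
    "text_input_mode":    lambda f: f["mean_speed"] < 40.0,  # slow, deliberate strokes
    "drawing_canvas":     lambda f: True,                    # fallback content
}

def select_content(features: Dict[str, float]) -> str:
    """Return the first pre-defined content whose rule matches the features."""
    for content, rule in CONTENT_RULES.items():
        if rule(features):
            return content
    return "drawing_canvas"

print(select_content({"straightness": 0.99, "mean_speed": 500.0}))
# -> "straight_line_tool"
```

The first-match ordering makes the final entry a catch-all, one simple way to realize the rule-based matching described in paragraph [00073].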
[00082]. Step 4: Continuous Projection:
[00084]. Once the content selection is made in step 3, the stylus system's integrated projecting device (105) takes over the task of continuously projecting the chosen content onto the surface. This step ensures that the selected content remains visible and accessible to the user throughout their interaction with the virtual interactive screen.
[00086]. The projecting device (105) utilizes advanced projection technology, such as laser or LED-based systems, to accurately display the selected content onto the surface. It maintains a stable and consistent projection, providing a clear and vibrant representation of the content for the user to engage with.

[00087]. By continuously projecting the chosen content, the system enables the user to have real-time visual feedback of their actions on the virtual screen. Whether it's drawing, writing, or interacting with other digital elements, the projected content allows the user to observe the results of their inputs and make immediate adjustments if needed.

[00088]. The continuous projection also ensures that the user can easily reference and interact with the content over an extended period. Whether it's a presentation, instructional material, or collaborative work, the projected content remains visible and accessible, enhancing the overall teaching and interaction experience.

[00089]. This step highlights the importance of seamless and uninterrupted projection, allowing the user to maintain focus and engagement without any disruptions. The continuous projection ensures that the selected content is readily available, visible, and dynamically updated as the user continues to interact with the virtual interactive screen.

[00090]. Overall, the continuous projection step ensures that the user can seamlessly interact with and observe the selected content, providing them with an immersive and effective teaching and learning experience.
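A bare-bones projection loop consistent with this description is sketched below. The frame rate, callback names, and stop condition are assumptions, since the specification does not detail the projector interface.

```python
import time
from typing import Callable

def projection_loop(get_selected_content: Callable[[], str],
                    render_frame: Callable[[str], None],
                    keep_running: Callable[[], bool],
                    fps: float = 30.0) -> None:
    """Continuously re-project the currently selected content so the surface
    stays synchronized with the user's latest actions."""
    frame_interval = 1.0 / fps
    while keep_running():
        start = time.monotonic()
        render_frame(get_selected_content())  # push current content to projector
        # Sleep only for the remainder of the frame so the rate stays stable.
        time.sleep(max(0.0, frame_interval - (time.monotonic() - start)))
```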
[00091]. In addition to the core functionalities of user action detection, movement pattern analysis, content selection, and continuous projection, the stylus system (104) incorporates several additional features to enrich the user's experience and facilitate efficient teaching and interaction.

[00092]. Gesture Recognition:
[00093]. The system is equipped with gesture recognition capabilities, which allow it to identify specific gestures performed by the user. These gestures can include actions such as swiping, pinching, or tapping on the virtual interactive screen. The system accurately detects and interprets these gestures, enabling intuitive and efficient navigation and interaction with the content.
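Purely as an illustration, a simple classifier over a completed stroke could distinguish the tap and swipe gestures mentioned here; a pinch would require two simultaneous contact points and is omitted. The distance and timing thresholds are assumed values.

```python
import math
from typing import List, Tuple

def classify_gesture(points: List[Tuple[float, float, float]]) -> str:
    """Label a stroke of (x, y, t) samples as a tap, a swipe, or freeform input."""
    x0, y0, t0 = points[0]
    x1, y1, t1 = points[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    duration = t1 - t0
    if distance < 5.0 and duration < 0.3:      # barely moved, brief contact
        return "tap"
    if distance > 100.0 and duration < 0.5:    # long, fast, directional motion
        return "swipe_right" if x1 > x0 else "swipe_left"
    return "freeform"

print(classify_gesture([(0, 0, 0.0), (150, 5, 0.2)]))  # -> "swipe_right"
```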

[00094]. Digital Text Writing:
[00095]. Another feature of the system is the ability to enable digital text writing on the surface. This feature allows the user to write or input text directly onto the virtual interactive screen using the stylus (104). The system accurately captures and converts the handwritten text into digital format, providing a convenient and seamless way for the user to incorporate written information into their teaching materials or annotations.
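The handwriting recognizer itself is not specified; the sketch below shows only the buffering side, handing completed strokes to a pluggable recognizer callback. All names are hypothetical.

```python
from typing import Callable, List

Stroke = List[tuple]  # a stroke as a list of (x, y, t) samples

class DigitalTextWriter:
    """Buffers handwriting strokes and converts them to digital text once
    the writer pauses; the recognizer implementation is left pluggable."""

    def __init__(self, recognize: Callable[[List[Stroke]], str]) -> None:
        self.recognize = recognize        # e.g. an on-device handwriting model
        self.strokes: List[Stroke] = []

    def add_stroke(self, stroke: Stroke) -> None:
        self.strokes.append(stroke)

    def flush(self) -> str:
        """Called after an idle pause: return the recognized text, reset buffer."""
        text = self.recognize(self.strokes)
        self.strokes.clear()
        return text

# Usage with a stand-in recognizer that just reports the stroke count.
writer = DigitalTextWriter(lambda strokes: f"<{len(strokes)} strokes>")
writer.add_stroke([(0, 0, 0.0), (10, 0, 0.1)])
print(writer.flush())  # -> "<1 strokes>"
```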

[00096]. Auto-Correction:
[00097]. To enhance the accuracy and legibility of handwritten patterns, the system incorporates auto-correction capabilities. It analyzes the user's freehand patterns and determines if any auto-correction is necessary. If required, the system generates an auto-corrected version of the content to be projected on the surface. This ensures that the content remains clear and easily understandable, even if the user's handwriting is less precise.
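One plausible reading of this auto-correction step is a dictionary-based cleanup of recognized words. The sketch below uses Python's standard difflib for fuzzy matching; the vocabulary and similarity cutoff are assumptions.

```python
import difflib
from typing import List

# Hypothetical teaching vocabulary; in practice this would be domain-specific.
VOCABULARY: List[str] = ["photosynthesis", "equation", "triangle", "velocity"]

def auto_correct(word: str, cutoff: float = 0.8) -> str:
    """Replace a recognized word with its closest vocabulary match when the
    match is strong enough; otherwise keep the handwriting result as-is."""
    matches = difflib.get_close_matches(word.lower(), VOCABULARY, n=1, cutoff=cutoff)
    return matches[0] if matches else word

print(auto_correct("equaton"))  # -> "equation"
print(auto_correct("zebra"))    # -> "zebra" (no confident match; left unchanged)
```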

[00098]. In yet another embodiment, the stylus system (104) incorporates a step-by-step process flow for seamless operation. The process begins with the sensor (101) in the stylus (104) detecting the user's actions on the virtual screen. These actions are then analyzed by the processor (103) to determine the movement patterns of the user's hand. Based on the analyzed patterns, the processor (103) selects the appropriate contents to be projected onto the surface. The system includes additional features such as gesture recognition, enabling the user to perform specific commands or actions. Furthermore, digital text writing is enabled based on the user's gestures, allowing for annotations and notes on the projected contents. The processor (103) can also recognize free handwritten patterns and, if necessary, generate auto-corrected content for projection. The system's functionality is further enhanced by generating audio output based on the projected contents, providing additional sensory feedback.

[00099]. The stylus (104) system for teaching and interacting with a virtual interactive screen on a surface offers several advantages over conventional methods. Here are some of the key advantages:

[000100]. Enhanced Teaching Experience: The system provides a seamless and intuitive interaction between the user and the virtual interactive screen. It allows the user to perform various actions such as drawing, writing, and gesturing, facilitating effective communication and engagement in a teaching environment. The system's accurate detection of user actions and continuous projection of selected content ensures a dynamic and immersive teaching experience.

[000101]. Precise Movement Pattern Analysis: By analysing the movement patterns of the user's hand, the system can accurately determine the intended action and purpose. This enables precise content selection, ensuring that the projected content aligns with the user's actions and enhances the teaching process. The system's ability to map movement patterns to a pre-defined list of contents ensures that the most relevant and appropriate content is presented.

[000102]. Versatile and Portable: The stylus system (104) is designed to be compact, portable, and wireless, making it highly versatile and easy to use in various teaching environments. The stylus device (104) itself is ergonomically designed, providing comfort and ease of use for the user. Its wireless capabilities enable freedom of movement and eliminate the need for cumbersome wires or connections, enhancing the user's flexibility during teaching sessions.

[000103]. Additional Features for Enhanced Functionality: The system incorporates additional features such as gesture recognition, digital text writing, and auto-correction capabilities. These features expand the system's functionality, allowing for intuitive navigation, convenient input of text, and improved legibility of handwritten patterns. They enhance the user's ability to interact with the content and facilitate efficient content creation and annotation.

[000104]. Improved Clarity and Accessibility: The continuous projection of the selected content onto the surface ensures its visibility and accessibility to both the user and the audience. This promotes clear communication and information sharing, facilitating effective teaching and learning. The system's auto-correction capabilities further enhance the clarity of presented information, ensuring that handwritten patterns are easily understood by all.

[000105]. Overall, the stylus (104) system for teaching and interacting with a virtual interactive screen on a surface offers advantages in terms of enhanced teaching experience, precise movement pattern analysis, versatility, additional features for enhanced functionality, and improved clarity and accessibility. It revolutionizes the way teaching is conducted, providing a dynamic and interactive platform that facilitates effective communication, engagement, and knowledge transfer.

[000106]. The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of the embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible.
Claims:
We Claim:
1. A method for wirelessly interacting with virtual interactive screen on a surface, the method being performed by a stylus, the method comprising:
detecting, by a sensor associated with the stylus, an action of a user using the virtual interactive screen for a teaching purpose;
determining, by a processor associated with the stylus, a movement pattern of the user’s hand based on the action of the user;
determining, by the processor, one or more contents to be projected on the surface based on an analysis of the movement pattern of the user’s hand, wherein the one or more contents are determined by mapping of the movement pattern of the user’s hand with a pre-defined list of contents; and
continuously projecting, by a projecting device associated with the stylus, the one or more contents on the surface, wherein the surface is a wall of a room.

2. The method according to claim 1, wherein the stylus is able to operate without a battery.

3. The method according to claim 1, wherein the stylus is integrated with a supercapacitor.

4. The method according to claim 1, further comprising:
detecting a gesture of the user based on the movement pattern of the user;
enabling digital text writing on the surface based on the gesture of the user;
recognizing a free handwritten pattern based on the movement pattern of the user;
determining whether an auto-correction of the free handwritten pattern is required; and
upon the determination that the auto-correction of the free handwritten pattern is required, generating an auto-corrected content to be projected on the surface.

5. The method according to claim 1, further comprising:
generating an audio output based on the one or more contents projected on the surface.

6. A system for wirelessly interacting with virtual interactive screen on a surface, the system comprising:
a stylus;
a processor; and
a computer-readable medium communicatively coupled to the processor, wherein the computer-readable medium stores processor-executable instructions, which when executed by the processor, cause the processor to:
detect an action of a user using the virtual interactive screen for a teaching purpose;
determine a movement pattern of the user’s hand based on the action of the user;
determine one or more contents to be projected on the surface based on an analysis of the movement pattern of the user’s hand, wherein the one or more contents are determined by mapping of the movement pattern of the user’s hand with a pre-defined list of contents; and
continuously project the one or more contents on the surface, wherein the surface is a wall of a room.

7. The system according to claim 6, wherein the stylus is able to operate without a battery.

8. The system according to claim 6, wherein the stylus is integrated with a supercapacitor.

9. The system according to claim 6, wherein the processor is further configured to:
detect a gesture of the user based on the movement pattern of the user;
enable digital text writing on the surface based on the gesture of the user;
recognize a free handwritten pattern based on the movement pattern of the user;
determine whether an auto-correction of the free handwritten pattern is required; and
upon the determination that the auto-correction of the free handwritten pattern is required, generate an auto-corrected content to be projected on the surface.

10. The system according to claim 6, wherein the processor is further configured to:
generate an audio output based on the one or more contents projected on the surface.
Dated this 19th day of July 2023

Ajay Kaushik
Agent for the Applicant [IN/PA-2159]
AKSH IP ASSOCIATES

Documents

Application Documents

# Name Date
1 202311048596-STATEMENT OF UNDERTAKING (FORM 3) [19-07-2023(online)].pdf 2023-07-19
2 202311048596-POWER OF AUTHORITY [19-07-2023(online)].pdf 2023-07-19
3 202311048596-FORM 1 [19-07-2023(online)].pdf 2023-07-19
4 202311048596-DRAWINGS [19-07-2023(online)].pdf 2023-07-19
5 202311048596-DECLARATION OF INVENTORSHIP (FORM 5) [19-07-2023(online)].pdf 2023-07-19
6 202311048596-COMPLETE SPECIFICATION [19-07-2023(online)].pdf 2023-07-19
7 202311048596-FORM-9 [13-10-2023(online)].pdf 2023-10-13
8 202311048596-FORM 18 [13-10-2023(online)].pdf 2023-10-13
9 202311048596-Proof of Right [17-10-2023(online)].pdf 2023-10-17
10 202311048596-FER.pdf 2025-04-15
11 202311048596-OTHERS [29-08-2025(online)].pdf 2025-08-29
12 202311048596-FORM-26 [29-08-2025(online)].pdf 2025-08-29
14 202311048596-DRAWING [29-08-2025(online)].pdf 2025-08-29
15 202311048596-CLAIMS [29-08-2025(online)].pdf 2025-08-29

Search Strategy

1 202311048596_SearchStrategyNew_E_SearchStrategyMatrixE_30-01-2025.pdf