Smart Glasses With Augmented Reality For Real Time Data Visualization

Abstract: The present disclosure relates to a method and system for real-time data visualization through smart glasses with augmented reality (AR) capability. The system includes a head-mounted display unit to project visual overlays in the user’s field of view, an edge-based microprocessor implementing a neuromorphic computing architecture for ultra-low-latency processing, and an adaptive AR visualization engine (ARVE) that prioritizes and transforms multi-channel data streams. The system further provides a biosensing module for acquiring physiological signals, such as EEG, PPG, or skin conductance, so that the ARVE can adjust the complexity of the visualization based on the user’s cognitive state. The system also implements a blockchain-enabled security module, which encrypts and authenticates AR overlays with token-gated access and enables secure collaboration among multiple users, and a multimodal interaction interface that allows scrolling, previewing, and accessing overlays through gestures, eye gaze, and voice. The method includes acquiring data from multiple sources, processing it locally, generating adaptive overlays, and projecting them through AR in real time. Collectively, the present disclosure transforms conventional AR devices into a rapidly adaptive, intelligent, privacy-preserving visualization system.

Patent Information

Application #
202511079814
Filing Date
22 August 2025
Publication Number
37/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

Narendra Pal Singh
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
Kuldeep Verma
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
Bhuvan Chandra Joshi
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
Abhishek Katiyar
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
Prashansa Singh Yadav
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
Ratnesh Kumar Pandey
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh

Inventors

1. Narendra Pal Singh
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
2. Kuldeep Verma
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
3. Bhuvan Chandra Joshi
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
4. Abhishek Katiyar
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
5. Prashansa Singh Yadav
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh
6. Ratnesh Kumar Pandey
Assistant Professor, Department of Computer Science and Engineering, Invertis University, Bareilly, Uttar Pradesh

Specification

Description:
TECHNICAL FIELD
[0001] The present disclosure relates to a system and a method for providing real-time data visualization through smart glasses that incorporate augmented reality (AR) technology and, more specifically, to a smart, adaptable, context-aware AR framework using edge-based microprocessors, biosensing modules, blockchain technology for secure sharing, and multimodal interaction, supporting low-latency, privacy-preserving, and adaptive visualization across use cases such as healthcare, industrial monitoring, defense operations, education, and enterprise management.
BACKGROUND
[0002] Smart glasses with augmented reality (AR) capability are becoming a major part of the wearable technology and immersive computing categories, augmenting and supporting human interaction with digital data through authentication via login/ID, visualization of contextual information, and projection of augmented reality content directly within the user's field of view. Since almost every industry now relies on real-time decision-making, industries are becoming increasingly reliant on AR smart glasses to visualize complex data streams in a seamless yet intuitive manner. AR-enabled smart glasses that enhance user engagement with the digital world have therefore never been more important.
[0003] Existing AR smart glasses technology is based on conventional methods of passively projecting pre-processed information from companion devices or external servers. Some of the challenges of the existing systems include high latency, dependence on cloud connectivity, limited data security, limited personalization and, more importantly, the inability to customize the visualization of information based on the user's state or context. Together, these limitations hinder effective use in safety-critical environments such as surgery, military operations, or industrial automation, where immediate, secure, and customizable visualization of data is essential.
[0004] Therefore, there is a clear need for a system and method for providing real-time, responsive, and secure augmented reality-based data visualization through smart glasses that incorporates edge-level neuromorphic processing for ultra-low latency, context-aware visualization through biosensing modules, blockchain mechanisms for privacy and authentication, and multimodal interaction interfaces for easy user control. Such a system can transform AR smart glasses from passive display devices into real-time, intelligent, personalized, responsive, and collaborative tools.
SUMMARY
[0005] In one embodiment, a method for real-time data visualization using augmented reality smart glasses includes acquiring multi-source data from at least one of sensor devices, IoT devices, imaging devices, or cloud-based data storage; processing the acquired data locally on the smart glasses via an integrated edge computing microprocessor capable of neuromorphic computing for ultra-low latency; generating context-aware augmented reality overlays using an adaptive visualization engine that incorporates artificial intelligence algorithms to prioritize and transform data; and projecting the augmented reality overlays directly within the user's view through a head-mounted display. In one embodiment, the processor also incorporates a biosensing module that captures user physiological parameters such as EEG, PPG, or skin conductance and adapts the complexity of the visualization to mitigate cognitive load during stress or heightened-anxiety conditions. The processor also integrates a blockchain-enabled security framework for encrypting and authenticating real-time overlays using token-gated access control, and a multimodal interaction interface utilizing gesture, voice, and gaze commands; all modules work together interactively within the smart glasses to enable secure, adaptive, and efficient AR-based real-time visualization.
[0006] In one embodiment, a system for real-time data visualization using augmented reality smart glasses comprises a head-mounted display unit configured to project augmented reality overlays into the user's field of view, an adaptive AR visualization engine configured to process multi-source data streams and dynamically remap and prioritize real-time overlays according to context, and an edge computing microprocessor integrated into the glasses and equipped with neuromorphic computing capabilities for ultra-low-latency, on-device analysis of visual, audio, and IoT sensor input. In one embodiment, the processor also incorporates a biosensing module that captures physiological signals such as EEG, PPG, or skin conductance to assess the user's cognitive state, with the visualization engine dynamically altering the complexity of the overlay according to levels of stress or fatigue. The processor also integrates a blockchain-enabled security and token-gated access capability for encrypting and authenticating AR overlays in multi-user scenarios, as well as a multimodal interaction interface capable of receiving user commands via gesture, gaze, voice, or micro-expressions; all modules are operatively coupled to turn the smart glasses from a passive display system into an intelligent, adaptive, and collaborative AR-based visualization platform.
BRIEF DESCRIPTION OF DRAWINGS

[0007] FIG. 1 is a schematic representation of a system architecture for smart glasses with augmented reality for real-time data visualization, in accordance with an embodiment of the present invention.

[0008] FIG. 2 is a block diagram illustrating the functional modules of the smart glasses, including an adaptive AR visualization engine, an edge-based microprocessor that takes in multi-source data including biosensing, a blockchain-enabled security layer, and multimodal user interaction capabilities, in accordance with an embodiment of the present invention.

[0009] FIG. 3 is a flowchart (300) illustrating a method for realizing real-time data visualization in smart glasses with augmented reality. The flowchart depicts steps of multi-source data acquisition, processing through an edge-based microprocessor, generating adaptive overlays, and projecting within the user's view, in accordance with an embodiment of the present invention.

DETAILED DESCRIPTION
[0010] The present disclosure addresses issues that limit conventional smart glasses: such glasses are restricted to passive data display, rely heavily on server-side processing beyond the minimal computation needed to deliver an overlay, are inflexible with respect to individualized augmented experiences, and offer weak user data security. The disclosed system provides an integrated architecture with an edge-based microprocessor enabling ultra-low-latency data processing at the edge, an adaptive AR visualization engine that uses context to plan and prioritize overlays, a biosensing module that senses the user's physiological state and dynamically adjusts overlay complexity through task-based modes of operation, and a blockchain-enabled security layer providing tamper-resistant authentication and token-gated access for privacy-aware collaborative experiences in multi-user contexts. The system further provides a multimodal interaction interface that fuses gestures, gaze, and voice commands for intuitive user control. So designed, the system transforms smart glasses into a smart, adaptive, and secure real-time AR visualization platform, providing substantial technical advancement beyond current systems.

[0011] The primary objective of this disclosure is to provide a system and method for real-time data visualization using smart glasses with integrated augmented reality, delivering adaptive, secure, and context-sensitive overlays to support better data-driven decision-making across domains. To this end, one objective is to include an edge-based microprocessor with neuromorphic design properties so that data is processed locally, without reliance on external servers, for ultra-low-latency delivery. A further objective is to include a biosensing module that monitors the user's cognitive and physiological state in order to dynamically adapt visualization complexity in real time under workload or stress conditions, providing a helpful, customized experience. The present disclosure also pursues a blockchain-enabled data security model for tamper-proof encryption, authentication, and token-gated access control of AR overlays in multi-user environments. Another objective is collaborative AR synchronization through distributed machine learning, so that multiple users can share, confirm, and update overlays to work cooperatively and exchange data in real time. Yet another objective is a multimodal interaction interface that seamlessly fuses gesture, gaze, and voice commands for intuitive, hands-free operation. Together, these objectives convert traditional smart glasses into a smart, dynamic, secure, and usable AR platform with significant technical improvement over the prior art.

[0012] The present invention describes a system and method for real-time data visualization using smart glasses with augmented reality that include an edge-based microprocessor based on neuromorphic computing technology to achieve ultra-low-latency local processing of data from multiple sources, a visualization engine intended to adapt, prioritize, and modify data overlays using artificial intelligence, and a biosensing module intended to sense user physiological signals such as EEG, PPG, or skin conductance so that visualization can be scaled back according to cognitive load or stress. Unlike conventional AR devices that only show passive data and depend on client-side connections and cloud-based processing, the present invention introduces a blockchain-enabled security and token-gated access model to support tamper-proof encryption and authenticated sharing of AR overlays when working collaboratively. Moreover, the invention describes a multimodal interface for interactive control of augmented reality that combines gestures, gaze, micro-expressions, and voice commands, as well as a distributed machine learning synchronization module to simultaneously update augmented reality annotations across multiple users in real time. The inventive contribution of this disclosure is the unique combination of biosensing-driven adaptive visualization, on-device neuromorphic edge processing, and blockchain-based secure collaboration, which together provide technical advantages in the form of latency reduction, contextual adaptability, and privacy assurance, evolving smart glasses from passive display devices into intelligent, adaptive, and secure augmented reality platforms with real-world industrial applications.

[0013] FIG. 1 shows a schematic of a system architecture (100) for augmented reality smart glasses for real-time visualization of data, according to an embodiment of the present invention. The system (100) includes a head-mounted display unit (102) that projects augmented reality overlays into a user’s field of vision. The head-mounted display unit (102) is optically coupled to a lens assembly (104), which provides additional clarity and depth to projected images. An image projection module (106), operatively coupled to the lens assembly (104), is responsible for rendering augmented overlays produced by the processing unit of the system.

[0014] The system (100) also includes an edge-based microprocessor (108) integrated within the housing of the smart glasses, wherein the microprocessor is configured with neuromorphic computing capabilities to process data inputs with ultra-low latency. The microprocessor (108) is operatively connected to an adaptive AR visualization engine (110) that applies AI algorithms, such as reinforcement learning and context-based models, to rank and convert incoming data streams from multiple sources into overlays relevant to the user's context. The system (100) provides an internal memory unit (112) for storing instructions, visualization profiles, and temporary cache data for on-device computation management.
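By way of illustration only, the following Python sketch shows one way such a prioritization step might rank multi-source streams before overlay generation; the DataStream fields, scoring weights, and overlay cap are illustrative assumptions, not features recited by this disclosure.

```python
from dataclasses import dataclass

@dataclass
class DataStream:
    name: str              # e.g. "heart_rate" or "machine_temp" (hypothetical)
    value: float
    task_relevance: float  # 0..1, relevance to the current task context
    urgency: float         # 0..1, e.g. proximity to an alarm threshold

def prioritize_streams(streams, max_overlays=4):
    """Rank incoming streams and keep only the most relevant for overlay.

    The 0.6/0.4 weighting is an assumed stand-in for the engine's learned
    prioritization policy, not the disclosed algorithm."""
    ranked = sorted(
        streams,
        key=lambda s: 0.6 * s.task_relevance + 0.4 * s.urgency,
        reverse=True,
    )
    return ranked[:max_overlays]
```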

[0015] The system (100) includes a biosensing module (114) comprising electroencephalogram (EEG) electrodes, photoplethysmography (PPG) sensors, and skin conductance sensors, configured to determine the user's cognitive load, stress, or fatigue level. The data from the biosensing module (114) is provided to the microprocessor (108), which uses it to dynamically manage AR overlay density, complexity, and prominence in response to changes in the user's state. A security module (116) that uses blockchain technology is operatively connected to the microprocessor (108); it encrypts and authenticates AR overlays and executes token-gated access control to facilitate privacy and secure multi-user collaboration.
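A minimal sketch, assuming crude normalized baselines, of how the biosensing signals named above might be fused into a cognitive-load index that throttles overlay density; all weights, baselines, and thresholds here are assumptions for illustration.

```python
def cognitive_load_index(eeg_beta_alpha_ratio, heart_rate_bpm, skin_conductance_us):
    """Fuse three physiological channels into a 0..1 load estimate."""
    eeg = min(eeg_beta_alpha_ratio / 3.0, 1.0)               # beta/alpha rises with load
    hr = min(max((heart_rate_bpm - 60.0) / 60.0, 0.0), 1.0)  # elevation above rest
    eda = min(skin_conductance_us / 20.0, 1.0)               # electrodermal activity (uS)
    return 0.5 * eeg + 0.3 * hr + 0.2 * eda

def overlay_policy(load):
    """Fewer, simpler overlays as estimated cognitive load rises."""
    if load > 0.75:
        return {"max_overlays": 2, "detail": "minimal"}
    if load > 0.40:
        return {"max_overlays": 4, "detail": "summary"}
    return {"max_overlays": 8, "detail": "full"}
```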

[0016] The system (100) includes a multimodal interaction interface (118) that is operatively connected to the microprocessor (108) and is configured to detect user inputs in the form of gestures, gaze direction, micro-expressions, and voice commands. The multimodal interaction interface (118) applies sensor fusion to detect accurate, relevant user inputs and facilitate control of overlays in hands-free environments. A wireless communication module (120) is operatively coupled with the multimodal interaction interface (118) and the microprocessor (108) to communicate with IoT devices, cloud repositories, or other collaborating users' smart glasses. The system (100) is powered by a rechargeable power supply unit (122) integrated within the frame of the smart glasses. Together, these modules allow the system (100) to convert traditional AR glasses into an intelligent, adaptive, and secure platform for real-time data visualization.
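The sensor-fusion step of the interaction interface could, under simple assumptions, be approximated as confidence-weighted late fusion across modalities, as in the sketch below; the modality weights and acceptance threshold are invented for illustration.

```python
from collections import defaultdict

MODALITY_WEIGHTS = {"gesture": 0.35, "gaze": 0.25, "voice": 0.40}  # assumed

def fuse_commands(candidates):
    """candidates: list of (modality, command, confidence) tuples.

    Returns the highest-scoring command, or None if nothing is trusted."""
    scores = defaultdict(float)
    for modality, command, confidence in candidates:
        scores[command] += MODALITY_WEIGHTS.get(modality, 0.0) * confidence
    if not scores:
        return None
    command, score = max(scores.items(), key=lambda kv: kv[1])
    return command if score >= 0.3 else None  # reject low-confidence fusions

# e.g. fuse_commands([("gaze", "select_overlay", 0.8), ("voice", "select_overlay", 0.6)])
```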

[0017] Together, the components of FIG. 1 work in tandem to achieve the novel and inventive features of the invention: the head-mounted display unit (102) and lens assembly (104) present augmented overlays rendered by the image projection module (106), while the edge-based microprocessor (108) and adaptive AR visualization engine (110) perform ultra-low-latency on-device processing of multi-source inputs to create context-aware overlays. The biosensing module (114) dynamically adapts the visualization to the user's cognitive and physiological state, which reduces cognitive load and improves usability. At the same time, the blockchain-enabled security module (116) provides tamper-proof encryption of the data and token-gated access to ensure secure, authenticated multi-user collaboration, something not offered by traditional systems. The multimodal interaction interface (118) provides intuitive, hands-free user control via gesture, gaze, and voice, while the wireless communication module (120) enables real-time interaction with IoT devices and co-users. The power supply unit (122) powers these modules, allowing the smart glasses to evolve from a passive AR display into an intelligent, adaptive, and secure platform, demonstrating a clear technical improvement over the prior art in latency reduction, contextual adaptability, and privacy-preserving collaboration.

[0018] FIG. 2 shows a block diagram (200) of the functional modules of the smart glasses that provide augmented reality for visualizing real-time data, in accordance with an embodiment of the invention. The system (200) comprises an input acquisition module (202) that accepts multi-source data from IoT devices, onboard sensors, imaging units, and external databases. The input acquisition module (202) is operatively coupled with a sensor fusion unit (204) to integrate raw input streams and ready them for processing, which includes assuring integrity and reducing noise.

[0019] The system (200) includes an edge-based microprocessor module (206) with neuromorphic processing capabilities that analyzes the data locally and minimizes latency. The microprocessor (206) is operatively coupled with an adaptive AR visualization engine (208), which uses artificial intelligence algorithms, such as reinforcement learning and contextual models, to prioritize, transform, and generate relevant AR overlays for the user. The microprocessor (206) is also associated with a local memory unit (210) for storing temporary cache data, the AI models used to compute the overlays, and visualization profiles for efficient real-time computation.
[0020] The block diagram (200) also includes a biosensing module (212) with sensors (for example, EEG electrodes, PPG detectors, and skin conductance sensors) to establish the cognitive and physiological states of the user. The data from the biosensing module (212) is sent through the microprocessor (206) to the adaptive AR visualization engine (208), where it is used to alter the visualization density and complexity in response to stress, fatigue, or workload. The microprocessor (206) is operatively coupled with a blockchain-enabled security module (214) responsible for the encryption, authentication, and token-gated access of AR overlays in collaborative environments, allowing sensitive visual information to be shared in a privacy-preserving and tamper-proof manner. The system (200) also includes a multimodal interaction interface (216) to collect and interpret gesture, gaze, micro-expression, and voice inputs from the user. The multimodal interaction interface (216) feeds the adaptive AR visualization engine (208) so that overlays are modified in response to user commands in real time. The system also includes a wireless communication module (218) to interface with IoT devices, cloud repositories, or peer smart glasses to support distributed collaborative visualization. The entire system is powered by a rechargeable power supply unit (220) embedded within the smart glasses, providing seamless portability. Collectively, the functional modules in FIG. 2 demonstrate how tightly integrated the architecture is as it carries out intelligent, adaptive, and secure real-time AR data visualization.
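Token-gated access as described could take many forms; the sketch below substitutes a plain HMAC for the on-chain token registry purely to show the gating logic. The key, token format, and expiry scheme are assumptions, not the disclosed blockchain mechanism.

```python
import hashlib
import hmac
import time

SECRET = b"hypothetical-device-key"  # stand-in for blockchain-anchored credentials

def issue_token(user_id, overlay_id, ttl_s=300):
    """Grant time-limited access to one overlay for one user."""
    expiry = int(time.time()) + ttl_s
    payload = f"{user_id}|{overlay_id}|{expiry}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def verify_token(token):
    """Check signature and expiry before releasing the overlay."""
    payload, sig = token.rsplit("|", 1)
    _, _, expiry = payload.split("|")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and time.time() < int(expiry)
```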

[0021] Collectively, the components of FIG. 2 operate together to realize the novel and inventive aspects of the present invention: the input acquisition module (202) and sensor fusion unit (204) provide accurate and seamless acquisition of multi-source data, which is processed by the edge microprocessor (206) and adaptive AR visualization engine (208) to produce low-latency, context-aware overlays. The biosensing module (212) allows for user-centric adaptability because the AR visualization complexity can be adjusted according to real-time physiological states, reducing cognitive load and improving usability. The blockchain-enabled security module (214) contributes to the inventive step by providing tamper-proof encryption and authentication as well as token-gated access for secure collaboration, an issue not typically solved by AR systems. The multimodal interaction interface (216) enables natural, hands-free interaction with the wearable AR platform for real-time data visualization, and the wireless communication module (218) allows for synchronization with co-users and internet-enabled IoT and AI devices. Powered by the power supply unit (220), these modules together form a personalized, adaptive, and privacy-preserving AR platform, representing a significant technological advancement in real-time data visualization, secure collaboration, and adaptive experience compared to past augmented reality devices.

[0022] In a typical operation, the system for real-time data visualization through smart glasses with augmented reality captures, processes, and projects multi-source information into the user's field of view to support the current task and situational awareness. The system consists of a head-mounted display (HMD) unit with a lens assembly (104) and projection module (106) to visualize the computed overlays. The system further includes an edge-based microprocessor (108) that works with an adaptive AR visualization engine (110) to perform on-device neuromorphic computation for adaptive, context-aware, real-time visualizations with ultra-low latency. The system includes a biosensing module (114) for capturing physiological signals, such as EEG, PPG, or skin conductance, to estimate the user's state and adaptively adjust the overlays accordingly. The system further includes a blockchain-enabled security unit (116) to encrypt and authenticate the overlays, whereby access to an overlay can be token-gated to refine user access control. The system, in an embodiment, further includes a multimodal interaction interface (118) to fuse the user's gestures, eye-gaze behavior, micro-expressions, and voice, allowing natural, intuitive, hands-free operation. The system, in an embodiment, further includes a wireless communication module (120) to synchronize the wearer with IoT devices, cloud services, and other users as the environment changes over time, and a rechargeable power supply unit (122) to provide portable, off-grid operation of all hardware. In one example, the adaptive augmented reality (AR) visualization engine (110) continuously adapts the complexity and density of overlays based on the user's biosensing data to reduce cognitive load, while the blockchain module (116) provides privacy-preserving multi-user collaboration. Consequently, the system provides an intelligent, adaptive, and secure AR-based visualization platform with novelty, an inventive combination, and technical enhancement over traditional AR devices.

[0023] In one example, the processor is configured to perform neuromorphic computations for real-time analysis of combined data from multiple sources, thereby facilitating seamless AR visualizations without requiring an external server and reducing visualization latency. In one example, the processor is configured to work with the adaptive AR visualization engine to dynamically scale the user overlay using reinforcement learning and contextual artificial intelligence models. In one example, the processor is configured to work with a biosensing module, receive physiological signals such as EEG, PPG, and skin conductance, and adapt the visualization density, brightness, and/or complexity based on the user's cognitive load or stress. In one example, the processor is configured with a blockchain security layer to encrypt overlays, authenticate data exchanges, and provide token-gated access for authorized users in collaborative AR environments. In one example, the processor is configured to enable multimodal interaction by interpreting gesture, gaze, voice, or micro-expression inputs through sensor fusion for intuitive user control. In one example, the processor is configured with a distributed machine learning framework for federated training and synchronization of AR annotations in multi-user AR experiences in industrial, healthcare, or defense contexts.
[0024] In another embodiment of the current invention, the system for real-time data visualization using smart glasses with augmented reality includes a collaborative AR synchronization module configured to allow multiple users wearing the smart glasses to access, annotate, and update augmented overlays simultaneously and in real time, where the collaborative AR synchronization module uses distributed machine learning and federated training methods to receive and authorize updates from one user and securely propagate those updates to all other authorized users without centralized processing. The collaborative AR synchronization module works in combination with the blockchain-enabled security layer to authenticate users' identities and facilitate verified, tamper-proof sharing of overlays among devices. In addition, the system can integrate continuous data streams from IoT-enabled machinery, medical instruments, or defense sensors into the shared AR context, where every user receives overlays optimally grounded in context based on their duties and position. This embodiment enhances the collective intelligence and situational awareness of teams engaged in high-impact settings, such as surgical suites, industrial automation, or battlefield missions, and represents a clear technological improvement over past AR systems that operated as stand-alone applications.
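At its core, the federated update path described above could reduce to averaging parameter updates from authorized peers, as in this deliberately simplified sketch; real federated training involves client weighting, secure aggregation, and model-specific structure not shown here.

```python
def federated_average(client_updates):
    """Element-wise mean of per-user parameter vectors (plain lists of
    floats), the basic aggregation step of federated averaging."""
    n = len(client_updates)
    return [sum(component) / n for component in zip(*client_updates)]

# e.g. three users' local annotation-model updates, aggregated without any
# user's raw data leaving their device:
# federated_average([[0.1, 0.2], [0.3, 0.0], [0.2, 0.4]])  -> [0.2, 0.2]
```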

[0025] In yet another embodiment of the current invention, the system for real-time data visualization using smart glasses with augmented reality includes an intelligent, context-aware energy management module configured to conserve energy in the glasses by allocating available processing resources based on task criticality and the user's state. The energy management module works alongside the edge-based microprocessor and the biosensing module so that during critical or high-stress events the processor prioritizes critical overlays over background computation, and during low-load periods, such as idle, it enters a power-management state without interfering with the visualizations. The energy management module is also capable of predicting energy needs with machine learning algorithms to decide when to power subsystems on or off, such as the blockchain security layer, the multimodal interaction interface, and the wireless communication module, thus extending the operational lifetime of the battery. This embodiment maintains real-time operation under energy constraints and represents a level of novelty over conventional AR glasses, which do not include adaptive energy optimization capabilities.
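One simple reading of such a policy is a budgeted power plan that keeps everything on during critical tasks and sheds optional subsystems otherwise; the subsystem names mirror the disclosure, but the budget heuristic and draw figures are invented for this sketch.

```python
SUBSYSTEMS = ["blockchain_security", "multimodal_interface", "wireless_comm"]

def power_plan(task_criticality, battery_pct, predicted_draw_mw):
    """Decide which optional subsystems stay powered.

    task_criticality: 0..1; battery_pct: 0..100;
    predicted_draw_mw: per-subsystem draw estimates from the ML predictor."""
    if task_criticality > 0.8:                 # critical task: keep everything on
        return {s: True for s in SUBSYSTEMS}
    plan = {}
    budget = battery_pct * 10.0                # crude mW budget heuristic (assumed)
    for s in SUBSYSTEMS:
        cost = predicted_draw_mw.get(s, 100.0)
        plan[s] = cost <= budget
        if plan[s]:
            budget -= cost
    return plan
```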

[0026] Consider a practical example of how the present disclosure may operate. Suppose a surgeon is performing a minimally invasive procedure with augmented reality smart glasses: the input acquisition module (202) receives real-time patient vitals from IoT-enabled medical devices and imaging feeds from endoscopic cameras, which are processed locally through the edge-based microprocessor (206) to create ultra-low-latency overlays. The adaptive AR visualization engine (208) highlights critical parameters, such as oxygen saturation and heart rate, in the surgeon’s field of view while minimizing less critical data to reduce distraction. The biosensing module (212) monitors the surgeon’s stress level and can simplify the overlays if physiological stress rises above a threshold. Meanwhile, the blockchain-enabled security module (214) encrypts all surgical data so that it can only be accessed by authorized collaborators observing via a remote connection. The multimodal interaction interface (216) allows the surgeon to change imaging modes using gaze and voice commands, thereby maintaining sterility. This scenario illustrates how the system enhances precision, safety, and efficiency by providing adaptive, secure, real-time AR visualization in a critical healthcare setting.

[0027] The FIG. 3 flowchart (300) illustrates a method for providing real-time data visualization using smart glasses with augmented reality, pursuant to an embodiment of the present disclosure. The method begins at a starting step (302) and proceeds to a data acquisition step (304), in which multi-source data inputs from Internet-of-Things (IoT) devices, onboard sensors, imaging units, and external databases are gathered through the input acquisition module (202). In a processing step (306), the input data is sent to the edge-based microprocessor (206) and run through neuromorphic computations for ultra-low-latency analysis. In a visualization generation step (308), the adaptive AR visualization engine (208) converts the processed data into prioritized augmented overlays using artificial intelligence models. In a biosensing adaptation step (310), the visualization complexity is dynamically adjusted according to physiological signals transmitted from the biosensing module (212), reflecting the user’s stress or cognitive load. In a security and authentication step (312), the blockchain-enabled security module (214) encrypts and authenticates overlays and grants token-gated access to authorized collaborators, and in a projection step (314), the overlays are projected into the user’s field of vision through the head-mounted display unit (102). In a user interaction step (316), multimodal commands (for example, gesture, gaze, and voice) are received through the multimodal interaction interface (216), and the overlays are recalibrated in real time based on those commands. In a synchronization step (318), the AR overlays and annotations are securely shared with other authorized users using the wireless communication module (218). The method concludes with an end step (320), after which the cycle of real-time AR-based visualization repeats.
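For orientation, the flowchart's loop can be sketched as a single cycle with each stage injected as a callable; this is a structural sketch of steps (304) through (318) only, not an implementation of the disclosed hardware, and every function name is a placeholder.

```python
def visualization_cycle(acquire, process, generate, adapt, secure, project,
                        interact, sync):
    """One pass through the FIG. 3 pipeline; each argument is a callable
    standing in for the corresponding module."""
    data = acquire()                # (304) multi-source data acquisition
    features = process(data)       # (306) edge/neuromorphic processing
    overlays = generate(features)  # (308) adaptive overlay generation
    overlays = adapt(overlays)     # (310) biosensing-based adaptation
    overlays = secure(overlays)    # (312) encryption and token-gated auth
    project(overlays)              # (314) head-mounted display projection
    overlays = interact(overlays)  # (316) multimodal recalibration
    sync(overlays)                 # (318) multi-user synchronization
```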

[0028] The current disclosure provides a range of technological advantages over existing smart glasses, which operate solely as passive display devices with poor latency, poor adaptation, and poor security. First, the disclosed system uses a biosensing module (212) implementing three types of sensors (EEG, PPG, and skin conductance) to actively modulate AR overlays based on the user's physiological and cognitive state, reducing cognitive load and improving overall usability and adaptability. This adaptability is possible because of the edge-based microprocessor (206), which exhibits neuromorphic processing functionality as described above and analyzes data on-device with minimal latency and no dependence on external servers, whereas the prior art relies on cloud platforms such as Azure or AWS. Further, the current disclosure facilitates tamper-proof encryption, authentication, and token-gated access control through the integrated blockchain-enabled security module (214), addressing data privacy and integrity issues that were neither clearly articulated nor solved in the prior art. Finally, the current disclosure accommodates a multimodal interaction interface (216), spanning gestures, gaze, voice, and micro-expressions, through which a user can safely control the experience hands-free and intuitively, while the collaborative synchronization module provides real-time multi-party interaction, with all users sharing and updating overlays through a distributed machine learning framework. Taken together, these technical contributions exhibit a substantial technical advancement in usability and capability, owing to adaptability, reduced latency, security, collaborative operation, and user-directed visualization, forming the inventive step over known solutions.

[0029] The current disclosure provides a concrete and tangible technical solution to a distinctly technical problem in augmented-reality devices, namely high latency in real-time visualization, poor adaptive capability, poor data security, and poor multi-user collaboration. The specific technical features and functionalities include an edge-based microprocessor (206) with a neuromorphic computing architecture providing ultra-low-latency on-device processing; an adaptive AR visualization engine (208) that prioritizes and transforms overlays using artificial intelligence models tuned to the user; and a biosensing module (212) that tracks cognitive load and stress and recalibrates visualization complexity based on that information. The system further employs a blockchain-based encryption and authentication module for overlays and shared annotations, enabling a secure, collaborative multi-user experience. Furthermore, a multimodal interaction interface (216) gives the user intuitive gesture-, gaze-, and voice-based control, supplemented by micro-expressions, for hands-free operation, while a real-time collaborative synchronization module incorporates a distributed machine learning framework for sharing and validating overlays in real time. As a whole, these hardware-software integrated features transform present-day smart glasses from a passive display tool into an intelligent, adaptive, privacy-preserving augmented reality platform capable of real-time data visualization, presenting a clear technical advancement as a solution to a real-world technical problem.

Claims:
CLAIMS
We Claim:
1. A system for real-time data visualization using smart glasses with augmented reality, the system comprising:
a) a head-mounted display unit configured to project augmented reality overlays in a user’s field of view;
b) an edge-based microprocessor configured with neuromorphic computing capabilities for on-device, ultra-low latency analysis of multi-source input data;
c) an adaptive AR visualization engine operatively coupled to the microprocessor, the engine configured to dynamically transform and prioritize overlays using artificial intelligence models;
d) a biosensing module comprising at least one of an electroencephalogram (EEG) sensor, a photoplethysmography (PPG) sensor, or a skin conductance sensor, configured to capture physiological signals of a user for adjusting visualization complexity;
e) a blockchain-enabled security module configured to encrypt overlays, authenticate data streams, and enable token-gated access control for secure multi-user collaboration; and
f) a multimodal interaction interface configured to receive user inputs via gestures, gaze, micro-expressions, or voice commands for interactive control of AR overlays.
2. The system as claimed in claim 1, wherein the adaptive AR visualization engine employs reinforcement learning and contextual artificial intelligence models to adjust visualization parameters based on real-time task criticality and user cognitive load.
3. The system as claimed in claim 1, wherein the blockchain-enabled security module comprises a smart contract framework configured to provide tamper-resistant authentication and secure sharing of AR overlays among authorized devices.
4. The system as claimed in claim 1, wherein the multimodal interaction interface employs sensor fusion algorithms to integrate gesture recognition, gaze tracking, and voice commands for hands-free control of overlays.

5. The system as claimed in claim 1, further comprising a collaborative synchronization module configured to employ distributed machine learning for real-time updating, validation, and sharing of AR overlays across multiple users.
6. A method for real-time data visualization using smart glasses with augmented reality, the method comprising:
a) acquiring multi-source input data from at least one of IoT devices, onboard sensors, imaging units, or external databases;
b) processing the input data locally using an edge-based microprocessor configured with neuromorphic computing capabilities;
c) generating augmented reality overlays using an adaptive AR visualization engine; and
d) projecting the overlays in a user’s field of view via a head-mounted display unit.
7. The method as claimed in claim 6, further comprising capturing physiological signals of the user through a biosensing module and dynamically modifying the visualization complexity in accordance with the user’s stress or cognitive load.
8. The method as claimed in claim 6, further comprising encrypting and authenticating the augmented reality overlays using a blockchain-enabled security module, wherein token-gated access control ensures privacy-preserving collaboration.
9. The method as claimed in claim 6, further comprising receiving multimodal user commands through a multimodal interaction interface configured to interpret gestures, gaze, micro-expressions, and voice inputs for modifying AR overlays in real time.
10. The method as claimed in claim 6, further comprising synchronizing AR overlays and annotations across multiple users by employing a collaborative synchronization module configured with distributed machine learning for federated updates.

Documents

Application Documents

# Name Date
1 202511079814-STATEMENT OF UNDERTAKING (FORM 3) [22-08-2025(online)].pdf 2025-08-22
2 202511079814-REQUEST FOR EARLY PUBLICATION(FORM-9) [22-08-2025(online)].pdf 2025-08-22
3 202511079814-POWER OF AUTHORITY [22-08-2025(online)].pdf 2025-08-22
4 202511079814-FORM-9 [22-08-2025(online)].pdf 2025-08-22
5 202511079814-FORM 1 [22-08-2025(online)].pdf 2025-08-22
6 202511079814-DRAWINGS [22-08-2025(online)].pdf 2025-08-22
7 202511079814-DECLARATION OF INVENTORSHIP (FORM 5) [22-08-2025(online)].pdf 2025-08-22
8 202511079814-COMPLETE SPECIFICATION [22-08-2025(online)].pdf 2025-08-22