
A System for Digital Marketer-Augmented Reality Technology-Digital Customer (DARD) Framework

Abstract: Disclosed herein is a system (100) for a digital marketer-augmented reality technology-digital customer (DARD) framework. The system (100) comprises at least one smart device (102) collecting real-time data from a user. The system (100) comprises a plurality of light detection and ranging and depth sensors (106) configured to perform real-world three-dimensional (3D) spatial mapping for realistic augmented reality (AR) experiences. The system (100) comprises a processing unit (108) further comprising a data acquisition module (110) configured to capture real-time data from the smart device (102). The system (100) also comprises a deep learning module (112) configured to analyze real-time user data. The system (100) comprises an augmented reality engine (114) operable for three-dimensional (3D) graphics rendering, image recognition, and adaptive user interface personalization. The system (100) comprises a blockchain-based smart contract module (116). The system (100) comprises a 5G-enabled edge computing infrastructure (118) and a neural optimization engine (120).


Patent Information

Application #
Filing Date
05 May 2025
Publication Number
22/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

SR UNIVERSITY
ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Inventors

1. MD ABDUL WAHEB
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
2. DR N SUMAN KUMAR
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Specification

Description:
FIELD OF DISCLOSURE
[0001] The present disclosure generally relates to a marketing system and specifically relates to an augmented reality (AR)-based digital marketing system.
BACKGROUND OF THE DISCLOSURE
[0002] In today's digital era, customer expectations for seamless, personalized, and responsive experiences are higher than ever. Businesses that fail to engage customers in real-time risk losing valuable opportunities to build loyalty, address concerns promptly, and drive conversions. Traditional customer engagement methods, such as email responses or static web forms, often result in missed chances to capture customer intent or sentiment at critical moments.
[0003] Moreover, many marketers lack the infrastructure to convert real-time interactions into actionable insights, limiting their ability to optimize user experiences dynamically. As competition within the marketing and advertisement sector intensifies and data-driven decision-making becomes a business imperative, there is a pressing need for intelligent, real-time customer engagement solutions that not only interact with users but also generate immediate, meaningful insights to inform product development, marketing, and support strategies.
[0004] Among emerging technological solutions, augmented reality (AR) technology has emerged as a game changer across various sectors, with the potential to enhance user engagement. Presently, there are a few AR-based technological solutions aimed at enhancing continual user engagement, but they are riddled with many limitations. Microsoft HoloLens, a cutting-edge mixed reality device, faces significant challenges due to its high cost, bulky and specific hardware requirements, and limited accessibility. Thus, this solution is impractical for the mass consumer engagement required for marketing campaigns.
[0005] Another commonly used solution, Snapchat AR Filters, while highly popular, is limited by platform dependence and short-term user engagement, making it unsuitable for mass marketing and buyer conversion. Another popular solution, IKEA Place, is limited by inaccurate product representation that can lead to mismatched expectations and potential consumer dissatisfaction. Yet another solution, Sephora Virtual Artist, is limited in its adaptability to other brands and cross-platform usage. Pokémon GO, one of the most successful AR-based solutions, is highly expensive to build and offers limited customization capabilities.
[0006] Currently, there is no unified model that connects digital marketers, AR experiences, and customer insights, and the existing AR-based solutions do not dynamically adjust content based on user behavior and biometric feedback. Additionally, consumers hesitate to engage with many of the existing AR solutions due to unclear data usage policies. In particular, current AR-based marketing campaigns are static and fail to learn from consumer interactions for campaign optimization. Thus, in light of the aforementioned limitations, there is a need for a cost-effective, user-friendly, and flexible AR-based solution that can enhance consumer engagement in real time in the marketing and advertisement sector.
[0007] Thus, the disclosed invention provides an AR-based solution for real-time consumer engagement and marketing insight generation.
SUMMARY
[0008] The following is a summary description of illustrative embodiments of the invention. It is provided as a preface to assist those skilled in the art to more rapidly assimilate the detailed design discussion which ensues and is not intended in any way to limit the scope of the claims which are appended hereto in order to particularly point out the invention.
[0009] Embodiments in accordance with the present invention provide a system for digital marketer-augmented reality technology-digital customer (DARD) framework. Embodiments in accordance with the present invention further provide a computer-implemented method for delivering personalized and real-time augmented reality (AR) marketing content using a distributed augmented reality delivery (DARD) framework.
[0010] Embodiments of the present invention may provide a number of advantages depending on its particular configuration. First, embodiments of the present application provide a system for digital marketer-augmented reality technology-digital customer (DARD) framework. Next, embodiments of the present application provide a computer-implemented method for delivering personalized and real-time augmented reality (AR) marketing content using a distributed augmented reality delivery (DARD) framework.
[0011] The present disclosure addresses the major limitations of traditional systems.
[0012] An objective of the present disclosure is to create highly interactive and personalized augmented reality (AR) experiences that capture consumer attention and foster meaningful interactions in real time and generate actionable insights.
[0013] Another objective of the present disclosure is to revolutionize customer engagement by integrating augmented reality (AR) into digital marketing strategies by bridging the gap between virtual content and real-world user interactions.
[0014] Another objective of the present disclosure is to offer immersive experiences that drive deeper engagement and actionable business insights.
[0015] Another objective of the present disclosure is to provide marketers with an intuitive platform for designing, deploying, and analyzing AR-based campaigns without requiring extensive technical expertise.
[0016] Another objective of the present disclosure is to generate actionable consumer insights by transforming raw data into insights for making informed marketing decisions.
[0017] Another objective of the present disclosure is to enhance user satisfaction and strengthen brand loyalty through immersive consumer experiences.
[0018] Yet another objective of the present disclosure is to assist digital marketers in identifying the factors that influence and drive the adoption of Augmented Reality (AR) technology.
[0019] Yet another objective of the present disclosure is to integrate AR technology to enhance real-time user experiences (UX), providing immersive and engaging interactions with consumers.
[0020] Yet another objective of the present disclosure is to help marketers gain valuable consumer insights, enabling them to better understand consumer behavior and preferences by leveraging AR technology.
[0021] Yet another objective of the present disclosure is to bridge the gap between digital marketers, AR technology, and digital customers, fostering innovation and enhanced customer engagement in marketing efforts.
[0022] In light of the above disclosure, in an aspect of the present disclosure, a system for a digital marketer-augmented reality technology-digital customer (DARD) framework is disclosed herein. The system comprises at least one smart device collecting real-time data from a user. The smart device is integrated with a plurality of sensors. The smart device is augmented reality (AR)-enabled. The system also comprises a plurality of light detection and ranging and depth sensors operationally coupled to the smart device, the light detection and ranging and depth sensors configured to perform real-world three-dimensional (3D) spatial mapping for realistic augmented reality (AR) experiences. The system also comprises a processing unit operationally coupled to the smart device and the light detection and ranging and depth sensors. The processing unit further comprises a data acquisition module executed on the processing unit, the data acquisition module configured to capture real-time data from the smart device. The processing unit further comprises a deep learning module executed on the processing unit and operationally coupled to the data acquisition module, the deep learning module configured to analyze real-time user data including gaze tracking, emotional response, electroencephalogram (EEG) signals, device movement, demographic data, and interaction history. The processing unit further comprises an augmented reality engine executed on the processing unit and operationally coupled to the light detection and ranging and depth sensors, the augmented reality engine operable for three-dimensional (3D) graphics rendering, image recognition, and adaptive user interface personalization.
The processing unit further comprises a blockchain-based smart contract module executed on the processing unit, the blockchain-based smart contract module configured to securely store anonymized user interactions and issue non-fungible token (NFT)-based loyalty or engagement rewards. The processing unit further comprises a 5G-enabled edge computing infrastructure operationally coupled to the deep learning module, the 5G-enabled edge computing infrastructure configured to enable high-speed processing of artificial intelligence (AI)-augmented reality (AR) interactions. The processing unit further comprises a neural optimization engine operationally coupled to the 5G-enabled edge computing infrastructure, the neural optimization engine configured to read and process subconscious user responses to augmented reality (AR) content, and to dynamically optimize advertising elements as per the user responses.
[0023] In one embodiment, the neural optimization engine adapts the advertising elements based on subconscious emotional responses, interaction data trends, and contextual environmental factors.
[0024] In one embodiment, the system also includes at least one marketer interface integrated with augmented reality (AR)-powered virtual shopping assistants and mixed reality (MR) shopping feature.
[0025] In one embodiment, the augmented reality (AR)-powered virtual shopping assistants and mixed reality (MR) shopping feature helps the user in visualizing and making purchasing decisions in real-time augmented reality (AR) stores.
[0026] In one embodiment, the smart device includes smart mirrors, wearables, and smart displays for personalized augmented reality (AR) experiences.
[0027] In one embodiment, the sensors include biometric sensors for capturing brainwave, heart rate, eye-tracking, and emotion data to optimize augmented reality (AR) advertisements.
[0028] In one embodiment, the light detection and ranging and depth sensor captures the real-time environment of the user.
[0029] In another aspect of the present disclosure, a computer-implemented method for delivering personalized and real-time augmented reality (AR) marketing content using a distributed augmented reality delivery (DARD) framework is disclosed herein. The method comprises collecting real-time user data including gaze direction, emotional response, facial expressions, electroencephalogram (EEG) signals, heart rate, and device orientation via at least one smart device. The method also comprises analyzing the collected user data using a deep learning module executed on a processing unit to determine the user's attention level, emotional engagement, and behavioral patterns. The method also comprises rendering and displaying personalized augmented reality (AR) content via an augmented reality engine executed on the processing unit for spatial mapping, 3D model overlays, and adaptive interfaces, using the input provided by a plurality of light detection and ranging and depth sensors. The method also comprises dynamically adjusting the augmented reality (AR) content via a 5G-enabled edge computing infrastructure based on user feedback, environmental context, and subconscious responses inferred from the outputs of the deep learning module and the augmented reality engine. The method also comprises optimizing the augmented reality (AR) marketing content using a neural optimization engine.
[0030] In one embodiment, the method further includes recording anonymized interaction data and user engagement metrics on a blockchain ledger using a blockchain-based smart contract module.
[0031] In one embodiment, the method further includes issuing non-fungible token (NFT)-based engagement rewards to the user using the blockchain-based smart contract module.
[0032] These and other advantages will be apparent from the present application of the embodiments, which solve the abovementioned limitations of traditional systems.
[0033] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0034] These elements, together with the other aspects of the present disclosure and various features are pointed out with particularity in the claims annexed hereto and form a part of the present disclosure. For a better understanding of the present disclosure, its operating advantages, and the specified object attained by its uses, reference should be made to the accompanying drawings and descriptive matter in which there are illustrated exemplary embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0035] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0036] FIG. 1 illustrates a block diagram of a system for digital marketer-augmented reality technology-digital customer (DARD) framework, according to an embodiment of the present invention;
[0037] FIG. 2 illustrates a flowchart of a computer-implemented method for delivering personalized and real-time augmented reality (AR) marketing content using a distributed augmented reality delivery (DARD) framework, according to an embodiment of the present invention; and
[0038] FIG. 3 illustrates an exemplary digital marketer-augmented reality technology-digital customer (DARD) framework, according to another embodiment of the present invention.
[0039] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0040] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0041] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0042] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0043] FIG. 1 illustrates a block diagram of a system 100 for digital marketer-augmented reality technology-digital customer (DARD) framework, according to an embodiment of the present invention.
[0044] The system 100 may comprise at least one smart device 102, a plurality of light detection and ranging and depth sensors 106, a processing unit 108, a data acquisition module 110, a deep learning module 112, an augmented reality engine 114, a blockchain-based smart contract module 116, a 5G-enabled edge computing infrastructure 118, and a neural optimization engine 120.
[0045] The smart device 102 may be collecting real-time data from a user. The smart device 102 may be integrated with a plurality of sensors 104. The smart device 102 may be augmented reality (AR)-enabled.
[0046] The smart device 102 may include smart mirrors, wearables, and smart displays for personalized augmented reality (AR) experiences.
[0047] The sensors 104 may include biometric sensors for capturing brainwave, heart rate, eye-tracking, and emotion data to optimize augmented reality (AR) advertisements.
[0048] In an embodiment of the present disclosure, the smart device 102 may be integrated into retail environments or mobile platforms. The smart device 102 may be capable of providing adapted AR content in real-time based on the data collected from the user through the sensors 104. In an exemplary embodiment, the AR content and experiences may include virtual try-ons, interactive advertisements, guided shopping assistance, or emotion-based product recommendations to support both passive and interactive engagement modes depending on the user preferences and context. Embodiments of the present disclosure are intended to include or otherwise cover various smart devices such as but not limited to, electronic or smart processing units, prior art, or later developed devices.
[0049] In a preferred embodiment, the smart device 102 may be equipped with AR rendering capabilities and support for wireless communication protocols such as Bluetooth, Wi-Fi, or 5G for seamless data transmission for communicating bilaterally with the processing unit 108. Embodiments of the present disclosure are intended to include or otherwise cover various wireless and wired communication protocols such as, but not limited to, existing protocols, prior art, and later developed technologies.
[0050] In an embodiment of the present disclosure, the sensors 104 may be embedded within or connected to the smart device 102 for detecting and interpreting brainwave activity (via EEG), heart rate variability, eye movement, facial expressions, and other emotion-relevant signals. In an exemplary embodiment, a smart mirror may be deployed in retail environments to enable virtual garment try-ons and targeted visual promotions. In an embodiment of the present disclosure, the sensors 104 may include, but are not limited to, brain-computer interface (BCI) sensors, optical heart rate monitors, eye-tracking cameras, and facial emotion recognition modules, to support multi-modal AR personalization and consumer engagement strategies that surpass traditional, one-dimensional digital marketing approaches.
[0051] The plurality of light detection and ranging and depth sensors 106 may be operationally coupled to the smart device 102, and the light detection and ranging and depth sensors 106 may be configured to perform real-world three-dimensional (3D) spatial mapping for realistic augmented reality (AR) experiences.
[0052] The light detection and ranging and depth sensor 106 may capture the real-time environment of the user.
[0053] In an embodiment of the present disclosure, the light detection and ranging and depth sensor 106 may be embedded directly into the smart device 102 or externally mounted as modular components connected to the smart device 102. In an exemplary embodiment, in a retail setting, a smart mirror may be connected to an external LiDAR sensor to scan a user's full-body dimensions in real time to enable accurate virtual try-ons. In another exemplary embodiment, wearable AR glasses may be embedded with miniaturized depth sensors to facilitate immersive navigation, spatial guidance, or contextual advertising based on the user's surroundings. The light detection and ranging and depth sensors 106 provide a seamless blend of the physical and digital worlds by aiding in the generation of precise three-dimensional (3D) spatial data.
[0054] In an embodiment of the present disclosure, the light detection and ranging and depth sensor 106 may include, but is not limited to, Time-of-Flight (ToF) sensors, structured-light sensors, and mechanical or solid-state LiDAR units. In an embodiment of the present disclosure, the light detection and ranging and depth sensor 106 may be responsive, accurate, and adaptable across various deployment scenarios.
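As a minimal numerical illustration of the Time-of-Flight principle named above (and not any particular vendor's sensor API), a ToF sensor derives depth from the round-trip travel time of an emitted light pulse, using the standard relation d = c * t / 2:

```python
# Illustrative sketch only: the physical relation underlying a
# Time-of-Flight depth sensor. No vendor-specific API is assumed.

SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum

def tof_distance_m(round_trip_seconds: float) -> float:
    """Depth in meters from a measured round-trip pulse time in seconds."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0
```

A pulse that returns after the time light needs to travel two meters corresponds to a surface one meter away, which is why the division by two appears.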
[0055] The processing unit 108 may be operationally coupled to the smart device 102 and the light detection and ranging and depth sensors 106. The processing unit 108 further comprises a data acquisition module 110 executed on the processing unit 108, the data acquisition module 110 configured to capture real-time data from the smart device 102.
[0056] In an embodiment of the present disclosure, the processing unit 108 may be a microcontroller unit capable of processing incoming data locally. In an embodiment of the present disclosure, the processing unit 108 may be a cloud-based processing server unit for scalable analytics, hosting real-time personalization engines, and hosting blockchain modules to secure and manage sensitive biometric data.
[0057] In an embodiment of the present disclosure, the data acquisition module 110 may serve as an intermediary that facilitates the continuous collection, organization, and forwarding of data streams generated by the smart device 102. In some embodiments, the data acquisition module 110 may be embedded directly within the firmware of the smart device 102 or operated as a standalone software component within the processing unit 108. In some embodiments, the data acquisition module 110 may perform pre-processing of the incoming data from the smart device 102.
[0058] In an embodiment of the present disclosure, the data acquisition module 110 may be built using lightweight, low-latency programming frameworks to ensure minimal delay between data capture and processing. To manage high-throughput data streams, the data acquisition module 110 may incorporate buffering mechanisms, timestamping, compression protocols, and secure data transmission layers. The data acquisition module 110 may also ensure compatibility with downstream analysis by forming a foundation for real-time decision-making and AR content delivery. In an embodiment of the present disclosure, the data acquisition module 110 may perform different types of data acquisition including, but not limited to, biometric acquisition, spatial acquisition, and multi-modal acquisition.
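The buffering, timestamping, and compression behavior described for the data acquisition module 110 can be sketched as follows. This is a simplified stand-in, not the actual implementation; the class name, field names, and back-pressure policy are invented for illustration:

```python
import time
import zlib
from collections import deque

class DataAcquisitionBuffer:
    """Hypothetical sketch of the data acquisition module's buffering,
    timestamping, and compression steps (names are illustrative)."""

    def __init__(self, max_samples=1024):
        # Bounded buffer: under back-pressure the oldest sample is dropped.
        self.buffer = deque(maxlen=max_samples)

    def ingest(self, stream_name, payload: bytes):
        # Timestamp on arrival so downstream modules can align the
        # biometric, spatial, and interaction streams.
        record = {
            "stream": stream_name,
            "ts": time.time(),
            "data": zlib.compress(payload),  # lightweight compression
        }
        self.buffer.append(record)
        return record

    def drain(self):
        # Decompress and forward all buffered records downstream.
        records = [
            {**r, "data": zlib.decompress(r["data"])} for r in self.buffer
        ]
        self.buffer.clear()
        return records
```

The bounded deque models the "high-throughput" requirement: rather than stalling the sensor pipeline, the oldest unprocessed sample is silently evicted, which is a common choice for real-time telemetry where freshness matters more than completeness.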
[0059] The deep learning module 112 may be executed on the processing unit 108 and operationally coupled to the data acquisition module 110, and the deep learning module 112 may be configured to analyze real-time user data including gaze tracking, emotional response, electroencephalogram (EEG) signals, device movement, demographic data, and interaction history.
[0060] In an embodiment of the present disclosure, the deep learning module 112 may continuously process brainwave activity (via EEG), heart rate variability, eye movement, facial expressions, and other emotion-relevant signals in real time to gain insight into the user's current emotional and cognitive state.
[0061] In an embodiment of the present disclosure, the deep learning module 112 may analyze complex, multi-dimensional user data collected from the smart device 102 in real-time to derive insights that drive personalized and adaptive augmented reality (AR) experiences. In some embodiments, the deep learning module 112 may process input signals such as gaze tracking patterns, emotional responses (e.g., facial expressions or voice sentiment), electroencephalogram (EEG) signals, movement, demographic attributes, and historical interaction data.
[0062] In an embodiment of the present disclosure, the deep learning module 112 may use a combination of convolutional neural networks (CNNs), recurrent neural networks (RNNs), and transformer-based models. In an embodiment of the present disclosure, the deep learning module 112 may identify patterns and correlate them with predicted user behavior.
[0063] In an embodiment of the present disclosure, the deep learning module 112 may perform feature extraction, model inference runtime, and feedback processing. In some embodiments of the present disclosure, the deep learning module 112 may use CNNs to process image and video frames. EEG signals and time-series biometric data may be analyzed using RNNs or LSTMs (long short-term memory networks) to detect cognitive states such as attention or stress. Demographic data and interaction logs may be encoded and used to fine-tune recommendation by the deep learning module 112 using supervised learning techniques. The deep learning module 112 may be also capable of continuous learning for adapting the outputs, as more user data is collected over time. In some embodiments, the deep learning module 112 may utilize edge AI acceleration for low-latency responses and integrate with a federated learning framework to maintain data privacy while improving performance across devices.
[0064] In an exemplary embodiment, for wearable AR glasses, the deep learning module 112 may interpret gaze direction and EEG activity to adapt visual content on-the-fly. In another exemplary embodiment, for retail smart mirrors, the deep learning module 112 may assess emotional expressions and interaction history to recommend products that align with the user's mood or preferences.
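A highly simplified stand-in for the feature-extraction and inference stages described above might look like the following. The text calls for trained RNN/LSTM models over EEG time series; here a hand-coded variance feature and a fixed threshold (both invented for the example) take their place, purely to show the shape of the pipeline:

```python
# Toy stand-in for the deep learning module's EEG analysis stage.
# A real system would use trained RNN/LSTM models; the variance
# feature and the 0.5 threshold here are invented for illustration.

def extract_features(eeg_window):
    """Summarize a raw EEG sample window with simple statistics."""
    n = len(eeg_window)
    mean = sum(eeg_window) / n
    variance = sum((x - mean) ** 2 for x in eeg_window) / n
    return {"mean": mean, "variance": variance}

def infer_cognitive_state(features, variance_threshold=0.5):
    """Map extracted features to a coarse cognitive-state label,
    mimicking the attention/stress detection described in the text."""
    if features["variance"] > variance_threshold:
        return "stressed"
    return "attentive"
```

The two-stage split (feature extraction, then inference) mirrors the stages named in paragraph [0063]; in a learned system both stages would be replaced by model layers rather than hand-written statistics.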
[0065] The augmented reality engine 114 may be executed on the processing unit 108 and operationally coupled to the light detection and ranging and depth sensors 106 and the augmented reality engine 114 operable for three-dimensional (3D) graphics rendering, image recognition, and adaptive user interface personalization.
[0066] In an embodiment of the present disclosure, the augmented reality engine 114 may perform 3D spatial mapping to anchor digital objects accurately in the physical world, creating realistic overlays and interactions for the user. In an embodiment of the present disclosure, the augmented reality engine 114 may use SLAM (Simultaneous Localization and Mapping) algorithms and AI-based environmental recognition modules to dynamically adjust the AR content based on changes in the environment or user behavior/movement.
[0067] In an embodiment of the present disclosure, the augmented reality engine 114 may create immersive, interactive, and responsive AR experiences by rendering three-dimensional (3D) graphics, recognizing real-world objects and environments, and adapting the marketing content in real time based on user behavior and other environmental inputs. Using the data collected by the light detection and ranging (LiDAR) and depth sensors 106, the augmented reality engine 114 may generate real-world overlays that remain spatially consistent, creating realistic AR interactions. The augmented reality engine 114 may utilize machine vision algorithms to detect and identify faces, gestures, or objects within the user's field of view, and contextual inputs.
[0068] In an embodiment of the present disclosure, the augmented reality engine 114 may perform 3D rendering, image recognition, and user interface (UI) personalization by suggesting dynamic adjustment layout, content, and interaction pathways to ensure a fluid and intuitive experience. The augmented reality engine 114 may also utilize GPU acceleration and edge computing for low-latency rendering and/or data synchronization. In an exemplary embodiment, the augmented reality engine 114 may generate 3D product visualizations for enhancing user engagement, to make virtual content contextually aware, realistic, and responsive to real-time inputs.
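The adaptive user interface personalization described above, adjusting layout and content density from inferred engagement, can be sketched as a simple policy function. The tier boundaries and overlay names below are assumptions made purely for illustration:

```python
# Hypothetical sketch of the AR engine's adaptive UI personalization
# step. The score tiers and overlay identifiers are invented.

def personalize_overlay(engagement_score, environment="retail"):
    """Choose overlay parameters from an engagement score in [0, 1]."""
    if engagement_score >= 0.7:
        detail = "full_3d_product"   # engaged user: richest overlay
    elif engagement_score >= 0.3:
        detail = "summary_card"      # moderate interest: lighter content
    else:
        detail = "minimal_prompt"    # disengaged: unobtrusive cue only
    return {"detail": detail, "environment": environment}
```

In the framework's terms, the engagement score would come from the deep learning module 112, and the returned parameters would drive the 3D rendering pipeline described in paragraphs [0066] to [0068].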
[0069] The blockchain-based smart contract module 116 may be executed on the processing unit 108, and the blockchain-based smart contract module 116 may be configured to securely store anonymized user interactions and issue non-fungible token (NFT)-based loyalty or engagement rewards.
[0070] In an embodiment of the present disclosure, the blockchain-based smart contract module 116 may provide a secure, transparent, and decentralized infrastructure for managing user interactions and engagement rewards. In some embodiments, the blockchain-based smart contract module 116 may capture anonymized user interaction data such as, but not limited to, engagement duration, gaze focus, click-through behavior, or emotion-based response and stores it on a blockchain ledger to ensure tamper-proof recordkeeping. In an embodiment of the present disclosure, the blockchain-based smart contract module 116 may autonomously trigger predefined actions based on user behavior, such as issuing non-fungible tokens (NFTs) as loyalty or engagement rewards, represented by, but not limited to digital badges, discount tokens, virtual collectibles, or exclusive experiences, thereby incentivizing sustained and meaningful user participation.
[0071] In an embodiment of the present disclosure, the implementation of the blockchain-based smart contract module 116 may involve integrating a blockchain client such as, but not limited to, Ethereum, Polygon, or a private enterprise blockchain into the AR system architecture. In an embodiment of the present disclosure, the blockchain-based smart contract module 116 may include an NFT generation mechanism.
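The smart-contract behavior described above can be sketched with a toy in-memory simulator. A real deployment would target a chain such as Ethereum or Polygon as noted in the text; the hash-chained ledger, the anonymized-ID field, and the 60-second engagement reward rule below are all invented for the example:

```python
import hashlib
import json

class SmartContractSimulator:
    """Toy in-memory stand-in for the smart contract module 116.
    Real systems would deploy on-chain contracts; the reward rule
    (engagement >= 60 s mints a token) is an invented example."""

    def __init__(self):
        self.ledger = []    # append-only, hash-chained interaction records
        self.rewards = {}   # anonymized user id -> list of reward tokens

    def record_interaction(self, anon_id, engagement_seconds):
        # Chain each record to the previous one for tamper evidence.
        prev_hash = self.ledger[-1]["hash"] if self.ledger else "0" * 64
        entry = {"anon_id": anon_id,
                 "engagement": engagement_seconds,
                 "prev": prev_hash}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.ledger.append(entry)
        # Autonomous trigger: mint an NFT-style reward token when the
        # engagement threshold is met (threshold is illustrative).
        if engagement_seconds >= 60:
            token = f"NFT-{entry['hash'][:8]}"
            self.rewards.setdefault(anon_id, []).append(token)
        return entry
```

The hash chaining illustrates the "tamper-proof recordkeeping" claim: altering any earlier entry would break every subsequent `prev` link, which is the property an actual blockchain ledger provides natively.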
[0072] The 5G-enabled edge computing infrastructure 118 may be operationally coupled to the deep learning module 112, the 5G-enabled edge computing infrastructure 118 being configured to enable high-speed processing of artificial intelligence (AI)-augmented reality (AR) interactions.
[0073] In an embodiment of the present disclosure, the 5G-enabled edge computing infrastructure 118 may provide ultra-low latency and high-speed processing capabilities for AI-driven augmented reality (AR) interactions. In an embodiment of the present disclosure, the 5G-enabled edge computing infrastructure 118 may perform computationally intensive tasks such as biometric analysis, object recognition, 3D rendering, and real-time personalization. In an embodiment of the present disclosure, the 5G-enabled edge computing infrastructure 118 may ensure that AR content and AI-driven decisions are rendered and delivered with minimal delay, resulting in seamless, responsive, and highly immersive user experiences.
[0074] In some embodiments, the 5G-enabled edge computing infrastructure 118 may be implemented by deploying edge nodes with GPUs or AI accelerators. Such edge nodes may run containers or microservices that host components of the AR and AI pipeline via the deep learning module 112, the augmented reality engine 114, and the data acquisition module 110. In some embodiments, the 5G-enabled edge computing infrastructure 118 may use network slicing to allocate bandwidth specifically for AR data streams, ensuring consistent quality of service (QoS). In some embodiments, communication protocols such as MQTT or WebRTC may be used to facilitate low-latency data exchange between the 5G-enabled edge computing infrastructure 118 and the smart device 102.
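The topic-based, low-latency data exchange mentioned above can be sketched with a minimal in-process broker. This is only an MQTT-style illustration in plain Python; a real deployment would use an actual MQTT broker or WebRTC data channels over the 5G edge, and the topic naming below is an assumption:

```python
from collections import defaultdict

class MiniBroker:
    """In-process, MQTT-style topic broker (illustrative only)."""

    def __init__(self):
        # topic -> list of subscriber callbacks
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Deliver the payload to every subscriber of this topic
        for cb in self.subscribers[topic]:
            cb(payload)

broker = MiniBroker()
received = []
# An edge node subscribes to the gaze stream of a smart device.
broker.subscribe("ar/device-102/gaze", received.append)
broker.publish("ar/device-102/gaze", {"x": 0.41, "y": 0.67})
print(received)  # -> [{'x': 0.41, 'y': 0.67}]
```

In MQTT proper, QoS levels (0/1/2) would additionally govern delivery guarantees per AR data stream.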
[0075] The neural optimization engine 120 may be operationally coupled to the 5G-enabled edge computing infrastructure 118, the neural optimization engine 120 being configured to read and process subconscious user responses to augmented reality (AR) content, and dynamically optimize advertising elements as per the user responses.
[0076] The neural optimization engine 120 may adapt the advertising elements based on subconscious emotional responses, interaction data trends, and contextual environmental factors.
[0077] In an embodiment of the present disclosure, the neural optimization engine 120 may leverage inputs such as EEG signals, pupil dilation, micro-expressions, and heart rate variability to infer the user's emotional and cognitive state as the user interacts with AR content, and may analyze the output of the 5G-enabled edge computing infrastructure 118 to determine levels of attention, interest, engagement, or aversion, without requiring explicit input from the user. Based on the received input, the neural optimization engine 120 may dynamically adjust visual, auditory, or interactive advertising elements such as, but not limited to, product placement, background visuals, call-to-action prompts, and personalization cues.
[0078] In an embodiment of the present disclosure, the neural optimization engine 120 may be implemented using deep neural networks trained on multimodal biometric datasets to recognize patterns of subconscious emotional and cognitive responses. In an embodiment of the present disclosure, the neural optimization engine 120 may employ convolutional or recurrent neural networks to process fused data streams from the sensors 104 and interaction logs. In an embodiment of the present disclosure, the neural optimization engine 120 may operate in tandem with the deep learning module 112 to evaluate and classify user response patterns. In an embodiment of the present disclosure, the neural optimization engine 120 may employ optimization algorithms, such as reinforcement learning or genetic programming, to trigger content adaptation strategies.
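As one concrete instance of the reinforcement-learning strategy mentioned above, an epsilon-greedy multi-armed bandit can select among advertising variants based on observed engagement rewards. The sketch below is an assumption about one simple way such content adaptation could be realized; the variant names and reward values are invented for illustration:

```python
import random

class AdVariantBandit:
    """Epsilon-greedy bandit: picks the ad variant with the best average
    engagement reward, exploring alternatives with probability epsilon."""

    def __init__(self, variants, epsilon=0.1, seed=0):
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.counts = {v: 0 for v in variants}
        self.values = {v: 0.0 for v in variants}  # running mean reward

    def select(self):
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.counts))  # explore
        return max(self.values, key=self.values.get)   # exploit

    def update(self, variant, reward):
        self.counts[variant] += 1
        n = self.counts[variant]
        # incremental update of the mean observed engagement reward
        self.values[variant] += (reward - self.values[variant]) / n

bandit = AdVariantBandit(["banner", "3d_overlay", "avatar"])
for _ in range(100):
    v = bandit.select()
    # stand-in reward signal: pretend "3d_overlay" engages users most
    reward = {"banner": 0.2, "3d_overlay": 0.8, "avatar": 0.5}[v]
    bandit.update(v, reward)
print(bandit.values)
```

In the disclosed system, the reward signal would instead derive from the subconscious response metrics inferred by the deep learning module 112.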
[0079] The system 100 may also include at least one marketer interface 122 integrated with augmented reality (AR)-powered virtual shopping assistants and mixed reality (MR) shopping feature.
[0080] The augmented reality (AR)-powered virtual shopping assistants and mixed reality (MR) shopping feature may help the user in visualizing and making purchasing decisions in real-time augmented reality (AR) stores.
[0081] In an embodiment of the present disclosure, the marketer interface 122 may act as a dynamic bridge between digital marketers and consumers within augmented reality (AR) and mixed reality (MR) shopping environments. Integrated with AR-powered virtual shopping assistants and MR shopping features, this interface enables marketers to design, deploy, and manage interactive shopping experiences in real time. By leveraging real-time data analytics, the marketer interface 122 may allow for the adaptation of marketing strategies based on user behavior and preferences.
[0082] In an embodiment of the present disclosure, the marketer interface 122 may include AI-powered virtual assistants, developed using natural language processing and machine learning algorithms to understand and respond to customer inquiries effectively. In an embodiment of the present disclosure, the marketer interface 122 may include 3D modeling and AR integration for providing high-quality 3D models of products. In some embodiments of the present disclosure, the marketer interface 122 may use data analytics and personalization to deliver personalized recommendations and marketing messages. In some embodiments of the present disclosure, the marketer interface 122 may have cross-platform compatibility.
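At its simplest, the intent-understanding step of such a virtual shopping assistant can be sketched as follows. This keyword-rule stand-in is an illustrative assumption; a production assistant would use trained language models rather than keyword matching, and the intent names are invented:

```python
# Illustrative keyword-based intent matcher standing in for the NLP-driven
# virtual shopping assistant (real systems would use trained language
# models, not keyword rules).

INTENTS = {
    "price":        ("price", "cost", "how much"),
    "availability": ("in stock", "available", "delivery"),
    "try_on":       ("try on", "fit", "size"),
}

def classify(query: str) -> str:
    """Return the first intent whose keywords appear in the query."""
    q = query.lower()
    for intent, keywords in INTENTS.items():
        if any(k in q for k in keywords):
            return intent
    return "fallback"

print(classify("How much does this jacket cost?"))  # -> price
```

The classified intent would then drive the AR response, for example overlaying a price tag or launching a virtual try-on.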
[0083] FIG. 2 illustrates a flowchart of a computer-implemented method 200 for delivering personalized and real-time augmented reality (AR) marketing content using the digital marketer-augmented reality technology-digital customer (DARD) framework, according to an embodiment of the present invention.
[0084] The method 200 may comprise the following steps.
[0085] At 202, collecting real-time user data including gaze direction, emotional response, facial expressions, electroencephalogram (EEG) signals, heart rate, and device orientation via at least one smart device 102.
[0086] At 204, analyzing the collected user data using a deep learning module 112 executed on the processing unit 108 to determine the user's attention level, emotional engagement, and behavioral patterns.
[0087] At 206, rendering and displaying personalized augmented reality (AR) content via an augmented reality engine 114 executed on the processing unit 108 for spatial mapping, 3D model overlays, and adaptive interfaces, using the input provided by a plurality of light detection and ranging and depth sensors 106.
[0088] At 208, dynamically adjusting augmented reality (AR) content via a 5G-enabled edge computing infrastructure 118 based on user feedback, environmental context, and subconscious responses inferred from the output of the deep learning module 112 and the augmented reality engine 114.
[0089] At 210, optimizing augmented reality (AR) marketing content using a neural optimization engine 120.
[0090] The method 200 may further include recording anonymized interaction data and user engagement metrics on a blockchain ledger using a blockchain-based smart contract module 116.
[0091] The method 200 may further include issuing non-fungible token (NFT)-based engagement rewards to the user using the blockchain-based smart contract module 116.
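The sequence of steps in method 200 may be sketched as a simple pipeline. All function names, stub heuristics, and dictionary shapes below are illustrative assumptions, not the disclosed implementation; each real step would delegate to the corresponding module of system 100:

```python
def collect(smart_device):
    # step 202: real-time user data from the smart device (stubbed)
    return {"gaze": (0.4, 0.6), "heart_rate": 72, "emotion": "curious"}

def analyze(data):
    # step 204: deep learning module output (stubbed heuristic)
    return {"attention": 0.8 if data["emotion"] == "curious" else 0.3}

def render(profile):
    # step 206: AR engine picks content for the inferred attention level
    return "3d_product_demo" if profile["attention"] > 0.5 else "teaser_banner"

def adjust(content, context):
    # step 208: edge infrastructure adapts content to environmental context
    return content if context["lighting"] == "good" else content + "_high_contrast"

def optimize(content, profile):
    # step 210: neural optimization engine refines the final ad element
    cta = "tap_to_try" if profile["attention"] > 0.5 else "learn_more"
    return {"content": content, "cta": cta}

data = collect("smart-device-102")
profile = analyze(data)
content = adjust(render(profile), {"lighting": "good"})
print(optimize(content, profile))
# -> {'content': '3d_product_demo', 'cta': 'tap_to_try'}
```

The optional blockchain steps would then record the resulting interaction metrics and issue any NFT-based rewards.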
[0092] In a preferred embodiment of the present disclosure, the deep learning module 112 may be implemented using, but not limited to, GPT-4 and TensorFlow. The deep learning module 112 may perform predictive analytics. In a preferred embodiment of the present disclosure, the 5G-enabled edge computing infrastructure 118 may perform cloud-based AR rendering via NVIDIA CloudXR or AWS Wavelength to ensure seamless, real-time immersive advertisements.
[0093] In an embodiment of the present disclosure, the method 200 may enable neuro-AR and emotion-aware marketing using brain-computer interfaces (BCI) and biometric feedback to read subconscious user responses to AR content, dynamically optimizing advertisement elements. In an embodiment of the present disclosure, the method 200 may generate AI-driven AR avatars that help customers visualize and purchase products in real-time AR stores.
[0094] FIG. 3 illustrates an exemplary digital marketer-augmented reality technology-digital customer (DARD) framework 300, according to another embodiment of the present invention.
[0095] The framework 300 may comprise the following.
[0096] At 302, integrating internal factors influencing the digital marketer, including strategic vision, innovation culture, customer insights, understanding consumer needs, understanding competitor analysis, providing internal support, and ensuring data security.
[0097] At 304, integrating external factors influencing the digital marketer, including consumer trends, market competition, technology advancements, cultural factors, and industry standards.
[0098] At 306, enabling decision making by the digital marketer for creating marketing strategies that incorporate AR, using inputs from internal/external factors to shape AR implementations.
[0099] At 308, using core AR elements to enable interaction between marketers and users.
[0100] At 310, integrating AR hardware and software, sensors, 3D graphics, image recognition, eye tracking mechanism, spatial mapping mechanism, and user interface.
[0101] At 312, checking engagement level of the digital user. The digital user may represent consumers who engage with AR content.
[0102] At 314, analyzing influencing factor for the users that reflects the psychological and experiential setting in which digital users engage, including immersive interaction, engagement, personalization, convenience, novelty, innovation, efficient shopping, and real-time experience.
[0103] At 316, collecting user data from user interactions with AR, location, device data, interaction history, demographic info, feedback, social sharing to optimize marketing.
[0104] At 318, understanding user engagement through emotional and cognitive motivators driving user engagement, curiosity, enjoyment, confidence, creativity, social interaction, trust, security, and status.
[0105] At 320, AR-based content generation.
[0106] At 322, using the user data to generate the personalized AR content.
[0107] At 324, shaping AR strategies with voice and eye tracking, web-based AR, mobile AR, ethical privacy practices, AR across multiple industries, and AR advertising and personalization.
[0108] At 326, providing rewards for digital marketers such as, customer analysis, demographic data, geolocation, personalization, brand awareness, experience enhancement, and customer insights.
[0109] At 328, the cycle ends and restarts at 301, with marketers receiving insights that feed back into refining internal strategies and further driving the AR ecosystem.
[0110] The following table 1 represents the key points of the DARD framework.
Table 1: Representing DARD model
Point referred Description of point
1 Internal factors influencing the digital marketer in adopting ART.
2 External factors influencing the digital marketer in adopting ART.
3 Digital marketer implementing ART as a digital marketing (DM) strategy.
4 Components of ART.
5 AR technology interfacing with the customer's smartphone.
6 Customer environment influence while using the ART service.
7 Factors influencing the user in using ART.
8 Data generated by the user while using or after using ART.
9 Customer expectations for the future with ART.
10 ART hardware and software connecting with user data.
11 User data converted in the form of user-generated content (UGC).
12 UGC collected in the form of opportunities by the digital marketer.
13 Benefits gained by the digital marketer for implementing ART.
14 Digital marketer predictions and expectations to fulfil customer future needs.

[0111] The disclosed invention offers an advancement over existing digital marketing systems by merging augmented reality (AR) and artificial intelligence (AI) into a unified, scalable platform. One of the key advantages is the ability of the disclosed invention to deliver hyper-personalized consumer experiences with emotion AI and neurotechnology. Unlike traditional platforms that rely on static personalization models, the disclosed DARD framework dynamically adapts content based on real-time biometric and behavioural inputs received from the smart device 102, including facial expressions, heart rate, and EEG data.
[0112] Another key advantage emerges from the integration of blockchain-based data privacy and security mechanisms via the blockchain-based smart contract module 116. While most digital marketing systems rely on centralized data storage, the disclosed DARD framework ensures that all user data and marketing transactions are secured through decentralized blockchain protocols, which not only enhances consumer trust but also facilitates transparent, audit-friendly interactions between marketers and consumers.
[0113] The disclosed invention may enable real-time, context-aware engagement through the smart devices 102 and the sensors 104. The system 100 responds to the users' surroundings and behaviours to present timely and relevant AR content using the light detection and ranging and depth sensors 106 and the augmented reality engine 114.
[0114] The use of the 5G-enabled edge computing infrastructure 118 may ensure seamless delivery of high-fidelity, cloud-rendered AR experiences through the smart devices 102. The disclosed invention allows marketers to understand and respond to consumer sentiment with precision and accordingly personalize the AR environment to match the user's mood or intent. The disclosed invention uses blockchain-based assets such as NFTs and loyalty tokens to ensure active user participation.
[0115] Further, the DARD framework supports mixed reality (MR) hybrid shopping experiences, enabling consumers to engage in holographic commerce and social AR shopping scenarios that blend the digital and physical worlds. For marketers, the DARD framework serves as a comprehensive, end-to-end solution—offering tools to create, deploy, and analyse AR campaigns in a seamless and integrated environment.
[0116] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
[0117] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims:
I/We Claim:
1. A system (100) for digital marketer-augmented reality technology-digital customer (DARD) framework, the system (100) comprising:
at least one smart device (102) collecting real-time data from a user,
wherein the smart device (102) is integrated with a plurality of sensors (104),
wherein the smart device (102) is augmented reality (AR)-enabled;
a plurality of light detection and ranging and depth sensors (106) operationally coupled to the smart device (102), the light detection and ranging and depth sensors (106) configured to perform real-world three-dimensional (3D) spatial mapping for realistic augmented reality (AR) experiences;
a processing unit (108) operationally coupled to the smart device (102) and light detection and ranging and depth sensors (106), the processing unit (108) further comprising:
a data acquisition module (110) executed on the processing unit (108), the data acquisition module (110) configured to capture real-time data from the smart devices (102);
a deep learning module (112) executed on the processing unit (108) and operationally coupled to the data acquisition module (110), the deep learning module (112) configured to analyze real-time user data including gaze tracking, emotional response, electroencephalogram (EEG) signals, device movement, demographic data, and interaction history;
an augmented reality engine (114) executed on the processing unit (108) and operationally coupled to the light detection and ranging and depth sensors (106), the augmented reality engine (114) operable for:
three-dimensional (3D) graphics rendering;
image recognition; and
adaptive user interface personalization;
a blockchain-based smart contract module (116) executed on the processing unit (108), the blockchain-based smart contract module (116) configured to:
securely store anonymized user interaction data; and
issue non-fungible token (NFT)-based loyalty or engagement rewards;
a 5G-enabled edge computing infrastructure (118) operationally coupled to the deep learning module (112), the 5G-enabled edge computing infrastructure (118) configured to enable high-speed processing of artificial intelligence (AI)-augmented reality (AR) interactions;
a neural optimization engine (120) operationally coupled to the 5G-enabled edge computing infrastructure (118), the neural optimization engine (120) configured to:
read and process subconscious user responses to augmented reality (AR) content; and
dynamically optimize advertising elements as per the user responses.
2. The system (100) as claimed in claim 1, wherein the neural optimization engine (120) adapts the advertising elements based on subconscious emotional responses, interaction data trends, and contextual environmental factors.
3. The system (100) as claimed in claim 1, wherein the system (100) also includes at least one marketer interface (122) integrated with augmented reality (AR)-powered virtual shopping assistants and mixed reality (MR) shopping feature.
4. The system (100) as claimed in claim 3, wherein the augmented reality (AR)-powered virtual shopping assistants and mixed reality (MR) shopping feature helps the user in visualizing and making purchasing decisions in real-time augmented reality (AR) stores.
5. The system (100) as claimed in claim 1, wherein the smart device (102) includes smart mirrors, wearables, and smart displays for personalized augmented reality (AR) experiences.
6. The system (100) as claimed in claim 1, wherein the sensors (104) include biometric sensors for capturing brainwave, heart rate, eye-tracking, and emotion data to optimize augmented reality (AR) advertisements.
7. The system (100) as claimed in claim 1, wherein the light detection and ranging and depth sensor (106) captures the real-time environment of the user.
8. A computer-implemented method (200) for delivering personalized and real-time augmented reality (AR) marketing content using a digital marketer-augmented reality technology-digital customer (DARD) framework, the method (200) comprising:
collecting real-time user data including gaze direction, emotional response, facial expressions, electroencephalogram (EEG) signals, heart rate, and device orientation via at least one smart device (102);
analyzing the collected user data using a deep learning module (112) executed on a processing unit (108) to determine the user's attention level, emotional engagement, and behavioral patterns;
rendering and displaying personalized augmented reality (AR) content via an augmented reality engine (114) executed on the processing unit (108) for spatial mapping, 3D model overlays, and adaptive interfaces, using the input provided by a plurality of light detection and ranging and depth sensors (106);
dynamically adjusting augmented reality (AR) content via a 5G-enabled edge computing infrastructure (118) based on user feedback, environmental context, and subconscious responses inferred from the output of the deep learning module (112) and the augmented reality engine (114); and
optimizing augmented reality (AR) marketing content using a neural optimization engine (120).
9. The method (200) as claimed in claim 8, wherein the method (200) further includes recording anonymized interaction data and user engagement metrics on a blockchain ledger using a blockchain-based smart contract module (116).
10. The method (200) as claimed in claim 8, wherein the method (200) further includes issuing non-fungible token (NFT)-based engagement rewards to the user using the blockchain-based smart contract module (116).

Documents

Application Documents

# Name Date
1 202541043527-STATEMENT OF UNDERTAKING (FORM 3) [05-05-2025(online)].pdf 2025-05-05
2 202541043527-REQUEST FOR EARLY PUBLICATION(FORM-9) [05-05-2025(online)].pdf 2025-05-05
3 202541043527-POWER OF AUTHORITY [05-05-2025(online)].pdf 2025-05-05
4 202541043527-FORM-9 [05-05-2025(online)].pdf 2025-05-05
5 202541043527-FORM FOR SMALL ENTITY(FORM-28) [05-05-2025(online)].pdf 2025-05-05
6 202541043527-FORM 1 [05-05-2025(online)].pdf 2025-05-05
7 202541043527-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [05-05-2025(online)].pdf 2025-05-05
8 202541043527-DRAWINGS [05-05-2025(online)].pdf 2025-05-05
9 202541043527-DECLARATION OF INVENTORSHIP (FORM 5) [05-05-2025(online)].pdf 2025-05-05
10 202541043527-COMPLETE SPECIFICATION [05-05-2025(online)].pdf 2025-05-05
11 202541043527-Proof of Right [06-05-2025(online)].pdf 2025-05-06