Abstract: A framework for personalised sales presentations is disclosed herein. The disclosed framework (100) broadly comprises: a command centre (102); a creator unit (104); and a hub unit (106). The disclosed framework (100) offers at least the following advantages: it is simple in construction; it is cost effective; it offers unified asset management, multi-sensory feedback, and a low-code/no-code paradigm that enables beginners to create content; it provides CRM integration and presenter control mechanisms; and it delivers next-generation VR sales presentations with enhanced end-user engagement and personalisation.
Description: TITLE OF THE INVENTION: A FRAMEWORK FOR PERSONALISED SALES PRESENTATIONS
FIELD OF THE INVENTION
The present disclosure is generally related to sales presentations. Particularly, the present disclosure is related to personalised sales presentations. More particularly, the present disclosure is related to: a framework for personalised sales presentations for improved sales performance.
BACKGROUND OF THE INVENTION
In the heavy machinery industry, transporting heavy equipment to the end user's site for sales presentations may not be possible and/or may be challenging. Traditional in-person heavy equipment sales presentations involve significant cost, logistics, and risk of equipment damage during transport. Scheduling and accessing the sales presentations are challenging as key decision-makers are rarely available at remote sites at the same time.
Buyers cannot interactively explore configurations or run “what-if” comparisons, which limits both decision-making speed and confidence. Remote or international clients are often excluded from full sales presentation experiences, resulting in loss of sales opportunities. Sales teams may also lack data-driven insights into prospect engagement or interests during traditional sales presentations. Additionally, in-person sales presentations are not scalable and cannot be personalised at scale, reducing reach and efficiency. Regulations or site access restrictions (for example, in mining, military, or pandemic situations) can prevent in-person sales presentations altogether. Growing environmental and sustainability pressures are also a concern, where virtual sales presentations help reduce carbon footprint and waste.
Frameworks for personalised sales presentations in a virtual-reality (VR) environment have significantly evolved from early 3D “malls” to advanced personalisation engines using artificial intelligence (AI) techniques for optimising user activities. The interactive frameworks for personalised sales presentations introduced three-way communication among participants, performers, and virtual worlds, enhancing engagement and interactivity. Innovations tailored for retail businesses customise VR store layouts and shopping carts based on end-users' purchase history and personalised filters.
Empirical research shows that even low-immersion VR environments can improve decision-making and sales performance, particularly in real estate applications. Neuromarketing studies further confirm that immersive VR stimuli enhance consumer trust and purchase intent. Meanwhile, integration of AI/Machine learning (ML) techniques into VR marketing campaigns yields highly personalised brand experiences through dynamic content adaptation.
Conventionally, frameworks for personalised sales presentations are known in the art; however, they possess drawbacks such as: relying heavily on human interaction; scripted slides or video calls; a lack of user engagement, personalisation, and/or adaptability required for modern sales processes; and a requirement for complex programming skills, such that only experts are able to create the sales presentations.
There is, therefore, a need in the art, for: a framework for personalised sales presentations, that overcomes the aforementioned drawbacks and shortcomings.
SUMMARY OF THE INVENTION
A framework for personalised sales presentations is disclosed herein. Said framework broadly comprises: a command centre; a creator unit; and a hub unit.
The command centre serves as a centralised platform for monitoring, managing and coordinating operations of the framework. Said creator unit serves as a content creation platform for a user for creating a new personalised sales presentation. The hub unit is configured as a unified delivery platform for presenting the sales presentation to a client.
In an embodiment, said command centre, said creator unit, and said hub unit are communicably associated with each other. The operations of said creator unit and said hub unit are managed and controlled by the command centre.
In an embodiment, said command centre broadly comprises: a data storage module; an audit and logs storage module; and an asset storage module.
Said data storage module is configured to store: security, compliance and audit data; user-role management data; organization and account management data; and Customer Relationship Management (CRM) and Integration Layer data.
Said audit and logs storage module provides an immutable and tamper-evident record of every significant action within the command centre.
Said asset storage module serves as a digital warehouse to securely store, organize, and deliver the digital assets.
In an embodiment, said creator unit broadly comprises: a content authoring module; a project management module; and an asset management module.
Said content authoring module is configured to provide guided content authoring flow to assemble digital assets, with a low code/no-code paradigm.
Said project management module is configured to manage the lifecycle of the sales presentation activities and the user activities, from inception to delivery of the sales presentation.
Said asset management module is configured to organise, store, optimise, and distribute digital assets.
In an embodiment, said hub unit broadly comprises: a presenter application, a virtual reality application, and a web application.
Said presenter application is configured to deliver sales presentations, where said presenter application is operated on a tablet or laptop;
Said virtual reality application is configured to display the sales presentations to enable the client to view the sales presentations, where said virtual reality application is operated on a virtual reality headset; and
Said web application is configured to monitor the sales presentations by a sales manager, remotely in real-time, where said web application is operated on a tablet or laptop.
In yet another embodiment of the present disclosure, output created by the creator unit is in a .onexr file format.
The disclosed framework offers at least the following advantages: it is simple in construction; it is cost effective; it offers unified asset management, multi-sensory feedback, and a low-code/no-code paradigm that enables beginners to create content; it provides CRM integration and presenter control mechanisms; and it delivers next-generation VR sales presentations with enhanced end-user engagement and personalisation.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 illustrates a framework for personalised sales presentations, in accordance with an embodiment of the present disclosure;
Figure 2 illustrates a command centre of a framework for personalised sales presentations, in accordance with an embodiment of the present disclosure;
Figure 3 illustrates a dashboard in a command centre of a framework for personalised sales presentations, in accordance with an embodiment of the present disclosure;
Figure 4 illustrates role category and role-based access control mechanism in a command centre of a framework for personalised sales presentations, in accordance with an embodiment of the present disclosure;
Figure 5 illustrates a user account dashboard in a command centre, representing username, role, group, and user account status, in accordance with an embodiment of the present disclosure;
Figure 6 illustrates package allocation in a command centre of a framework for personalised sales presentations, in accordance with an embodiment of the present disclosure;
Figure 7 illustrates content packages available for preview in a command centre of a framework for personalised sales presentations, in accordance with an embodiment of the present disclosure;
Figure 8 illustrates various views of an analytics dashboard of an audit and event log storage module, in accordance with an embodiment of the present disclosure;
Figure 9 illustrates third party integration through Application Programming Interface (API) keys, in accordance with an embodiment of the present disclosure;
Figure 10 illustrates a creator unit of a framework for personalised sales presentations, in accordance with an embodiment of the present disclosure;
Figure 11a and Figure 11b illustrate a scene editor and a workflow editor, respectively, in accordance with an embodiment of the present disclosure;
Figure 12a and Figure 12b illustrate positioning of digital assets and scaling of digital assets up and/or down, respectively, in accordance with an embodiment of the present disclosure;
Figure 13a and Figure 13b illustrate data for charts and chart visualisation, respectively, in accordance with an embodiment of the present disclosure;
Figure 14a and Figure 14b illustrate exporting project from a creator unit and publishing the project in a command centre respectively, in accordance with an embodiment of the present disclosure;
Figure 15a and Figure 15b illustrate project sharing and creating a new project, respectively, in a creator unit, in accordance with an embodiment of the present disclosure;
Figure 16 illustrates an "XRAutoFlow" mechanism of an assistance module, in accordance with an embodiment of the present disclosure;
Figure 17a and Figure 17b illustrate a multi-language flow and AI-generated audio, respectively, in accordance with an embodiment of the present disclosure;
Figure 18 illustrates a hub unit, in accordance with an embodiment of the present disclosure;
Figure 19 illustrates a live control and monitoring interface of a presenter application, in accordance with an embodiment of the present disclosure;
Figure 20 illustrates live control and monitoring of sales presentations remotely, in accordance with an embodiment of the present disclosure;
Figure 21a and Figure 21b illustrate analytics and sales presentations recording, respectively, in accordance with an embodiment of the present disclosure;
Figure 22 illustrates login of an end-user in a VR application, in accordance with an embodiment of the present disclosure;
Figure 23 illustrates dashboard of a VR application, in accordance with an embodiment of the present disclosure;
Figure 24a and Figure 24b illustrate a VR application interface and a corresponding presenter application interface, respectively, in accordance with an embodiment of the present disclosure;
Figure 25 illustrates a voice assistance mechanism listening to the voice during a sales presentation, in accordance with an embodiment of the present disclosure;
Figure 26 illustrates chat functionality of a voice assistance mechanism, in accordance with an embodiment of the present disclosure;
Figure 27 illustrates multi-user collaboration through an external device, in accordance with an embodiment of the present disclosure;
Figure 28 illustrates Question & Answer (Q&A) panel for real-time interaction through an external device, in accordance with an embodiment of the present disclosure;
Figure 29 illustrates real-time analytic data, in accordance with an embodiment of the present disclosure; and
Figure 30 illustrates the workflow of an auto animation generator mechanism of a VR application, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE INVENTION
Throughout this specification, the use of the words “comprise” and “include”, and variations, such as “comprises”, “comprising”, “includes”, and “including”, may imply the inclusion of an element (or elements) not specifically recited. Further, the disclosed embodiments may be embodied, in various other forms, as well.
Throughout this specification, the use of the word “framework” is to be construed as: “a set of technical components (also referred to as “members”) that are communicatively and/or operably associated with each other, and function together, as part of a mechanism, to achieve a desired technical result”.
Throughout this specification, the use of the words “communication”, “couple”, and their variations (such as communicatively), is to be construed as being inclusive of: one-way communication (or coupling); and two-way communication (or coupling), as the case may be, irrespective of the directions of arrows, in the drawings.
Throughout this specification, where applicable, the use of the phrase “at least” is to be construed in association with the suffix “one” i.e. it is to be read along with the suffix “one”, as “at least one”, which is used in the meaning of “one or more”. A person skilled in the art will appreciate the fact that the phrase “at least one” is a standard term that is used, in Patent Specifications, to denote any component of a disclosure, which may be present (or disposed) in a single quantity, or more than a single quantity.
Throughout this specification, where applicable, the use of the phrase “at least one” is to be construed in association with a succeeding component name.
Throughout this specification, the use of the word “plurality” is to be construed as being inclusive of: “at least one”.
Throughout this specification, the use of the phrase “external device”, and its variations, is to be construed as being inclusive of: the cloud; remote servers; an in-house server of a user; and/or the like.
Throughout this specification, the use of the phrase “application on an external device”, and its variations, is to be construed as being inclusive of: application installable on an external device; a sales presentation on a framework run on an external device; website hosted on an external device; web application installed on an external device; website accessible from an external device; web application accessible from an external device; and/or the like.
Throughout this specification, the disclosure of a range is to be construed as being inclusive of: the lower limit of the range; and the upper limit of the range.
Throughout this specification, the words “the” and “said” are used interchangeably.
Throughout this specification, the phrases “at least a”, “at least an”, and “at least one” are used interchangeably.
Throughout the specification, the word “user” is to be construed as: “a person who is using a framework for personalised sales presentations, and/or the like”.
Throughout the specification, the word “end-user” is to be construed as: “a person to whom a sales presentation is presented, and/or the like”.
Throughout the specification, the words “end-user”, “customer”, and “client” are used interchangeably.
Throughout the specification, the word “admin” is to be construed as: “a type of user who manages user roles, organises and maintains workflows, and/or the like”.
Throughout the specification the use of the phrase “digital assets” is to be construed as: “any digitally created object or element that can be accessed, managed, published, shared, and/or distributed by users”.
Throughout the specification the use of the phrase “digital assets” and the phrase “3D assets” can be used interchangeably.
Throughout the specification, the word “project” is to be construed as: “sales presentation or content package”.
Throughout the specification, the phrases “sales presentation session” and “session” are used interchangeably.
Also, it is to be noted that embodiments may be described as a method. Although the operations, in a method, are described as a sequential process, many of the operations may be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be re-arranged. A method may be terminated, when its operations are completed, but may also have additional steps.
A framework (100) for personalised sales presentations (hereinafter referred to as “framework”), is disclosed herein. In an embodiment of the present disclosure, as illustrated in Figure 1, the disclosed framework (100) broadly comprises: a command centre (102); a creator unit (104); and a hub unit (106).
In another embodiment of the present disclosure, said framework (100) utilises an extended reality (XR) platform that allows a user to create, manage and deliver high resolution, interactive sales presentations (e.g. product demos) in heavy machinery industries.
In yet another embodiment of the present disclosure, the user includes, but is not limited to, a content creator or subject matter expert, a sales professional, a sales manager, a sales trainer, and/or the like.
In another embodiment of the present disclosure, said command centre (102), said creator unit (104), and said hub unit (106) are communicably associated with each other.
In yet another embodiment of the present disclosure, operations of said creator unit (104) and said hub unit (106) are managed and controlled by the command centre (102).
Said command centre (102) serves as a centralised platform equipped for monitoring, managing and coordinating operations of the framework (100).
As illustrated in Figure 3, said command centre (102) provides a dashboard-driven user experience, allowing the admin and the sales managers to see high-level metrics (usage, assignments, feedback), perform complex actions (bulk assignment, asset upload, data export), and configure organizational settings. In addition, said command centre offers secure user authentication (including Single Sign On), flexible role management, and detailed audit trails for all the user actions.
As illustrated in Figure 2, said command centre broadly comprises: a data storage module (108); an audit and logs storage module (110); and an asset storage module (112).
Said data storage module (108) is configured to store data including: a first data, a second data, a third data, and a fourth data used for different purposes at different locations as per the user requirements.
Said first data is security, compliance and audit data, that includes data related to security mechanisms involved in the command centre (102), such as multi-factor authentication (MFA), role-based access control (RBAC), encryption at rest and in transit, vulnerability scanning, and proactive security posture reviews.
The audit data includes every significant action (user login, digital asset uploading, sales presentation assignment, export, AI query and/or the like) that is immutably logged with timestamp, the user identity, originating IP/device, and context. These logs are not only vital for forensic investigation in the event of a security incident, but also serve as evidence for regulatory audits (GDPR, CCPA, SOC2, HIPAA, etc.). Consent management is another compliance requirement, where the command centre (102) tracks every instance of user/customer consent for data processing, analytics, and AI assistance. In addition, the user is able to meet strict legal requirements around data lifecycle management by implementing data retention, deletion, and eDiscovery workflows.
Moreover, this mechanism supports integration with enterprise SIEM (Security Information and Event Management) tools, periodic penetration testing, and automated compliance reporting. In addition, policy enforcement logs, covering enforcement actions related to password complexity, login attempts, and device fingerprinting, are centrally managed. For organizations in regulated industries, the ability to export full audit logs or place data under “legal hold” is necessary.
Said second data is user-role management data, that includes user profiles of various users and the role category of each user, as illustrated in Figure 4. With RBAC functionality, the organizations can use pre-defined roles or create custom roles for each user. For example, the admin manages users and global analytics, while a presenter can only access assigned content packages/sales presentations.
In yet another embodiment of the present disclosure, permission scopes/user access can be limited, not only by the role but also by organization, team, project, region, or individual asset. In addition, the RBAC functionality is enforced both at the user interface level and at the API/service level. Any unauthorised access, violations, or escalations are logged for security review. Said RBAC mechanism supports periodic access reviews and automatic escalation for special circumstances (e.g., temporarily elevating a presenter to sales manager for a critical sales presentation). Since the organizations often need to integrate with external identity providers, RBAC is compatible with Single-Sign On (SSO)/System for Cross-domain Identity Management (SCIM) protocols, syncing roles from the customer’s Identity Providers (IdPs). RBAC also extends to content access controls, restricting which users can access or modify digital assets, projects, or analytics dashboards.
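The following is a minimal, illustrative sketch of how such a role- and scope-based permission check might be enforced at the API/service level. The role names, permission strings, and data structures shown are assumptions for illustration only and are not part of the disclosed framework.

```python
# Illustrative sketch only: role/permission names and data structures are assumptions.
from dataclasses import dataclass, field
from typing import Optional

ROLE_PERMISSIONS = {
    "admin":     {"user.manage", "analytics.global", "asset.read", "asset.write"},
    "manager":   {"analytics.team", "asset.read", "presentation.assign"},
    "presenter": {"presentation.read"},   # presenters only see assigned content
}

@dataclass
class User:
    user_id: str
    role: str
    organization: str
    teams: set = field(default_factory=set)

def is_authorized(user: User, permission: str, resource_org: str,
                  resource_team: Optional[str] = None) -> bool:
    """Check the role permission first, then narrow by organization/team scope."""
    if permission not in ROLE_PERMISSIONS.get(user.role, set()):
        return False                      # role does not grant the permission
    if user.organization != resource_org:
        return False                      # scope limited by organization
    if resource_team is not None and resource_team not in user.teams:
        return False                      # scope limited by team
    return True

# Unauthorized attempts would additionally be written to the audit log for security review.
```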
As illustrated in Figure 5, the third data is an organization and account management data, which include: data related to registering and managing clients, data related to an admin’s register, team memberships and the user account status (active or inactive) and deactivation records.
When a new client is registered/onboarded, the admin sets up a subdomain (for example, acme.onexrportal.us) for the client, branding (logo, color scheme), and default policies (password, sales presentations, notifications, integrations). This registration is necessary for the client to access the sales presentation experience as and when required.
Now, the admin can provision users to groups or teams, assign their roles, delegate admin rights, and define quotas for digital assets, sales presentations, or other resources.
Said user provisioning can be accomplished through manual entry, bulk import (Comma Separated Values (CSV)/ Application Programming Interface (API)), or federated SSO (for example, Azure AD, Google Workspace, Okta).
In addition, account management of the user as well as the client can be done through external tools, such as Customer Relationship Management (CRM), Business Intelligence (BI), and/or the like. All the activities, such as creation, updating, deactivation, and permission changes, are logged for audit.
The framework (100) supports self-service account management for end-users (profile updates, password resets, MFA setup), and the admins can enforce global settings, review access history, and generate compliance attestations.
Said fourth data is CRM & Integration Layer data, that includes data related to external CRM, ERP (Enterprise Resource Planning), and BI (Business Intelligence) platforms.
In yet another embodiment of the present disclosure, CRM & Integration Layer bridges the outside enterprise world to the command centre (102).
Both real-time and batch data synchronization allow end-user and opportunity data, persona attributes, and sales presentation outcomes to flow seamlessly between the command centre (102) and external systems. Robust, secure APIs (REST/GraphQL, webhooks, OAuth2/SAML authentication) are used to manage connectivity, mapping, and data transformation. Therefore, said data storage module (108) also functions as a centre for authentication and authorisation of the user as well as clients.
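A hedged sketch of how a sales presentation outcome might be pushed to an external CRM over a REST endpoint with OAuth2 bearer authentication is shown below. The endpoint URL, payload fields, and token variable are hypothetical and not a defined interface of the framework.

```python
# Hypothetical sketch: endpoint, payload fields, and token are assumptions, not a defined API.
import json
import urllib.request

def push_presentation_outcome(crm_base_url: str, access_token: str, outcome: dict) -> int:
    """POST a sales-presentation outcome record to an external CRM (e.g. an opportunity update)."""
    request = urllib.request.Request(
        url=f"{crm_base_url}/api/opportunities/sync",
        data=json.dumps(outcome).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {access_token}",   # OAuth2 bearer token
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status            # e.g. 200 on successful synchronization

# Batch synchronization could loop over queued outcomes; webhooks would cover the reverse direction.
```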
Said audit and event logs storage module (110) provides immutable and tamper-evident record of every significant action or event within the command centre (102), including the user logins, client logins, sales presentation uploads, assignments, approvals, edits, deletions, permission changes, and AI assistant interactions.
Each log entry is timestamped, attributed to the initiating user or system process, and enriched with contextual metadata such as affected resources, IP address, and prior state information. A system process refers to any automated or scheduled task performed by the platform itself rather than directed (or initiated) by a user.
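One common way to make such log entries tamper-evident is to chain each entry to the hash of the previous one, so that altering any earlier record invalidates the chain. The sketch below illustrates this idea under assumed field names; it is not the actual log schema of the framework.

```python
# Illustrative sketch of a tamper-evident (hash-chained) audit log entry; field names are assumptions.
import hashlib
import json
from datetime import datetime, timezone

def append_log_entry(log: list, action: str, user_id: str, ip_address: str, context: dict) -> dict:
    previous_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,                 # e.g. "asset.upload", "presentation.assign"
        "user_id": user_id,
        "ip_address": ip_address,
        "context": context,               # affected resources, prior state, etc.
        "previous_hash": previous_hash,
    }
    # Hash over the canonical JSON form; altering any earlier entry breaks the chain.
    entry["entry_hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)
    return entry
```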
These log entries support multiple regulatory needs of the user, such as:
Regulatory Compliance: The log entries provide evidence for regulatory requirements like General Data Protection Regulation (GDPR), System and Organization Controls 2 (SOC2), Health Insurance Portability and Accountability Act (HIPAA), and California Consumer Privacy Act (CCPA), enabling organizations to prove that sensitive data is handled appropriately and that all the user activities are traceable and auditable.
Security Monitoring and Incident Response: Security teams can use the log entries to detect suspicious behaviour, investigate incidents like unauthorized access or data exports, and enforce compliance policies.
Operational Troubleshooting: The admin can rely on the log entries to reconstruct workflows, diagnose errors, and resolve the user disputes or ambiguities.
Figure 8 illustrates an analytics dashboard of the audit and event log storage module (110), and provides graphical representations of the user activity, session metrics, platform usage, and engagement statistics. Key features include line charts for session trends, bar charts for the user activity segmentation, summary cards for categories, packages, users, and groups, as well as pie and radar charts for platform and role-based performance analysis. These visualizations enable admins and managers to monitor usage of the framework (100), assess content effectiveness, and optimize the user engagement, in real time.
Said audit and event log storage module (110) also enables the authorized admins to search, filter, and export logs related to these mechanisms, ensuring a comprehensive and accountable platform management environment.
Mechanisms and activities within the audit and event log storage module (110) include: client subdomain and branding/white-labelling, sales pitch assignment workflow, and admin tools.
The client subdomain and branding/white-labelling mechanism allows each client to have a distinct presence within the command centre (102).
As discussed above, when a new client is onboarded, said client receives a dedicated subdomain (for example, yourcompany.onexrportal.us). This not only provides brand continuity but also supports logical data segregation and customized policy enforcement.
Branding options allow each client to upload logos, choose color palettes, and customize email/SMS templates, whereas the white-labelling mechanism enables custom sender addresses, legal/compliance footers, and interface modifications to reflect the client’s visual identity. These configuration settings are applied uniformly across the command centre (102), the creator unit (104), and the hub unit (106).
The sales pitch assignment workflow is configured to streamline the distribution and management of immersive sales content across various teams and users. This mechanism enables the sales managers and admin to allocate VR/AR sales presentations, training modules, or content packages to individual users, groups, or entire teams in a structured and trackable manner, as illustrated in Figure 5.
Said sales pitch assignment workflow begins with the sales presentation content selection either from pre-built templates or custom-created packages specific to target clients. The package allocation depends on the product type, groups/teams (for example, sales, marketing, creators) and the available content packages, as illustrated in Figure 6.
The sales pitch assignment workflow supports both manual and rule-based assignments; for example, a new pitch can automatically be assigned to all presenters handling a particular product, or triggered when a new opportunity is created in the CRM.
Flexibility and automation mechanism enables the admin to schedule the package allocations to recur (for example, monthly product updates), trigger the package allocation based on CRM events, or escalate uncompleted tasks to the sales managers, and notifies the users via in-app alerts, email, or SMS, ensuring they are aware of new package allocations and deadlines.
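A minimal sketch of such a rule-based assignment trigger is shown below. The CRM event shape, rule fields, product/package names, and the notify callable are all illustrative assumptions.

```python
# Sketch only: CRM event shape, rule fields, and notify() are hypothetical.
ASSIGNMENT_RULES = [
    # When a new CRM opportunity for a product is created, assign the matching pitch
    # to every presenter who handles that product.
    {"event": "crm.opportunity.created", "product": "excavator-x200", "package": "x200-vr-pitch"},
]

def on_crm_event(event: dict, presenters_by_product: dict, notify) -> list:
    assignments = []
    for rule in ASSIGNMENT_RULES:
        if event["type"] == rule["event"] and event.get("product") == rule["product"]:
            for presenter in presenters_by_product.get(rule["product"], []):
                assignments.append({"user": presenter, "package": rule["package"]})
                notify(presenter, f"New pitch assigned: {rule['package']}")  # in-app, email, or SMS
    return assignments
```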
The sales managers can also perform real-time monitoring of the assignment status, track acceptance and completion rates, and enable quick interventions if needed. Feedback loops and analytics mechanism help organizations understand which sales pitches work best for different personas or market segments, allowing continuous optimization of content.
Admin tools provide the admins and the sales managers with comprehensive capabilities to manage users, resources, services, security, and system configurations remotely and efficiently. Core capabilities of the admin tools include the user management (for example, adding, editing, deactivating accounts), role and group assignment, access reviews, and policy enforcement (such as password requirements, MFA, and/or sales presentation timeouts).
Beyond the user administration, the admin tools encompass content and asset oversight. This includes bulk uploading assets, managing versioning and approvals, archiving or restoring old projects, and controlling the lifecycle of sales or training materials.
Approval workflows are major components that allow content to be reviewed and signed off by the designated users before being published or assigned. The approval workflows in the admin tools allow certain actions (such as publishing content, exporting projects, or making major system changes) to require a review and explicit approval by authorized users before becoming active. This ensures quality control and compliance. In addition, rejected items or requested changes can also be tracked and cycled back for edits.
The admin tools also offer dashboards to visualize organizational metrics such as content usage, sales presentation activities, assignment progress, and user engagement, allowing top management to spot trends or intervene when needed.
For compliance and security, admins can configure notification rules, customize branding and subdomain settings, manage integrations with external systems (CRM, SSO, etc.), and handle escalations or exceptions in workflows. Audit trails, export options, and advanced search/filtering further empower the admins to operate efficiently and transparently. In larger organizations, granular permission controls allow for delegated admin rights, so that local or team-level managers can take ownership of their domains without compromising overall integrity of the platform.
Said asset storage module (112) serves as a digital warehouse to securely store, organize, and deliver a wide range of digital assets.
In yet another embodiment of the present disclosure, the digital assets include: 3D models, videos, images, audio files, the content packages, analytics exports, audit logs, hotspots, logic nodes, animations, and/or the like.
In yet another embodiment of the present disclosure, a storage infrastructure of said asset storage module (112) is a cloud-native infrastructure, such as Amazon Web Services (AWS), and Azure Blob Storage, to ensure rapid global access, robust redundancy, and seamless scalability, as asset libraries grow.
In yet another embodiment of the present disclosure, said digital assets are encrypted at rest and in transit, with access tightly controlled by RBAC and audit logs capturing every upload, download, or change. Versioning enables tracking of asset history, rollback to previous states, and support for draft/approved/archived asset lifecycle stages. The framework (100) supports chunked uploads and background processing for large assets, optimizing bandwidth and the end-user experience. Further, automated backup and disaster recovery workflows protect against data loss.
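A hedged sketch of the chunked-upload idea mentioned above is given below: a large asset is split into fixed-size parts that can be uploaded and checksummed independently. The chunk size and the upload_part callable are assumptions, not parameters of the disclosed framework.

```python
# Illustrative sketch: chunk size and the upload_part() callable are assumptions.
import hashlib

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB per part

def upload_in_chunks(file_path: str, upload_part) -> list:
    """Read a large asset in fixed-size chunks and hand each part to an uploader."""
    manifest = []
    with open(file_path, "rb") as source:
        index = 0
        while True:
            chunk = source.read(CHUNK_SIZE)
            if not chunk:
                break
            checksum = hashlib.sha256(chunk).hexdigest()   # per-part integrity check
            upload_part(index, chunk, checksum)            # e.g. a presigned cloud-storage PUT
            manifest.append({"part": index, "sha256": checksum, "size": len(chunk)})
            index += 1
    return manifest   # the manifest can be verified server-side before assembling the asset
```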
Said asset storage module (112) is integrated with an asset management module (118) to ensure that the digital assets of the end-user are indexed, tagged, and easily searchable supporting metadata, such as title, type, description, usage, license, and expiry. Said asset storage module (112) supports both online and offline workflows, allowing content to be preloaded for field use where connectivity is limited.
Mechanisms within the asset storage module (112) include: product, asset, and content management; hub applications; analytics, event logging and reporting; automated notifications; and application programming interfaces (APIs) and a plugin marketplace.
The digital assets are stored securely in the asset storage module (112). The users can utilise digital asset and content management tools to upload, tag, categorize, version, and assign said digital assets to projects, teams, or workflows. A product catalogue enables tracking of Stock Keeping Units (SKUs), configurations, product descriptions, and linking to corresponding assets. Digital asset lifecycle management covers upload, optimization (e.g., mesh decimation, texture compression), approval workflows, version control, and expiration/archive policies.
Said digital assets carry metadata for search, filtering, and analytics that enables organizations to track usage, popularity, and compliance (licensing, expiry, etc.) and are stored in the asset storage module (112). The content management feature ensures that only reviewed, approved, and current content is delivered to end users. It supports bulk operations (import, export, sync), local/offline use cases, and audit trails for every action.
Figure 7 illustrates the content packages available for preview in the command centre (102). Once the digital assets are approved and assigned, the creator unit (104) and the hub unit (106) can use the digital assets.
The hub unit (106) broadly comprises: a presenter application (140); a VR application (142); and a web application (144). The presenter application (140) is operated on an external device (a tablet or a laptop) of a presenter during a sales presentation, whereas the VR application (142), operated on an external device (e.g. a VR headset), displays the sales presentation to enable the client to view the sales presentations. Further, the web application (144) is used by the sales manager in different (remote) locations to monitor the sales presentations through an external device (a laptop/desktop in remote locations). The hub unit (106) can fetch the digital assets directly from the asset storage module (112), caching them for offline use when necessary.
The hub unit (106) is synchronized with the command centre (102) for content updates, sales presentation assignment, and analytics upload. Its applications enforce RBAC, track sales presentation logs, and support offline/online parity.
The RBAC can be applied in the following areas: firstly, in the user interface, where only users with appropriate roles can see and act on certain features; secondly, at the API/service level, where backend checks ensure that the users cannot access or modify unauthorized data; and thirdly, for assets, content, analytics, and workflows, which only authorised users with explicit permission can access, as defined by their role, organisation, and/or team. This modular delivery model ensures that organizations can engage clients and teams in any mode such as on-site, remote, connected, or disconnected while maintaining full security, compliance, and data integrity.
During and after the sales presentation, the user logins, asset uploads, content assignments, AI assistant interactions, the user interactions, sales presentation durations, content engagement, and training completion metrics and feedback submissions are timestamped and recorded in an immutable audit log. This log data provides a complete trail for transparency, troubleshooting, security investigations, and regulatory compliance with standards like GDPR and SOC2.
This data is stored securely as structured logs or reports linked with the corresponding assets in the asset storage module (112).
In yet another embodiment, said data are processed into interactive dashboards and comprehensive reports that enable admins and the sales managers to monitor key performance indicators (KPIs) such as active users, sales presentation durations, content popularity, and conversion rates.
The data harvested from the user activity, content engagement, sales presentation participation, asset utilization, and workflow outcomes can be aggregated and visualised through the analytics component of the analytics, event logging and reporting mechanism.
Customizable reporting tools of the analytics, event logging and reporting mechanism allow users to drill down by timeframes, teams, roles, content types, and geography. Reports can be exported in multiple formats (CSV, PDF) for further analysis or integrated with external BI platforms such as Power BI and Tableau.
The automated notification mechanism triggers an automated reminder, via email and push notifications, when the presenter has an assigned training module with a set deadline. This ensures that the users are always informed about content assignments, updates, approvals, reminders, and critical events, with the referenced asset metadata and the user activity stored in the asset storage module (112).
Apart from basic notifications, said automated notification feature extends automation capabilities, where the admins can define rules that trigger messages (for example, “notify all team members when a new asset is approved,” or “escalate to the sales manager if an assignment is not started within three days”). Automations may also include content assignment based on CRM events, recurring task scheduling, or real-time alerts for unusual activity (such as repeated failed logins). Integration of the automated notification mechanism with external systems through webhooks further extends automation possibilities, for example, syncing updates to a connected CRM or triggering an external workflow.
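The sketch below illustrates the escalation rule quoted above (“escalate to the sales manager if an assignment is not started within three days”) under assumed field names; the assignment fields and the send callable are hypothetical.

```python
# Sketch only: assignment fields and the send() channel are assumptions.
from datetime import datetime, timedelta, timezone

ESCALATION_AFTER = timedelta(days=3)

def escalate_stale_assignments(assignments: list, send) -> None:
    """Notify the sales manager for every assignment not started within three days."""
    now = datetime.now(timezone.utc)
    for assignment in assignments:
        not_started = assignment["status"] == "assigned"          # never opened by the presenter
        overdue = now - assignment["assigned_at"] > ESCALATION_AFTER
        if not_started and overdue:
            send(to=assignment["manager"],
                 message=f"Assignment '{assignment['package']}' for {assignment['presenter']} "
                         f"has not been started for over 3 days.")
```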
The APIs provide secure and versioned endpoints (REST/GraphQL) for users, organizations, assets, content assignments, analytics, and workflows. These APIs enable organizations to integrate the command centre (102) with external systems such as CRMs (Salesforce, HubSpot), ERPs, BI platforms, or custom line-of-business applications.
The plugin marketplace takes extensibility a step further by allowing internal teams or third-party developers to create, deploy, and manage plugins that extend portal functionality, as illustrated in Figure 9. Plugins can add custom analytics dashboards, new workflow nodes, specialized integration connectors, or unique business logic without impacting the core platform’s stability or upgradeability. Each plugin runs in a sandboxed environment, with strict API permission controls and security review before deployment.
Admins can browse, approve, install, configure, update, or disable plugins per user or team. All plugin activity is logged for audit, and plugins support versioning and rollback for safe experimentation and such data are stored in the asset storage module (112).
Said creator unit (104) serves as a content creation platform that enables the user to create a new personalised sales presentation that is specific to a client (i.e. an end-user). Within the creator unit (104), the content encompasses all the digital assets developed for a project to produce immersive experiences.
In yet another embodiment of the present disclosure, in order to initialise a new sales presentation, the user (content creator) can select a template, copy an existing project, or initialise from scratch, depending on the requirements.
As illustrated in Figure 10, said creator unit (104) broadly comprises: a content authoring module (114); a project management module (116); and an asset management module (118).
Said content authoring module (114) is configured to provide a guided authoring flow to assemble the digital assets, with a low-code/no-code paradigm. This enables the user to create and assemble interactive sales presentations utilising intuitive visual interface and drag-and-drop tools, without requiring programming skills.
Scene configurations, narrative flow, and branching decision points are constructed using visual editors, allowing the users to focus on storytelling and business objectives rather than technical hurdles.
In yet another embodiment of the present disclosure, the content authoring module (114) enhances iteration speed, enabling users to rapidly prototype, test, and revise experiences in response to feedback or business needs.
Furthermore, integrated support for templates, guided workflows, and in-app assistance ensures even novice users can start producing effective, high-impact content with minimal training. New users are guided through content creation via step-by-step wizards, contextual help, and tooltips. The framework (100) is pre-equipped with sample projects and templates for common scenarios, thereby lowering the learning curve.
Said content authoring module (114) broadly comprises: an editing module (120); a three-dimensional (3D) asset visualization engine (122); a chart and data visualization module (124); an export, encryption and storage module (126); an analytic and recording module (128); an asset preview and validation module (130); and an assistance module (132).
The components of the content authoring module (114) are communicatively associated with each other.
In yet another embodiment of the present disclosure, the components of the content authoring module (114) can be used optionally according to the user requirements.
As illustrated in Figure 11a, said editing module (120) forms the core of the interactive sales presentation creation process and comprises: a scene editor and a workflow editor.
Said scene editor provides a visual canvas, where the user can set up scenes (3D environments or 360-degree spaces), placing, positioning, and configuring 3D objects, images, videos, and interactive elements.
As illustrated in Figure 11b, the workflow editor enables visual sequencing, branching logic insertion, and the setting up of automated responses to the user inputs.
In visual sequencing, each scene constructed from the scene editor can be connected or linked. In branching logic insertion, quizzes, user choices, or conditional flows are inserted using a node-based approach.
In the node-based approach, each scene, whether it is a visual scene, an information hotspot, a quiz question, or an AI interaction, is represented as a block or a node. Said nodes are then connected by lines or arrows to show the flow of the end-user experience.
Said workflow editor facilitates step-through simulation and real-time validation to avoid broken flows, and supports reusable workflow snippets or templates. By using said workflow editor, even non-technical users can build sophisticated, branching, and adaptive experiences tailored to the different business scenarios, or training objectives.
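A simple sketch of how the node-based workflow described above could be represented as a data structure is given below. The node types, field names, and scene labels are illustrative assumptions, not the disclosed format.

```python
# Illustrative node-graph sketch; node types, fields, and labels are assumptions.
workflow = {
    "nodes": {
        "intro":  {"type": "scene", "label": "Machine walk-around"},
        "quiz_1": {"type": "quiz",  "question": "Which topic interests you most?"},
        "fuel":   {"type": "scene", "label": "Fuel-efficiency animation"},
        "safety": {"type": "scene", "label": "Safety features"},
    },
    # Edges encode the branching logic: which node follows which, and on what condition.
    "edges": [
        {"from": "intro",  "to": "quiz_1", "condition": None},
        {"from": "quiz_1", "to": "fuel",   "condition": "answer == 'fuel efficiency'"},
        {"from": "quiz_1", "to": "safety", "condition": "answer == 'safety'"},
    ],
}

def next_nodes(graph: dict, current: str) -> list:
    """Return the candidate follow-up nodes reachable from the current node."""
    return [edge["to"] for edge in graph["edges"] if edge["from"] == current]

# Example: next_nodes(workflow, "quiz_1") -> ["fuel", "safety"]
```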
Said 3D asset visualisation module (122) is configured to enable the user to assemble scenes constructed by the editing module (120), by dragging and dropping the developed 3D assets onto a virtual canvas. Said 3D assets include textures, images, videos and industry-standard 3D model formats (for example, .fbx, .glb, .obj).
Moreover, the user can position, scale, and rotate objects directly within each scene, enabling precise control over layout and composition without the need for specialized 3D platforms. This intuitive hands-on approach accelerates the scene assembly and allows the user to view immediate results, facilitating rapid prototyping and iterative improvement.
Figure 12a and Figure 12b illustrate the positioning of 3D assets and the scaling of 3D assets up and down, respectively, by the 3D asset visualisation module (122).
Additionally, the chart and data visualisation module (124) also provides mechanisms, such as real-time rendering, asset grouping, snap-to-grid alignment, multi-level zoom, and device emulation for previewing scenes across various target devices including VR, desktop, tablet and/or the like. Through user-friendly menus, asset properties such as materials, lighting, and animations can be adjusted to produce professional-quality results with minimal training.
Said chart and data visualisation module (124) is configured to enable the user to insert interactive charts, including bar graphs, line charts, donut charts, key performance indicator (KPI) gauges, and return-on-investment (ROI) calculators, directly within 3D scenes or workflows.
Figure 13a and Figure 13b illustrate data for charts and chart visualisation respectively, in said chart and data visualisation module (124).
Said chart and data visualisation module (124) utilises a chart builder that supports both static data (entered manually or imported via CSV) and dynamic data (linked to user input, persona attributes, or external APIs). In order to ensure that the data visualizations adhere to accessibility standards and corporate branding, customisation options allow the user to specify visual attributes like colors, labels, units, legends, and tooltips.
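A brief sketch of a chart definition combining static CSV data with the customisation attributes listed above is shown below; the column names and option keys are assumptions made for illustration.

```python
# Illustrative sketch: CSV columns and option keys are assumptions.
import csv
import io

CSV_DATA = """quarter,units_sold
Q1,120
Q2,150
Q3,175
"""

def build_bar_chart(csv_text: str) -> dict:
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return {
        "type": "bar",
        "labels": [row["quarter"] for row in rows],
        "values": [int(row["units_sold"]) for row in rows],
        # Customisation options for branding and accessibility.
        "options": {"color": "#0057B8", "unit": "units", "legend": True, "tooltips": True},
    }

chart = build_bar_chart(CSV_DATA)   # dynamic data could instead be bound to user input or an API
```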
Interactive mechanism of said chart and data visualisation module (124) allow data points within the charts to trigger the scene changes, display informational popups, or enable deeper drill-downs based on the user interactions. Other mechanisms including text alternatives, screen reader compatibility, and high contrast color palettes ensure that data visualizations are usable by all individuals, regardless of ability.
Said analytics and recording module (128) utilises an analytical dashboard to track metrics such as project creation rates, asset utilisation, workflow complexity, collaboration activity, and review cycles.
Through integration with the command centre (102), the users can easily receive real-world usage feedback for future development, including sales presentation completion rates, scene dwell times, AI assistant usage, the user engagement, and Net Promoter Score (NPS). For various stakeholder groups, such as project teams, management, and compliance officers, reports can be customized or exported.
The asset preview and validation module (130) is configured to facilitate automated validation checks for file compatibility, size limits, required metadata, licensing status, and device performance targets. This mechanism facilitates critical quality assurance that enables the user to examine the constructed 3D assets in detail before integrating them into projects.
In yet another embodiment of the present disclosure, said asset preview and validation module (130) provides interactive previews for 3D models, images, videos, and audio files, ensuring that the constructed 3D assets appear and perform as expected. If issues in the performance of the constructed 3D assets are detected, the user is given warnings and suggestions, thereby the user can resolve potential issues before publishing. Real-time previews support various device resolutions, lighting conditions, and interaction modes, embodying a “what you see is what you get” approach. This empowers the user to make informed decisions and maintain high standards of visual and functional quality across all sales presentation experiences.
Said assistance module (132) is configured to provide automation and creative suggestions during content creation by utilising various AI techniques. Said AI techniques may include, but not limited to:
Natural Language Processing (NLP): Understands the user instructions, drafts scene/voiceover descriptions, translates text;
Recommendation Systems: Suggests scene structures, asset usage, workflow logic based on the end-user;
Real-time AI Assistance: A multi-model, large language model (LLM)-based AI system (voice-to-voice) provides voice responses to all user queries. The AI response dynamically triggers the playback of 3D visuals;
Auto-tagging/Classification: Automatically tags or categorizes the digital assets based on the content and context;
Speech Synthesis (text-to-speech): Generates voiceover scripts from text; and
Translation/Localization AI: Provides instant multi-language support.
Figure 16 illustrates an "XRAutoFlow" mechanism of the assistance module (132). Said XRAutoflow mechanism provides an authoring environment that is configured to help users, especially non-programmers to rapidly create interactive, AI-driven XR (extended reality) experiences, such as sales presentations or virtual product demos.
Further, the assistance module (132) can recommend scene structures, draft workflow logic, suggest or auto-tag assets, generate voiceover scripts, translate content into multiple languages, and identify compliance or accessibility risks. The users can interact through chat or push-to-talk interface, asking questions or requesting help with specific content challenges.
Figure 17a and Figure 17b illustrate an AI-generated multi-language flow and AI-generated audio, respectively, produced through AI techniques.
In yet another embodiment of the present disclosure, all outputs generated through AI are presented as suggestions, which the creators can accept, edit, or reject, with actions logged for transparency and compliance. This capability boosts productivity, accelerates time-to-market, and helps ensure best practices, which is especially valuable for teams scaling sales presentation production or working in unfamiliar subject matter.
When the project is ready, said project is exported to the command centre (102) as illustrated in Figure 14a.
Said export, encryption, and storage module (126) exports the project comprising all the constructed scenes, 3D assets, workflows, and metadata into a single compressed package in .onexr format, when the sales presentation is completed. Said package is encrypted using the AES-256 technique, ensuring that sensitive business information is protected both at rest and in transit.
The output from the creator unit (104) is a .onexr file. Said .onexr file is a text-based JSON container that includes (a simplified illustrative sketch is shown after the list below):
Scene Graph: Node hierarchy defining 3D assets and their initial transforms.
Asset References: Uniform Resource Identifier (URI) for 3D meshes, textures, spatial‑audio files, and haptic patterns.
Persona Triggers: Rules mapping user profile data (industry, role, past interactions) to content branches.
Narration Scripts: Text prompts and voiceover URIs.
Routing Rules based on AI techniques: Conditions under which the machine learning model should switch scenes or present supplemental visuals.
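The following is a simplified, hypothetical sketch of what such a JSON container might look like, expressed as a Python dictionary serialised to JSON. The key names, asset URIs, and values are illustrative assumptions and do not represent the actual .onexr schema.

```python
# Hypothetical sketch of a .onexr-style container; key names, URIs, and values are assumptions.
import json

onexr_package = {
    "scene_graph": [
        {"node": "excavator_model", "asset": "assets/excavator.glb",
         "transform": {"position": [0, 0, 0], "rotation": [0, 90, 0], "scale": 1.0}},
    ],
    "asset_references": {
        "excavator.glb": "https://assets.example.com/excavator.glb",
        "intro_voiceover.mp3": "https://assets.example.com/intro_voiceover.mp3",
    },
    "persona_triggers": [
        {"if": {"industry": "mining", "role": "operations"}, "branch": "durability_branch"},
    ],
    "narration_scripts": [
        {"scene": "intro", "text": "Welcome to the machine walk-around.",
         "voiceover": "intro_voiceover.mp3"},
    ],
    "routing_rules": [
        {"when": "user_intent == 'fuel efficiency'", "go_to": "fuel_efficiency_scene"},
    ],
}

print(json.dumps(onexr_package, indent=2))   # the container is plain, text-based JSON
```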
The routing rules within the .onexr file are logic mechanisms that determine how the user navigates or is guided through an interactive sales presentation. The AI techniques used in routing rules are explained as follows.
Scene Metadata and Content Typing: Each scene in the .onexr file includes a detailed description of the digital assets and playback duration. This metadata describes what each scene offers and how it can be sequenced within a larger interactive workflow.
Dynamic Input and Personalization technique: Input requirements (such as current answers, real-time user needs, or context) are received dynamically as the sales presentation progresses. Said dynamic input and personalization technique evaluates this data using AI techniques, such as natural language processing (NLP) or structured data analysis, to determine the user’s intent, interests, or current knowledge level.
AI-driven Decision Logic (Routing Rules): The routing rules act as a decision engine, where AI techniques, such as rule-based systems, decision trees, or recommendations powered by LLMs, are used to decide which scene should play next. For example, if the user expresses interest in “fuel efficiency”, the AI techniques skip unrelated content and immediately route the user to the most relevant 3D animation, chart, or visual. The rules can also consider the user’s industry, job role, answers to previous questions, or even detected sentiment/emotion. The AI techniques continuously analyse the current context and inputs, making routing decisions in real time.
Dynamic Asset and Animation Selection: The AI technique selects and triggers the most appropriate digital asset and sets the correct playback duration and sequence, based on the routing rule outcomes. This ensures that the user always experiences the most relevant, engaging, and efficient path through the content, improving both engagement and learning/sales outcomes.
The routing rules may be manual or AI-driven. In manual routing, every step is fixed, whereas in AI-driven routing, the AI decides the steps based on context, persona, and queries.
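A hedged sketch contrasting manual and AI-driven routing as described above is given below. The detect_intent function is a stand-in for whichever NLU/LLM component the framework actually uses, and the rule and scene names are assumptions.

```python
# Sketch only: detect_intent() stands in for an NLU/LLM component; rule and scene names are assumptions.
MANUAL_SEQUENCE = ["intro", "fuel_efficiency_scene", "safety_scene", "pricing_scene"]

AI_ROUTING_RULES = [
    {"intent": "fuel efficiency", "go_to": "fuel_efficiency_scene"},
    {"intent": "safety",          "go_to": "safety_scene"},
    {"intent": "pricing",         "go_to": "pricing_scene"},
]

def next_scene(mode: str, current: str, user_query: str, detect_intent) -> str:
    if mode == "manual":
        # Manual routing: every step is fixed, so simply advance through the sequence.
        index = MANUAL_SEQUENCE.index(current)
        return MANUAL_SEQUENCE[min(index + 1, len(MANUAL_SEQUENCE) - 1)]
    # AI-driven routing: interpret the spoken/typed query and route to the most relevant scene.
    intent = detect_intent(user_query)            # e.g. returns "fuel efficiency"
    for rule in AI_ROUTING_RULES:
        if rule["intent"] == intent:
            return rule["go_to"]
    return current                                # no match: stay on the current scene
```

Each routing decision would then be logged and synced back to the command centre (102) for analytics and review, as described below.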
The AI techniques used in decision making are as follows: Natural Language Understanding (NLU) interprets the verbal (or spoken) queries of the end-user; Persona Context Matching checks the profile of the end-user from the command centre (102); a Recommendation Engine suggests the most relevant content; Reinforcement Learning improves routing over time by learning from engagement data; and Branching Logic decides which node is to be activated next.
Every decision of the AI techniques is logged and synced back to command centre (102) for analytics and review.
Additionally, the local storage is also encrypted to comply with enterprise data security policies. Each export includes versioned manifests, checksums for integrity validation, and options for embedding or referencing assets (cloud vs. local). The export process integrates directly with the command centre (102), enabling automated upload, assignment, and tracking within the organization's workflow. In the event of any data loss, corruption, or compliance review, audit logs and version histories ensure traceability and rapid recovery.
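A minimal sketch of the export step described above, producing a versioned manifest with a checksum and encrypting the package with AES-256, is shown below. The manifest fields are illustrative, and the third-party `cryptography` package is an assumed dependency; key management would in practice be handled by the platform.

```python
# Illustrative sketch; manifest fields are assumptions and `cryptography` is an assumed dependency.
import hashlib
import json
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def export_package(package_bytes: bytes, version: str):
    """Return (ciphertext, manifest, key): an AES-256-GCM encrypted package plus its versioned manifest."""
    manifest = {
        "version": version,
        "sha256": hashlib.sha256(package_bytes).hexdigest(),  # integrity check before/after transfer
        "size": len(package_bytes),
    }
    key = AESGCM.generate_key(bit_length=256)                  # AES-256 key (platform-managed in practice)
    nonce = os.urandom(12)
    ciphertext = nonce + AESGCM(key).encrypt(nonce, package_bytes, None)
    return ciphertext, manifest, key

# Usage: ciphertext, manifest, key = export_package(open("demo.onexr", "rb").read(), "1.0.3")
# The manifest (as JSON) travels with the upload so the command centre can validate integrity.
```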
After exportation, the project is published in the command centre (102), as illustrated in Figure 14b.
Said project management module (116) is configured to manage the lifecycle of the sales presentation and the user activities from inception to delivery of said sales presentation. Each of the sales presentations (i.e. projects) serves as a top-level container for associated digital assets, workflows, metadata, and team assignments.
In yet another embodiment of the present disclosure, said project management module (116) enables the users to create new sales presentations from scratch, from organizational templates, or by duplicating existing projects.
Said project management module (116) allows the sales presentations to be grouped, tagged, filtered, sorted, favourited, and pinned for efficient and easy retrieval.
In yet another embodiment of the disclosure, said project management module (116) presents a project dashboard displaying visual summaries of project status, recent activity logs, and team participation metrics. Furthermore, said project management module (116) supports assignment of multiple user roles to projects, including editor, reviewer, and the sales manager, with permissions managed via role-based access control (RBAC), and provides lifecycle management functions, including archiving, soft-delete, and reactivation of projects.
Figure 15a and Figure 15b illustrate project sharing and creating a new project, respectively, in the creator unit (104).
By centralizing all the sales presentation and team management activities, said project management module (116) enables streamlined collaboration, efficient handoffs, and transparent project tracking from inception to delivery.
Said asset management module (118) is configured to organise, store, optimise, and distribute digital assets used in content creation. Said asset management module (118) integrates directly with a creator application, ensuring that the 3D assets are available whenever and wherever they are needed. Supported 3D asset types include 3D models, images, videos, audio files, 360° media, and interactive elements, and these are managed with rich metadata such as tags, licensing information, usage history, and approval status.
In yet another embodiment of the present disclosure, said 3D assets can be imported through various methods such as drag-and-drop, bulk upload, or directory synchronization. Upon importation, various automated optimization tools are configured to compress textures, convert formats, and recommend performance improvements based on target devices.
In yet another embodiment of the present disclosure, said asset management module (118) is configured with built-in version control and asset locking workflows that facilitates to prevent conflicts and support content review and approval cycles.
In yet another embodiment of the present disclosure, said asset management module (118) facilitates real-time asset synchronization across distributed teams and devices, while enforcing organization-wide permissions and usage policies.
In yet another embodiment of the present disclosure, said asset management module (118) can be integrated with external repositories, Computer-Aided Design (CAD) libraries, or Digital Asset Management (DAM) systems through a plugin API.
In yet another embodiment of the present disclosure, said asset management module (118) broadly comprises: a user authentication module (134); a versioning and collaboration module (136); and an internationalisation module (138).
In yet another embodiment of the present disclosure, said asset management module (118) analyses asset usage to identify popular, unused, or duplicate items, enabling smarter content investment decisions.
The user authentication module (134) is configured to manage user authentication and access control across the framework (100). Said user authentication module (134) supports various enterprise-grade sign-in methods, including OAuth2, SSO, SAML, and integration with identity providers like Azure AD or Google Workspace.
Further, user provisioning can be done manually, by bulk import, or automatically through integration with HR/CRM systems. Roles and permissions of the user are centrally managed, giving the admins fine-grained control over who can access, create, modify, export, or approve the sales presentation at the app, project, or asset level. This ensures security, auditability, and compliance in organizations of all sizes.
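The following minimal Python sketch illustrates a role-based permission check of the kind described above; the role names, scopes, and the (role, scope) to permission mapping are hypothetical, and a real deployment would manage these centrally through the command centre (102).

# Hypothetical (role, scope) to permission mapping; a real deployment loads this
# from the centrally managed roles and permissions in the command centre (102).
ROLE_PERMISSIONS = {
    ("editor", "project"): {"access", "create", "modify"},
    ("reviewer", "project"): {"access", "approve"},
    ("sales_manager", "app"): {"access", "export", "approve"},
}


def is_allowed(role: str, scope: str, action: str) -> bool:
    """Check whether a role may perform an action at the given scope (app, project, or asset)."""
    return action in ROLE_PERMISSIONS.get((role, scope), set())


assert is_allowed("reviewer", "project", "approve")
assert not is_allowed("reviewer", "project", "export")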
Said versioning and collaboration module (136) tracks and manages every change made to a project by various users, supporting multi-user collaboration even across distributed teams. Every edit, comment, and review of the sales presentation is tracked with a full version history, enabling the users to compare changes, revert to previous versions, and resolve conflicts. Further, real-time co-editing, presence indicators, and in-app chat support live collaboration. Review and approval workflows help the teams to meet quality control and regulatory requirements. Task assignment and activity feeds help the teams to stay organized and on schedule. Exportable audit trails meet the needs of compliance teams and enable transparent post-mortems or continuous improvement cycles.
Said internationalisation module (138) is configured with: a user interface, accessibility mechanisms, and internationalisation mechanisms.
Said user interface is designed for clarity and intuitive navigation, including contextual help, interactive wizards, and comprehensive onboarding processes to minimize learning curves for users of varying skill levels;
Said accessibility mechanisms comprise full keyboard navigation, screen reader compatibility, adjustable contrast settings, and support for alternative text and ARIA roles, thereby ensuring the platform is usable by individuals with disabilities;
Said internationalisation mechanisms provide: a multi-language user interface, support for right-to-left language rendering, dynamic content translation facilitated by an assistance module (132), and culturally aware presentation templates.
Said hub unit (106) is configured as a unified delivery platform for presenting the sales presentations to clients (i.e. end users). Further, said hub unit (106) orchestrates real-time and asynchronous collaboration, and enables seamless data flow, unified access, and coordinated control across diverse environments.
Said hub unit (106) broadly comprises: the presenter application (140); the VR application (142); and the web application (144), as illustrated in Figure 18.
During the sales presentation, the presenter application (140) operates on an external device, such as a tablet or laptop, utilized by the presenter. The VR application (142), executed on an external device, namely a VR headset, displays the sales presentation to enable the client to view the sales presentation. Furthermore, the web application (144) is employed by the sales manager from various remote locations, using external devices such as laptops or desktops, to monitor the sales presentations in real-time. The hub unit (106) is capable of retrieving digital assets directly from the asset storage module (112) and caching these assets for offline access as required.
Figure 19 illustrates the live control and monitoring interface of the presenter application (140), which enables the presenter to remotely guide, monitor, and interact with the end-user's VR experience in real-time.
When the presenter wants to deliver the sales presentation of a product, such as heavy machinery, a heavy-duty vehicle, and/or the like, said presenter can open the presenter application (140) by logging in with their user credentials. The presenter application (140) lets them control and showcase an interactive, tailored sales presentation to the end-user/the client.
Further, the presenter can access pre-built product demos, which include 3D models, interactive content, and multimedia tailored for specific products or client needs. The presenter application (140) synchronizes with the command centre (102) to ensure that the content updates, sales presentation assignments, and analytics are up to date. It enforces role-based permissions so that only authorized presenters can access and deliver the sales presentations.
In yet another embodiment of the present disclosure, the presenter application (140) can be operated in various environments such as onsite, remote, online, or offline, allowing flexibility regardless of connectivity. The presenter application (140) is communicatively associated with a console connected to the VR headset, through the VR application (142). Thus, the presenter can control key aspects such as switching scenes or viewpoints, toggling visual mechanisms (like materials or outlines), adjusting lighting or environment settings, and enabling/disabling interaction modes.
The presenter application (140) supports both online and offline operation, enabling it to connect to the VR console either through Wi-Fi (local network) connectivity or a local router network without internet.
For both online and offline operation, the presenter application (140) and the VR headset should be connected to the same Wi-Fi network for optimal communication.
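A minimal sketch, assuming a hypothetical newline-delimited JSON command channel over TCP on the shared local Wi-Fi network, is given below in Python; the actual transport, address, and message schema between the presenter application (140) and the VR console may differ.

import json
import socket

# Hypothetical address of the VR console on the shared local Wi-Fi network.
VR_CONSOLE_ADDRESS = ("192.168.1.50", 9100)


def send_presenter_command(command: str, payload: dict) -> None:
    """Send a single control command (e.g. switch scene, toggle outlines) to the VR console."""
    message = json.dumps({"command": command, "payload": payload}).encode("utf-8")
    with socket.create_connection(VR_CONSOLE_ADDRESS, timeout=5) as conn:
        conn.sendall(message + b"\n")  # newline-delimited JSON frames


if __name__ == "__main__":
    send_presenter_command("switch_scene", {"scene_id": "cab_interior"})
    send_presenter_command("toggle_outline", {"enabled": True})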
The sales presentation allows the presenter to display product features and benefits, walk through use cases, and respond to client questions dynamically. The session can be recorded or analysed later through analytics tools to improve the effectiveness of the sales presentation.
Mechanisms available within the presenter application (140) include: session management; live control and monitoring (VR stream); a chat assistance interface; multi-user collaboration; analytics and recording; and accessibility and customization.
When the presenter schedules a VR sales presentation (session) with the prospective client, the content package and participant list are pre-assigned. If the session is interrupted, both parties can resume later without losing progress. Session history and engagement metrics are available for review afterward.
Session management mechanism tracks the status of each sales presentation session, whether scheduled, in-progress, or completed, and helps the presenters maintain order even in high-volume or multi-client environments. Integrated dashboards display sales presentation session histories, attendee lists, and key milestones, providing presenters with actionable insights at a glance. The mechanism is tightly linked with user authentication and role-based permissions, ensuring only authorized personnel can initiate or join specific sales presentation sessions. Additionally, the sales presentation session management facilitates compliance and reporting by logging attendance, participant engagement, and sales presentation session outcomes. This structure streamlines preparation, delivery, and follow-up, allowing organizations to offer consistent, high-quality interactions while retaining full oversight and auditability.
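For illustrative purposes, the following Python sketch models the session statuses and milestone logging described above; the class and field names are hypothetical and do not represent the actual implementation.

from dataclasses import dataclass, field
from enum import Enum


class SessionStatus(Enum):
    SCHEDULED = "scheduled"
    IN_PROGRESS = "in-progress"
    COMPLETED = "completed"


@dataclass
class PresentationSession:
    session_id: str
    presenter: str
    participants: list[str]
    status: SessionStatus = SessionStatus.SCHEDULED
    milestones: list[str] = field(default_factory=list)

    def start(self) -> None:
        """Mark the session as in-progress and log the milestone."""
        self.status = SessionStatus.IN_PROGRESS
        self.milestones.append("session started")

    def complete(self, outcome: str) -> None:
        """Mark the session as completed and log its outcome for reporting."""
        self.status = SessionStatus.COMPLETED
        self.milestones.append(f"completed: {outcome}")


session = PresentationSession("S-001", "presenter@example.com", ["client@example.com"])
session.start()
session.complete("follow-up demo requested")
print(session.status, session.milestones)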
As illustrated in Figure 20, the live control and monitoring mechanism empowers the presenters to deliver guided, responsive, and interactive experiences by providing real-time visibility into the VR user's perspective. The presenter application (140) streams a live feed of the VR user's display, enabling the presenter to follow every action and reaction of the end users. The presenters can dynamically navigate the user through complex scenes, control the flow of content, and intervene whenever necessary to clarify information or resolve issues.
Tools within this mechanism allow the presenters to switch scenes, pause or replay sections, and highlight specific elements in the VR environment, mirroring the intuitive controls of a physical presentation. In addition, the sales managers or sales trainers can supervise or assist the sales representatives/presenters remotely. All user interactions, prompts, and interventions are logged for post-demo session review, quality assurance, and continuous improvement.
The chat assistance interface transforms the presenter’s workflow by offering on-demand, context-aware intelligence throughout every sales presentation session. Powered by advanced language models and connected to curated knowledge bases, this assistant can answer technical product queries, suggest mechanism highlights, and recommend best practices in real time.
For example, the presenter may be asked about a technical feature mid-demo. The presenters interact with the chat assistance interface, either typing or using voice input, and receive immediate responses tailored to the current context of the sales presentation session. The chat assistance interface can handle routine questions, surface sales scripts, propose persuasive talking points, or provide regulatory and compliance guidance as needed.
For less experienced presenters, the chat assistance interface acts as a virtual coach, ensuring accurate and consistent messaging. It can also suggest scene transitions or recommend relevant content when user interest or engagement dips. All the chat interactions are recorded for compliance, analytics, and future content improvement. By automating routine inquiries and delivering actionable insights instantly, the chat assistance interface reduces workload of the presenter, speeds up response times, and ensures that every client receives accurate, up-to-date information.
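The following non-limiting Python sketch illustrates how a context-aware prompt could be assembled from the live session state and curated knowledge snippets before being passed to a language model; llm_answer is a hypothetical stub standing in for whichever model or provider backs the chat assistance interface.

def llm_answer(prompt: str) -> str:
    """Hypothetical stub for the underlying language-model call; any provider could back it."""
    return f"[answer generated for prompt of {len(prompt)} characters]"


def chat_assist(question: str, session_context: dict, knowledge_snippets: list[str]) -> str:
    """Assemble a context-aware prompt from the live session state and curated knowledge."""
    context_lines = [f"{key}: {value}" for key, value in session_context.items()]
    prompt = (
        "You are a sales presentation assistant.\n"
        "Current session context:\n" + "\n".join(context_lines) + "\n"
        "Relevant knowledge:\n" + "\n".join(knowledge_snippets) + "\n"
        f"Presenter question: {question}"
    )
    return llm_answer(prompt)


print(chat_assist(
    "What is the maximum digging depth?",
    {"scene": "excavator_demo", "client": "mining prospect"},
    ["<excerpt from the relevant product specification sheet>"],
))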
The multi-user collaboration enables multiple stakeholders to participate, observe, or contribute to immersive sales presentation sessions simultaneously. This mechanism is essential for sales teams working together on complex deals, the sales managers supervising new presenters, and/or trainers co-facilitating onboarding and/or workshops.
Roles such as primary presenter, observer, the sales manager, or coach, facilitated through AI techniques, can be assigned dynamically, with each user's permissions clearly defined. Collaboration mechanisms include: live annotation, where multiple users can draw, highlight, or leave notes within the sales presentation session; chat channels for behind-the-scenes coordination; and control handoff, allowing seamless transfer of the sales presentation session control when needed. Presence indicators show who is currently active, while user management tools facilitate invitations, removals, and role changes on the fly. All collaborative actions, such as annotations, comments, and control transfers, are logged for later review and quality assurance. By supporting distributed teams and enabling real-time feedback, coaching, or escalation, multi-user collaboration increases productivity, enhances training, and ensures the highest standard of delivery in every sales presentation session.
The analytics and recording mechanism supports measurement of effectiveness, compliance, and continuous improvement. During each sales presentation session, the framework automatically logs key metrics such as scene transitions, time spent per section, user interactions, and usage of the chat assistance interface. This data is aggregated into intuitive dashboards that presenters and the sales managers can access in real time or after the sales presentation session. Also, each sales presentation session can be recorded (video, audio, screen capture) for replay, quality assurance, performance review, and regulatory compliance.
Figure 21a and Figure 21b illustrate analytics and sales presentation session recording, respectively, in the presenter application (140).
Feedback from participants, including poll results, ratings, or open-ended comments, is also captured and linked to session analytics. Recording capabilities ensure transparency and accountability, supporting internal training, certification, or client reporting requirements. All analytics and recordings are stored securely, with role-based access controls and compliance mechanisms such as export for legal review or audit. This comprehensive visibility into every aspect of sales presentation session delivery transforms data into actionable intelligence for business growth and operational excellence.
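As a non-limiting illustration, the following Python sketch logs timestamped session events and aggregates them into per-session counts of the kind a dashboard might display; the event names and in-memory log are hypothetical, and a real deployment would persist events securely with role-based access controls.

import json
import time
from collections import Counter

EVENT_LOG: list[dict] = []


def log_event(session_id: str, event_type: str, detail: str) -> None:
    """Record a timestamped session event (scene transition, interaction, chat usage, etc.)."""
    EVENT_LOG.append({
        "timestamp": time.time(),
        "session_id": session_id,
        "event": event_type,
        "detail": detail,
    })


def summarise(session_id: str) -> dict:
    """Aggregate logged events into simple per-session counts for a dashboard view."""
    events = [e for e in EVENT_LOG if e["session_id"] == session_id]
    return dict(Counter(e["event"] for e in events))


log_event("S-001", "scene_transition", "exterior -> cab_interior")
log_event("S-001", "chat_assist", "asked about hydraulic pressure")
print(json.dumps(summarise("S-001"), indent=2))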
The accessibility and customization mechanisms provide accessibility tools such as: screen readers, keyboard navigation, font scaling, high-contrast color schemes, and multi-language interfaces. The presenters with visual, auditory, or motor impairments can configure the presenter application (140) for optimal usability, ensuring compliance with WCAG 2.1 AA and other global standards.
Customization options allow the user to personalize dashboards, notifications, and control layouts, as well as localize the interface for different languages or regions. Organizations can brand the presenter application (140) with custom logos, color palettes, and workflows, providing a seamless extension of their corporate identity.
The VR application (142) allows end-users and sales teams to experience immersive, interactive 3D presentations and training in the field, even offline. The VR headset comprises: a renderer; an input listener; and a multi-sensory subsystem.
The renderer loads the scene graph, applies Physically Based Rendering (PBR) materials, and streams 360° video. The input listener captures gaze vectors, gesture intents, and hand-tracking or controller events. The multi-sensory subsystem dispatches haptic commands to gloves/vests and spatial-audio cues.
Mechanisms available within the VR application (142) include: user authentication; dashboard / product listing; collaboration & feedback; AI voice assistant; end-user interaction / accessibility (VR hand gesture/controller); auto animation generator; and real-time environment change.
The user authentication mechanism serves as a security gatekeeper, ensuring that only authorized user (presenter) can login to the VR application (142), as illustrated in Figure 22. For this purpose, robust enterprise protocols such as OAuth2, SSO, or device-bound passcodes are used. The user authentication mechanism supports online authentication for real-time credential validation and offline fallback via securely cached tokens, accommodating remote or field deployments without network connectivity.
The user roles and permissions are enforced at login, controlling what content, mechanisms, and the sales presentation each user can access. This granular control is essential for regulated industries or large organizations with multiple user tiers (e.g., sales, trainers, the sales managers, customers). Audit trails track all login attempts and sales presentation session durations, supporting compliance and incident review. For higher security, two-factor authentication or biometric checks may be integrated.
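The following minimal Python sketch illustrates an offline fallback using a securely cached token with an expiry time; the cache location and token format are hypothetical, and a real deployment would encrypt the cached credentials and bind them to the device.

import json
import time
from pathlib import Path

TOKEN_CACHE = Path("vr_token_cache.json")  # hypothetical local cache location on the headset


def cache_token(token: str, expires_in_seconds: int) -> None:
    """Store a validated token locally so the headset can authenticate offline later."""
    TOKEN_CACHE.write_text(json.dumps({
        "token": token,
        "expires_at": time.time() + expires_in_seconds,
    }))


def offline_login() -> str | None:
    """Return the cached token if it is still valid; otherwise require online authentication."""
    if not TOKEN_CACHE.exists():
        return None
    cached = json.loads(TOKEN_CACHE.read_text())
    return cached["token"] if time.time() < cached["expires_at"] else None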
The dashboard and product listing offers a clear and organized overview of all available immersive experiences, as illustrated in Figure 23. When authenticated, the users are presented with a visually rich interface showcasing assigned content packages, categorized by use case, product, training module, or customer persona. Interactive thumbnails and metadata (such as package descriptions, last update, and progress status) make navigation intuitive, even for first-time VR users.
Figure 24a and Figure 24b illustrate the VR application (142) interface and the corresponding presenter application (140) interface, respectively, during a sales presentation to enable real-time review, editing, and confirmation of the requirements. This coordinated workflow allows both parties to collaboratively ensure data accuracy and enables the framework (100) to deliver a highly personalized, needs-based sales presentation.
The dashboard supports search, filters, and favourites for quick access to frequently used assets. Notifications highlight new assignments, updates, or expiring content. For enterprises, dashboards can be personalized to reflect organizational branding or user roles, ensuring that everyone sees the most relevant content first.
The product listings are dynamically updated based on real-time assignments from the command centre (102) or the creator unit (104), ensuring only up-to-date, approved experiences are accessible. By streamlining discovery and selection, the dashboard ensures the sales presentation sessions start quickly, reduces onboarding friction, and promotes broader adoption of VR content across the business.
The collaboration and feedback mechanism in the VR application (142) bridges the gap between immersive experiences and real-world teamwork. The end-users can share their session live with a presenter, trainer, or remote collaborator, enabling hands-on guidance, troubleshooting, or joint exploration. Real-time chat, voice, or even live annotation overlays support rich interaction and problem-solving. The system records every collaboration event, such as scene changes, presenter interventions, or user questions, enabling detailed analytics and continuous improvement. Integrated feedback forms, ratings, polls, or voice memos allow users to share their impressions and suggestions immediately after a session, capturing insights while they’re fresh. Feedback can trigger notifications to content authors or the sales managers for follow-up or iterative improvement.
The voice assistance mechanism enabled through AI techniques provides context-sensitive, on-demand virtual guidance, embedded directly in the VR Application (142). Said voice assistance mechanism can be activated through natural language either by push-to-talk or always-on listening, as illustrated in Figure 25.
Figure 26 illustrates chat functionality of the voice assistance mechanism, where the voice assistance mechanism answers technical questions and provides just-in-time coaching during live sessions. It can adapt responses based on user profile, session context, or detected behaviour (e.g., if a user seems stuck or disengaged). The voice assistance mechanism can also suggest scene transitions, propose relevant talking points, or escalate to a human presenter when needed. The knowledge base of the voice assistance mechanism is kept updated with the latest product, compliance, and sales information, ensuring accuracy and consistency. For training use cases, the voice assistance mechanism can quiz users, offer hints, or summarize key learning points, and every interaction is logged for analytics and compliance.
The end-user interaction and accessibility mechanisms support intuitive hand gestures, controller input (such as Oculus, Vive, Pico, etc.), gaze-based selection, and voice commands, as illustrated in Figure 26. These controls allow the end-users to move through scenes, select hotspots, manipulate objects, and trigger interactive elements seamlessly.
Accessibility options include customizable text size, high-contrast color schemes, multi-language audio or subtitles, and alternative input modes for users with motor or sensory impairments. An onboarding tutorial and contextual help overlays guide new users through basic controls and best practices, reducing anxiety and errors. The interface complies with WCAG 2.1 AA and other global standards, ensuring equitable access.
The auto animation generator mechanism is configured to create animations automatically, with the help of AI techniques, based on queries raised by the end-user, during the sales presentation session, in real-time, thereby making the session more interactive and realistic.
The workflow of the auto animation generator mechanism is explained with the help of Figure 30. When the end-user asks a query, said auto animation generator converts said query into an action sheet (e.g. a JSON file). For example, for the query “How does this excavator machine work in heavy material?”, the action sheet may include instructions such as: boom-stick rotates 40°; start digging motion; move excavator left 30°; and/or the like.
Based on the action sheet, said auto animation generator creates necessary animation, in real-time, without any pre-loaded video and/or manual setup. Subsequently, the created animation is applied to the sales presentation that is already running, in real-time.
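By way of illustration, the following Python sketch shows how the excavator query above may be converted into a JSON action sheet and replayed step by step; parse_query_to_actions is a hypothetical stand-in for the NLP/LLM pipeline described in the following paragraph, and apply_action_sheet stands in for the real-time runtime animation engine.

import json


def parse_query_to_actions(query: str) -> list[dict]:
    """Hypothetical stand-in for the NLP/LLM step that converts a spoken query into actions."""
    if "heavy material" in query.lower():
        return [
            {"target": "boom_stick", "action": "rotate", "degrees": 40},
            {"target": "bucket", "action": "start_digging_motion"},
            {"target": "excavator", "action": "move_left", "degrees": 30},
        ]
    return []


def apply_action_sheet(action_sheet: list[dict]) -> None:
    """Stand-in for the runtime animation engine that plays each action without pre-rendering."""
    for step in action_sheet:
        print(f"Animating {step['target']}: {step}")


query = "How does this excavator machine work in heavy material?"
sheet = parse_query_to_actions(query)
print(json.dumps(sheet, indent=2))   # the generated action sheet
apply_action_sheet(sheet)            # applied to the already-running presentation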
In yet another embodiment of the present disclosure, the AI techniques used in said auto animation generator mechanism and the creator unit (104) include: Natural Language Processing (NLP) and Large Language Models (LLMs) for parsing the queries and extracting the intent; Scene and Workflow Generation Engine for mapping parsed queries to workflow nodes and action nodes with keyframe data; Keyframe Animation Generation for creating keyframe sequences for 3D assets (camera moves, rotations, scaling) based on AI-produced JSON action sheets; AI-Driven Asset Binding for recommending 3D models to be animated (e.g., excavator boom, stick, bucket) by tagging assets during ingestion with AI classification models; Adaptive Workflow Logic for allowing real-time animation branching, and selecting or modifying animations dynamically based on customer queries; Speech-to-Animation Linkage for transcribing and passing the push-to-talk queries into the NLP and/or AI pipeline, and triggering an auto-generated animation sheet at runtime; and Real-Time Runtime Animation Engine for executing AI-generated animation JSON through API services, at runtime, and enabling on-the-fly animation playback without pre-rendering.
In yet another embodiment of the present disclosure, the disclosed framework (100) can mix workflow-driven and AI-driven animations, as per requirement.
The real-time environment change mechanism is configured to change the environment of the sales presentation session, in real-time, based on queries raised by the end-user.
For example, if the end-user asks “How does this excavator work in hard weather conditions?”, the real-time environment change mechanism changes the environment of the sales presentation session to a stormy, dark, heavy-weather environment and shows the machine in action under those conditions.
Said real-time environment change mechanism works based on a Dynamic Environment Framework, which allows the framework (100) to: understand the context of the end-user’s query; switch the environment of the sales presentation, in real-time, to match the context (e.g. sunny, rainy, day, night, city, construction site, etc.); and play animations and visual effects (e.g. rain, fog, dust, or lighting changes) to make the experience feel real and immersive.
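As a non-limiting illustration, the following Python sketch maps context keywords in the end-user's query to environment presets and effects; the keyword list and preset values are hypothetical, and in practice the context would be resolved by the NLP/AI pipeline rather than simple keyword matching.

# Hypothetical mapping from query context keywords to environment presets and effects.
ENVIRONMENT_PRESETS = {
    "hard weather": {"sky": "stormy", "lighting": "dark", "effects": ["rain", "fog"]},
    "night": {"sky": "clear_night", "lighting": "low", "effects": ["headlights"]},
    "construction site": {"sky": "sunny", "lighting": "bright", "effects": ["dust"]},
}


def resolve_environment(query: str) -> dict:
    """Pick the environment preset whose context keyword appears in the end-user's query."""
    lowered = query.lower()
    for keyword, preset in ENVIRONMENT_PRESETS.items():
        if keyword in lowered:
            return preset
    return {"sky": "sunny", "lighting": "default", "effects": []}


print(resolve_environment("How does this excavator work in hard weather conditions?"))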
The web application (144) enables distributed team members, the sales managers, or remote experts to join, observe, and participate in sessions by providing chat, feedback, and real-time analytics.
Various mechanisms within the web application (144) include: session participation; collaboration tools; analytics; accessibility; and live streaming.
The session participation allows remote users including the sales managers, team members, clients, or prospects to actively join ongoing immersive sessions, as illustrated in Figure 27.
For example, when the sales manager wants to join the sales presentation session from a remote location, his/her access is securely managed, often through OTPs, SSO, passcodes, or invitation links, ensuring that only authorized participants can enter a session. The web interface is designed for clarity, allowing participants to follow the session’s flow, view live screen streams, and receive contextual updates as the session progresses. Participants can interact via Q&A, polls, or chat, fostering active engagement and making remote attendees feel as involved as those physically present. Session participation also supports different user roles, with permissions tailored to each role’s requirements.
The collaboration tools enable participants to contribute, interact, and communicate seamlessly during sessions. Key mechanisms include live chat for instant messaging, Q&A panels for structured question submission, and annotation tools that allow users to highlight or mark up the content in real time, as illustrated in Figure 28. Advanced options might also include “raise hand” mechanisms, screen sharing, and collaborative note-taking, where all participants can add or review shared notes linked to specific moments in the session. These tools help bridge the gap between in-person and remote engagement, fostering genuine teamwork regardless of location. The system often supports private chat for backstage coordination between presenters and public channels for group discussion. Downloadable reports, session notes, and tagged moments enable follow-up actions and post-session reviews.
Additionally, the collaboration suite is designed to ensure that even first-time users can contribute meaningfully without technical hurdles.
The analytics mechanisms in the web application (144) provide actionable insights into session engagement, participant behaviour, and content effectiveness. During and after each session, the system captures real-time analytic data such as: who attended the session, how long they participated, which scenes or features held their attention, which questions were asked, and how polls or quizzes were answered, as illustrated in Figure 29. This data is displayed in intuitive dashboards and can be filtered by role, session, or timeframe.
The sales managers can view engagement heatmaps, identify the most and least effective content, and monitor user trends over time. Analytics also extend to presenter performance, capturing metrics like response speed, time spent on topics, and follow-up actions taken. For compliance and continuous improvement, every interaction is timestamped and stored securely. Data can be exported in standard formats for integration with BI tools or CRM platforms, supporting wider business analysis and ROI tracking. Real-time analytics provide presenters with on-the-fly feedback to adjust delivery or address issues, while post-session analytics inform future strategy, content updates, and targeted coaching.
The accessibility mechanisms ensure that all users, regardless of ability, can participate fully in immersive sessions. The interface is built according to WCAG 2.1 AA standards and supports screen readers, high-contrast themes, keyboard-only navigation, scalable fonts, and Accessible Rich Internet Applications (ARIA) labels for assistive technology compatibility.
For example, a user with low vision can activate high-contrast mode and enlarged text upon joining a session. Multi-language support and right-to-left (RTL) layout options enable global teams and clients to access content in their preferred language and script. Tooltips, clear icons, and responsive layouts enhance usability across devices, including tablets and smartphones. The system also allows presenters to provide alternative text for visuals and closed captions for video or audio content, making sessions inclusive for users with hearing or visual impairments.
The live streaming mechanism in the web application (144) is a transformative mechanism that brings immersive experiences to remote audiences (for example, the sales manager) by live streaming the presenter’s screen as well as the VR screen in real time. This capability mirrors the VR user’s perspective or the presenter’s dashboard directly into any modern web browser, with minimal latency and high fidelity. The sales manager can switch between different views (e.g., VR screen, presenter’s screen), ensuring they always see the most relevant content. The streaming engine adapts to varying network conditions, providing smooth playback even in less-than-ideal environments.
The presenters can use live streaming to showcase products, conduct training, or provide technical support without geographic constraints. Live streaming creates a sense of presence, allowing remote audiences to follow along, react, and interact as if they were in the same room. Combined with collaboration and analytics tools, live streaming supports truly interactive, distributed sales presentation sessions, expanding reach, improving customer engagement, and supporting modern hybrid workflows. The sales presentation recordings can also be generated for those who missed the live event or need to review key moments, further increasing the value of each streamed session.
A shared and integration mechanism integrates the presenter application (140), the VR application (142), and the web application (144). These include secure authentication and Single Sign-On (SSO) across all the hub applications, robust role-based access control, encrypted data transmission, unified analytics/reporting, organization-wide notifications, audit trails, and automated content sync with the command centre (102) or the creator unit (104). The architecture is designed for extensibility, supporting plugins, API/webhook integrations, multi-language content, and multi-device session handoff. Edge case handling (e.g., offline operation, conflict resolution, error recovery) is managed seamlessly in the background, ensuring reliability and resilience. These shared mechanisms ensure a unified, secure, and high-performance sales presentation experience across different external devices.
The disclosed framework (100) offers at least the following advantages: is easy to use; is cost effective; offers unified asset management, multi‑sensory feedback, a low/no-code paradigm that enables beginners to create content, CRM integration, and presenter control mechanisms; and delivers next‑generation VR sales experiences by enhancing end-user engagement and personalisation.
Implementation of the disclosure can involve performing or completing selected tasks manually, automatically, or a combination thereof. Further, according to actual instrumentation of the disclosure, several selected tasks could be implemented, by hardware, by software, by firmware, or by a combination thereof, using an operating system. For example, as software, selected tasks, according to the disclosure, could be implemented, as a plurality of software instructions being executed, by a computer, using any suitable operating system.
A person skilled in the art will appreciate the fact that the framework (100), and its various components, may be implemented using any suitable technologies known in the art. Likewise, a person skilled in the art will also appreciate the fact that the configurations of the framework (100), and its various components, may be varied, based on requirements.
It will be apparent to a person skilled in the art that the above description is for illustrative purposes only and should not be considered as limiting. Various modifications, additions, alterations, and improvements, without deviating from the spirit and the scope of the disclosure, may be made, by a person skilled in the art. Such modifications, additions, alterations, and improvements, should be construed as being within the scope of this disclosure.
LIST OF REFERENCE NUMERALS
100 – Framework
102 - Command Centre
104 – Creator Unit
106 - Hub Unit
108 - Data Storage Module
110 – Audit and Event Logs Storage Module
112 – Asset Storage Module
114 – Content Authoring Module
116 - Project Management Module
118 - Asset Management Module
120 – Editing Module
122 – 3D Asset Visualization Engine
124 – Chart and Data Visualization Module
126 - Export, Encryption and Storage Module
128 - Analytics and Reporting Module
130 - Asset Preview and Validation Module
132 – Assistance Module
134 - User Authentication Module
136 - Versioning and Collaboration Module
138 - Internationalisation Module
140 – Presenter Application
142 – VR Application
144 – Web Application
, Claims:1. A framework (100) for personalised sales presentations, comprising:
a command centre (102) that is a centralised platform for monitoring, managing, and coordinating operations of the framework (100);
a creator unit (104) that is a content creation platform for a user, for creating a new personalised sales presentation; and
a hub unit (106) that is a unified delivery platform for presenting the sales presentation to a client, with:
said command centre (102), said creator unit (104), and said hub unit (106) being communicably associated with each other; and
operations of said creator unit (104) and said hub unit (106) being managed, and controlled, by the command centre (102).
2. The framework (100) for personalised sales presentations, as claimed in claim 1, wherein: said command centre (102) comprises:
a data storage module (108) that stores: security, compliance, and audit data; user-role management data; organization and account management data; and customer relationship management and integration layer data;
an audit and logs storage module (110) that provides an immutable and tamper-evident record of every action within said command centre (102); and
an asset storage module (112) that is a digital warehouse to securely store, organize, and deliver digital assets.
3. The framework (100) for personalised sales presentations, as claimed in claim 2, wherein: said security, compliance and audit data includes multi-factor authentication, role-based access control, encryption at rest and in transit, vulnerability scanning, and proactive security posture reviews.
4. The framework (100) for personalised sales presentations, as claimed in claim 2, wherein: said user-role management data includes user profiles and role category of each user.
5. The framework (100) for personalised sales presentations, as claimed in claim 3, wherein: user access is limited by organization, team, project, region, or individual asset.
6. The framework (100) for personalised sales presentations, as claimed in claim 2, wherein: said organization and account management data includes: data related to registering and managing clients, data related to an admin’s register, team memberships and the user account status and deactivation records.
7. The framework (100) for personalised sales presentations, as claimed in claim 2, wherein: mechanisms of said audit and event log storage module (110) include:
client subdomain and branding that allow each client to have a distinct presence within the command centre (102);
sales pitch assignment workflow that is configured to streamline the distribution and management of sales content across teams and users; and
admin tools that enable: to manage users, resources, services, security, and system configurations remotely, and content and asset oversight.
8. The framework (100) for personalised sales presentations, as claimed in claim 2, wherein: the digital assets include: 3D models, videos, images, audio files, the content packages, analytics exports, audit logs, hotspots, logic nodes, and animations.
9. The framework (100) for personalised sales presentations, as claimed in claim 1, wherein: said creator unit (104) comprises:
a content authoring module (114) that provides guided content authoring flow to assemble digital assets, with a low code/no-code paradigm, and enables the user to create and assemble interactive sales presentations;
a project management module (116) that manages lifecycle of the sales presentation, and the user’s activities, from inception to delivery of said sales presentation; and
an asset management module (118) that organises, stores, optimises, and distributes digital assets.
10. The framework (100) for personalised sales presentations, as claimed in claim 9, wherein: said content authoring module (114) comprises:
an editing module (120) that comprises: a scene editor that provides a visual canvas, for setting up scenes, placing, positioning, and configuring 3D objects, images, videos, and interactive elements; and a workflow editor that enables visual sequencing, branching logic insertion, and set-up of automated responses to the user inputs;
a three-dimensional (3D) asset visualization engine (122) that is configured to enable the user to assemble scenes constructed by the editing module (120), by dragging and dropping the developed 3D assets onto a virtual canvas;
a chart and data visualization module (124) that provides real-time rendering, asset grouping, snap-to-grid alignment, multi-level zoom, and device emulation for previewing scenes;
an export, encryption and storage module (126) that exports all the constructed scenes, 3D assets, workflows, and metadata into a single compressed package;
an analytics and reporting module (128) that tracks project creation rates, asset utilisation, workflow complexity, collaboration activity, and review cycles;
an asset preview and validation module (130) that is configured to facilitate automated validation checks for: file compatibility, size limits, metadata, licensing status, and device performance targets; and
an assistance module (132) that is configured to provide automation and creative suggestions during content creation.
11. The framework (100) for personalised sales presentations, as claimed in claim 9, wherein: said asset management module (118) comprises:
a user authentication module (134) that is configured to manage user authentication and access control across the framework (100); a versioning and collaboration module (136) that tracks and manages every change made to a project by the users; and an internationalisation module (138) that provides: multi-language user interface, support for right-to-left language rendering, dynamic content translation facilitated by an assistance module (132), and culturally aware presentation templates.
12. The framework (100) for personalised sales presentations, as claimed in claim 1, wherein: said hub unit (106) comprises:
a presenter application (140) that delivers sales presentations, with: said presenter application (140) being operated on a tablet or laptop; and said presenter application (140) supporting both online and offline operations;
a virtual reality application (142) that displays the sales presentations to enable the client to view the sales presentations, with: said virtual reality application (142) being operated on a virtual reality headset; and
a web application (144) that monitors the sales presentations by a sales manager, remotely, in real-time, with: said web application (144) being operated on the tablet or laptop.
13. The framework (100) for personalised sales presentations, as claimed in claim 12, wherein: mechanisms of said presenter application (140) include: session management; live control and monitoring; a chat assistance interface; multi-user collaboration; analytics and recording; and accessibility and customization.
14. The framework (100) for personalised sales presentations, as claimed in claim 12, wherein: mechanisms of said virtual reality application (142) include: user authentication; dashboard/product listing; collaboration and feedback; AI voice assistant; end-user interaction/accessibility; auto animation generator; and real-time environment change.
15. The framework (100) for personalised sales presentations, as claimed in claim 12, wherein: mechanisms of said web application (144) include: session participation; collaboration tools; analytics; accessibility; and live streaming.
16. The framework (100) for personalised sales presentations, as claimed in claim 1 or claim 9 or claim 10, wherein: the sales presentation created by the creator unit (104) is in .onexr file format, with: said .onexr file including: Scene Graph, Asset References, Persona Triggers, Narration Scripts, and Routing Rules.
17. The framework (100) for personalised sales presentations, as claimed in claim 16, wherein: said routing rules decide a scene to play next.
18. A framework (100) for personalised sales presentations, comprising:
a command centre (102) that is a centralised platform for monitoring, managing, and coordinating operations of the framework (100), said command centre (102) comprising:
a data storage module (108) that stores: security, compliance, and audit data; user-role management data; organization and account management data; and customer relationship management and integration layer data;
an audit and logs storage module (110) that provides an immutable and tamper-evident record of every action within said command centre (102); and
an asset storage module (112) that is a digital warehouse to securely store, organize, and deliver digital assets;
a creator unit (104) that is a content creation platform for a user, for creating a new personalised sales presentation, said creator unit (104) comprising:
a content authoring module (114) that provides guided content authoring flow to assemble digital assets, with a low code/no-code paradigm;
a project management module (116) that manages lifecycle of the sales presentation, and the user’s activities, from inception to delivery of said sales presentation; and
an asset management module (118) that organises, stores, optimises, and distributes digital assets; and
a hub unit (106) that is a unified delivery platform for presenting the sales presentation to a client, said hub unit (106) comprising:
a presenter application (140) that delivers sales presentations, with: said presenter application (140) being operated on a tablet or laptop; and said presenter application (140) supporting both online and offline operations;
a virtual reality application (142) that displays the sales presentations to enable the client to view the sales presentations and provided with an auto animation generator and a real-time environment change mechanism, with: said virtual reality application (142) being operated on a virtual reality headset; and
a web application (144) that monitors the sales presentations by a sales manager, remotely, in real-time, with: said web application (144) being operated on the tablet or laptop, with:
said command centre (102), said creator unit (104), and said hub unit (106) being communicably associated with each other; and
operations of said creator unit (104) and said hub unit (106) being managed, and controlled, by the command centre (102).
19. The framework (100) for personalised sales presentations, as claimed in claim 18, wherein: mechanisms of said audit and event log storage module (110) include:
client subdomain and branding that allow each client to have a distinct presence within the command centre (102);
sales pitch assignment workflow that is configured to streamline the distribution and management of sales content across teams and users; and
admin tools that enable: to manage users, resources, services, security, and system configurations remotely, and content and asset oversight.
20. The framework (100) for personalised sales presentations, as claimed in claim 18, wherein: the digital assets include: 3D models, videos, images, audio files, the content packages, analytics exports, audit logs, hotspots, logic nodes, and animations.
21. The framework (100) for personalised sales presentations, as claimed in claim 18, wherein: said content authoring module (114) comprises:
an editing module (120) that comprises: a scene editor that provides a visual canvas, where the user can set up scenes and place, position, and configure 3D objects, images, videos, and interactive elements; and a workflow editor that enables visual sequencing, branching logic insertion, and set-up of automated responses to the user inputs;
a three-dimensional (3D) asset visualization engine (122) that is configured to enable the user to assemble scenes constructed by the editing module (120), by dragging and dropping the developed 3D assets onto a virtual canvas;
a chart and data visualization module (124) that provides real-time rendering, asset grouping, snap-to-grid alignment, multi-level zoom, and device emulation for previewing scenes;
an export, encryption and storage module (126) that exports all the constructed scenes, 3D assets, workflows, and metadata into a single compressed package;
an analytics and reporting module (128) that tracks project creation rates, asset utilisation, workflow complexity, collaboration activity, and review cycles;
an asset preview and validation module (130) that is configured to facilitate automated validation checks for: file compatibility, size limits, metadata, licensing status, and device performance targets; and
an assistance module (132) that is configured to provide automation and creative suggestions during content creation.
22. The framework (100) for personalised sales presentations, as claimed in claim 18, wherein: said asset management module (118) comprises:
a user authentication module (134) that is configured to manage user authentication and access control across the framework (100); a versioning and collaboration module (136) that tracks and manages every change made to a project by the users; and an internationalisation module (138) that provides: multi-language user interface, support for right-to-left language rendering, dynamic content translation facilitated by an assistance module (132), and culturally aware presentation templates.
23. The framework (100) for personalised sales presentations, as claimed in claim 18, wherein: mechanisms of said presenter application (140) include: session management; live control and monitoring; a chat assistance interface; multi-user collaboration; analytics and recording; and accessibility and customization.
24. The framework (100) for personalised sales presentations, as claimed in claim 18, wherein: mechanisms of said virtual reality application (142) include: user authentication; dashboard/product listing; collaboration and feedback; AI voice assistant; and end-user interaction/accessibility.
25. The framework (100) for personalised sales presentations, as claimed in claim 18, wherein: mechanisms of said web application (144) include: session participation; collaboration tools; analytics; accessibility; and live streaming.
26. The framework (100) for personalised sales presentations, as claimed in claim 18, wherein: the sales presentation created by the creator unit (104) is in .onexr file format, with: said .onexr file including: Scene Graph, Asset References, Persona Triggers, Narration Scripts, and Routing Rules.
27. The framework (100) for personalised sales presentations, as claimed in claim 26, wherein: said routing rules decide a scene to play next.