Abstract: The invention describes a system and method for discovery, monitoring, and utilization of endpoint ecosystems for dynamic tool generation in LLM-based inference. The system provides for continuously discovering, analyzing, and updating a comprehensive knowledge graph of endpoint components (including applications, packages, tools, web-links, and APIs) based on system activity, network behavior, and installation metadata. It dynamically maps relationships and monitors changes to intelligently determine and generate the most appropriate endpoints as tools for task execution by a local or remote large language model (LLM). The system uses hybrid telemetry, semantic context extraction, graph reasoning, and multi-level orchestration to ensure full utilization of the endpoint ecosystem during inferencing. The system comprises an Endpoint Discovery Engine module, an Activity Telemetry Aggregator module, a Web Request Profiler module, a Knowledge Graph Constructor module, a Change Detection & Delta Engine module, a Semantic Intent Analyzer module, a Tool Orchestration Engine module, and an LLM Tool Adapter module.
Description: FIELD OF THE INVENTION
The present invention relates to intelligent systems and AI toolchains. More specifically, it relates to a system and method for continuously discovering, analyzing, and updating a comprehensive knowledge graph of endpoint components (including applications, packages, tools, web-links, and APIs) based on system activity, network behavior, and installation metadata.
BACKGROUND OF THE INVENTION
Large Language Models, such as those developed by OpenAI and similar entities, are increasingly being used to perform complex computational and decision-making tasks. Tool integration is typically carried out manually or via static registration mechanisms such as OpenAI specifications or predefined interfaces which often require developers to explicitly define each tool’s interface parameters, operational constraints, and intended use.
Some existing frameworks fail to incorporate system telemetry such as historical usage patterns, resource consumption metrics, and correlations of concurrent activities to optimize tool discovery and strategy formulation. As a result, they suffer from inefficiencies like redundant tool invocations, suboptimal resource utilization, and an inability to fully leverage the capabilities available across both local and networked computing environments.
The present invention provides a system and method for intelligent discovery, monitoring, and utilization of endpoint ecosystems to enable dynamic tool generation and orchestration during large language model inference. It leverages both static and dynamic system-level analysis techniques to continuously identify and monitor all executable endpoints. The system applies contextual and semantic reasoning to interpret the user’s intended operation and resolve the most relevant set of tools or endpoint pathways from a knowledge graph, thereby enabling adaptive and contextually appropriate tool invocation.
Prior Art:
For instance, US12039263B1 describes a system that enhances generative artificial intelligence by using generative models to supply external information to a pre-trained LLM for answering queries. To respond effectively, user queries are modified and augmented with additional relevant information, then divided into multiple sub-queries for parallel processing. The results are subsequently combined into a single response. However, the system lacks the capability to ingest documents, text, or data for the LLM to process. It also does not introspect system capabilities, lacks a knowledge graph of available tools, and does not support redundancy elimination or tool clustering.
US20250110807A1 describes a dynamic API discovery system for conversational interfaces, wherein a processor and memory execute instructions to generate prompts guiding the user, receive user conversation requests, select one or more relevant APIs based on the request and an assigned guidance policy, and compute a semantic alignment score to determine the relevance of the selected APIs. However, its scope is limited to surface-level conversational topic management and policy-based API matching. In contrast, the present invention offers a comprehensive framework for the intelligent discovery, monitoring, and utilization of endpoint ecosystems. It leverages system-level telemetry data such as process execution patterns, web request profiles, and resource consumption metrics to dynamically update a multi-layered knowledge graph. A semantic intent analyzer is employed to parse user prompts and map them to corresponding graph nodes representing executable tools.
US12061970B1 discloses a method for responding to natural language queries across data sources by utilizing a model orchestration LLM that considers user attributes such as roles and security parameters. Based on these attributes, the system generates context-aware queries, invokes machine learning agents, and produces a filtered natural language response. However, the system does not construct or maintain a knowledge graph, lacks telemetry-based discovery of tools, and does not dynamically generate invocation schemas or tool wrappers that are consumable by LLMs.
While existing systems make efforts to infer user intent and enhance enterprise data usability, they fall short of delivering a comprehensive framework for optimizing tool discovery and usage strategies. These systems typically lack both static and dynamic system-level analysis techniques necessary to identify executable endpoints, as well as contextual and semantic reasoning required for context-aware tool invocation. Moreover, they do not incorporate critical components such as time-weighted graphs to track behavioural patterns, mechanisms for redundancy elimination, or clustering of redundant endpoints.
DEFINITIONS
The expression “system” used hereinafter in this specification refers to an ecosystem comprising, but not limited to, a system with a user, input and output devices, a processing unit, a plurality of mobile devices, a mobile device-based application to identify dependencies and relationships between diverse businesses, a visualization platform, and output; and extends to computing systems such as mobiles, laptops, computers, PCs, etc.
The expression “input unit” used hereinafter in this specification refers to, but is not limited to, mobiles, laptops, computers, PCs, keyboards, mice, pen drives, or other drives.
The expression “output unit” used hereinafter in this specification refers to, but is not limited to, an onboard output device, a user interface (UI), a display kit, a local display, a screen, a dashboard, or a visualization platform enabling the user to visualize, observe or analyze any data or scores provided by the system.
The expression “processing unit” refers to, but is not limited to, a processor of at least one computing device that optimizes the system.
The expression “large language model (LLM)” used hereinafter in this specification refers to a type of machine learning model designed for natural language processing tasks such as language generation. LLMs are language models with many parameters, and are trained with self-supervised learning on a vast amount of text.
The expression “Endpoint Ecosystem” used hereinafter in this specification refers to all the usable tools, applications, scripts, packages, and web links present in a computing environment.
The expression “Endpoint Discovery Engine (EDE)” used hereinafter in this specification refers to a component that scans and finds all executable tools, services, and applications using both file-based and activity-based methods.
The expression “Activity Telemetry Aggregator (ATA)” used hereinafter in this specification refers to a system that collects real-time usage data (run-time behaviour, memory usage) of tools and applications running on a computer.
The expression “Web Request Profiler (WRP)” used hereinafter in this specification refers to a tool that monitors and analyses outgoing web requests from applications to understand how they interact with online services or APIs.
The expression “Knowledge Graph Constructor (KGC)” used hereinafter in this specification refers to a component that builds and updates the knowledge graph by adding nodes and edges.
The expression “Change Detection & Delta Engine (CDDE)” used hereinafter in this specification refers to a system that tracks changes like tool installations or removals and updates only the affected parts of the knowledge graph.
The expression “Semantic Intent Analyzer (SIA)” used hereinafter in this specification refers to an AI component that interprets what the user wants to do and converts it into a form that can be matched with available tools.
The expression “Tool Orchestration Engine (TOE)” used hereinafter in this specification refers to a smart engine that selects, ranks, and prepares the most suitable tools from the knowledge graph to execute the user's intent.
The expression “Telemetry” used hereinafter in this specification refers to data automatically collected from a system about how its components behave and interact during operation.
OBJECTS OF THE INVENTION
The primary object of the present invention is to provide a system and method for discovery, monitoring, and utilization of endpoint ecosystems for dynamic tool generation in LLM-based inference.
Another object of the invention is to enable intelligent discovery and classification of system endpoints (e.g., apps, packages, APIs [Application Programming Interface]) using static and dynamic analysis.
Another object of the invention is to build and update a knowledge graph capturing endpoint relationships, usage, and metadata for contextual optimization.
Another object of the invention is to detect system changes (installations, deletions, updates) and update only affected parts of the knowledge graph.
Another object of the invention is to extract user intent from natural language and match it with relevant system tools via semantic analysis.
Another object of the invention is to dynamically select and invoke suitable tools based on usage, similarity, availability, and recent activity.
Another object of the invention is to format selected tools for LLM use, such as JSON schemas or OpenAI/LangChain-compatible definitions.
SUMMARY
Before the present invention is described, it is to be understood that the present invention is not limited to specific methodologies and materials described, as these may vary as per the person skilled in the art. It is also to be understood that the terminology used in the description is for the purpose of describing the particular embodiments only and is not intended to limit the scope of the present invention.
The present invention describes a system and method for discovery, monitoring, and utilization of endpoint ecosystems for dynamic tool generation in LLM-based inference. The system dynamically maps relationships and monitors changes to intelligently determine and generate the most appropriate endpoints as tools for task execution by a local or remote large language model.
The system comprises an input unit, a processing unit, and an output unit, wherein the processing unit further comprises an endpoint discovery engine module, an activity telemetry aggregator module, a web request profiler module, a knowledge graph constructor module, a change detection & delta engine module, a semantic intent analyzer module, a tool orchestration engine module, and an LLM tool adapter module;
According to an aspect of the present invention, the method for discovery, monitoring, and utilization of endpoint ecosystems for dynamic tool generation in LLM-based inference comprises the steps of initializing a system environment to prepare components for endpoint discovery and telemetry collection; executing the Endpoint Discovery Engine to identify and enumerate executables, services, packages, and extensions within a computing environment using static inspection of file systems, registries, and manifests, and dynamic monitoring of process creation, installation events, and plugin loads; aggregating telemetry data by means of the Activity Telemetry Aggregator to capture runtime signals including process activity, resource utilization, execution duration, and parent–child process relationships from operating system logs and monitoring interfaces; profiling web requests by means of Web Request Profiler to intercept outbound traffic, associate such traffic with originating processes, and extract request-level metadata including endpoints, authentication headers, payload schemas, and timing information; constructing the Knowledge Graph comprising nodes representing applications, tools, scripts, or web endpoints and edges representing dependencies, co-usage, or functional similarity, the graph further enriched with metadata, temporal overlays, and semantic tags; detecting changes in the environment by means of the Change Detection and Delta Engine, including installation, removal, update, or modification of tools, and propagating incremental updates to the knowledge graph using snapshot hashing and delta compression; analysing semantic intent by means of Semantic Intent Analyzer to receive natural language input, extract task intent using machine learning models, and map said intent to candidate nodes of the knowledge graph; orchestrating tools by means of the Tool Orchestration Engine to query the knowledge graph, identify candidate tools corresponding to the 
detected intent, rank said tools using a scoring function comprising similarity, usage weight, availability, and graph centrality, and generate an execution schema; adapting tool schemas by means of the Large Language Model Tool Adapter to convert the execution schema into a structured representation comprising function name, arguments, output format, and invocation strategy, suitable for consumption by a large language model; providing the large language model with the user prompt and adapted tool schema to enable generation of execution commands; executing the task by invoking the selected tool or endpoint according to the generated execution command; and returning a response to the user corresponding to the execution result.
BRIEF DESCRIPTION OF DRAWINGS
A complete understanding of the present invention may be made by reference to the following detailed description which is to be taken in conjunction with the accompanying drawing. The accompanying drawing, which is incorporated into and constitutes a part of the specification, illustrates one or more embodiments of the present invention and, together with the detailed description, it serves to explain the principles and implementations of the invention.
FIG. 1 illustrates the sequence diagram of the system of the present invention.
FIG. 2 illustrates the workflow diagram of the system of the present invention.
FIG. 3 illustrates the component interactions of the system of the present invention.
DETAILED DESCRIPTION OF THE INVENTION:
Before the present invention is described, it is to be understood that this invention is not limited to methodologies described, as these may vary as per the person skilled in the art. It is also to be understood that the terminology used in the description is for the purpose of describing the particular embodiments only and is not intended to limit the scope of the present invention. Throughout this specification, the word “comprise”, or variations such as “comprises” or “comprising”, will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps. The use of the expression “at least” or “at least one” suggests the use of one or more elements or ingredients or quantities, as the use may be in the embodiment of the invention to achieve one or more of the desired objects or results. Various embodiments of the present invention are described below. It is, however, noted that the present invention is not limited to these embodiments, but rather the intention is that modifications that are apparent are also included.
The present invention describes a system and method for discovery, monitoring, and utilization of endpoint ecosystems for dynamic tool generation in LLM-based inference. The system provides for continuously discovering, analyzing, and updating a comprehensive knowledge graph of endpoint components (including applications, packages, tools, web-links, and APIs) based on system activity, network behavior, and installation metadata. It dynamically maps relationships and monitors changes to intelligently determine and generate the most appropriate endpoints as tools for task execution by a local or remote large language model (LLM). The system uses hybrid telemetry, semantic context extraction, graph reasoning, and multi-level orchestration to ensure full utilization of the endpoint ecosystem during inferencing.
According to the embodiment of the present invention, as described in FIG. 1, the system comprises an input unit, a processing unit, and an output unit, wherein the processing unit further comprises an Endpoint Discovery Engine module, an Activity Telemetry Aggregator module, a Web Request Profiler module, a Knowledge Graph Constructor module, a Change Detection & Delta Engine module, a Semantic Intent Analyzer module, a Tool Orchestration Engine module, and an LLM Tool Adapter module.
According to the embodiment of the present invention, Endpoint Discovery Engine module is configured to identify and catalog available executables, services, and extensions across diverse operating environments. The module employs native operating system APIs, command-line utilities, and package management frameworks such as apt, brew, npm, and pip to enumerate installed artifacts. It uses a hybrid static-dynamic strategy. In the static mode, this module parses file system hierarchies, registry entries, and application manifests to establish a baseline inventory. In the dynamic mode, the module monitors process creation events, software installation logs, and plugin load sequences. A unique fingerprinting methodology is applied, wherein the engine matches known patterns such as shell aliases and integrated development environment (IDE) plugins, and further applies heuristics to detect lightweight or web-based tools that are not traditionally registered within the operating system.
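By way of a non-limiting illustration, the static discovery mode may be sketched as follows; the PATH-based lookup and the pip query stand in for the fuller set of scanners (registries, manifests, apt, brew, npm) described above, and all tool names are illustrative:

```python
import shutil
import subprocess

def discover_static(candidates):
    """Static mode: resolve candidate tool names against the PATH,
    producing a baseline inventory of available executables. A full
    engine would also parse file system hierarchies, registry entries,
    and application manifests."""
    inventory = {}
    for name in candidates:
        path = shutil.which(name)
        if path:
            inventory[name] = {"path": path, "source": "path-scan"}
    return inventory

def discover_pip_packages():
    """Query one package manager (pip) for installed artifacts; other
    managers (apt, brew, npm) would be queried analogously."""
    out = subprocess.run(["pip", "list", "--format=freeze"],
                         capture_output=True, text=True)
    return [line.split("==")[0]
            for line in out.stdout.splitlines() if "==" in line]
```

The dynamic mode (process creation events, installation logs, plugin loads) would feed the same inventory structure incrementally rather than in one pass.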
According to the embodiment of the present invention, the Activity Telemetry Aggregator module is provided for capturing runtime behavioral data from diverse sources. In Windows environments, the subsystem integrates with Event Logs, Sysmon, and Task Manager, while in macOS environments, it interfaces with Activity Monitor. In UNIX and Linux systems, the module draws upon /proc/, lsof, netstat, and auditd logs. The captured data includes process-level attributes such as CPU and memory consumption, co-process invocation, parent–child process relationships, execution duration, and input-output activity patterns. A unique feature of the module is the generation of time-weighted rolling graphs, which distinctly capture short-term and long-term behavior patterns, thereby providing both transient and persistent operational signatures.
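The time-weighted rolling behaviour may, for example, be realized with an exponential-decay weight; the half-life parameter and the (timestamp, cpu_seconds) event shape below are assumptions for illustration, with one small half-life capturing short-term and one large half-life capturing long-term signatures:

```python
import math
import time

def time_weighted_usage(events, now=None, half_life_s=3600.0):
    """Collapse (timestamp, cpu_seconds) telemetry events into one usage
    weight in which recent activity dominates. Keeping two such scores,
    one with a short half-life and one with a long half-life, separates
    transient from persistent operational signatures."""
    now = time.time() if now is None else now
    score = 0.0
    for ts, cpu in events:
        age = max(0.0, now - ts)
        score += cpu * math.exp(-math.log(2) * age / half_life_s)
    return score
```

An event exactly one half-life old contributes half its raw weight, so stale activity fades out of the rolling graph without ever being explicitly evicted.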
According to the embodiment of the present invention, the system further incorporates a Web Request Profiler module configured to intercept and associate outbound network requests with their originating processes. The module captures request-level metadata, including REST endpoints, authentication headers, payload schemas, and timing information. Multiple inspection techniques are employed, including proxy-based traffic inspection, kernel-level hooks implemented via eBPF or WinPcap, and parsing of DNS logs and HTTP Archive (HAR) files. Each request is semantically tagged to reflect its operational purpose, such as data retrieval, authentication, or configuration loading.
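One of the inspection paths named above, HAR-file parsing with semantic tagging, may be sketched as follows; the keyword rules are illustrative placeholders for the richer tagging described in this embodiment:

```python
import json

# Illustrative keyword heuristics; a production profiler would apply
# richer classification to assign operational-purpose tags.
TAG_RULES = {
    "authentication": ("login", "oauth", "token", "auth"),
    "data-retrieval": ("search", "query", "fetch"),
    "configuration": ("config", "settings", "manifest"),
}

def tag_request(url):
    """Assign a coarse operational-purpose tag to one request URL."""
    lowered = url.lower()
    for tag, keywords in TAG_RULES.items():
        if any(k in lowered for k in keywords):
            return tag
    return "other"

def profile_har(har_text):
    """Extract request-level metadata from an HTTP Archive (HAR) document."""
    har = json.loads(har_text)
    records = []
    for entry in har["log"]["entries"]:
        req = entry["request"]
        records.append({
            "url": req["url"],
            "method": req["method"],
            "tag": tag_request(req["url"]),
        })
    return records
```

Proxy-based interception and eBPF hooks would feed the same record shape, with the originating process identifier attached from the kernel side.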
According to the embodiment of the present invention, the Knowledge Graph Constructor module builds a Knowledge Graph, wherein nodes represent applications, tools, scripts, web endpoints, and packages, and edges represent relationships such as dependency, co-usage, functional similarity, and plugin association. The Knowledge Graph is further enriched with metadata, including version identifiers, operating system compatibility, access rights, and frequency of usage. In addition, the system applies natural language processing to documentation sources such as readme files, configuration scripts, and online manuals to generate semantic tags. Temporal graphing mechanisms are employed to visualize changes in the environment over time, enabling longitudinal analysis.
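A minimal in-memory sketch of the graph structure, assuming an adjacency-list representation (the relation names and metadata fields are illustrative; persistence and temporal overlays are omitted):

```python
class KnowledgeGraph:
    """Nodes carry metadata (version, OS compatibility, usage frequency,
    semantic tags); typed edges capture relationships such as
    'depends-on', 'co-used-with', or 'plugin-of'."""

    def __init__(self):
        self.nodes = {}   # name -> metadata dict
        self.edges = {}   # name -> list of (neighbour, relation)

    def add_node(self, name, **metadata):
        self.nodes.setdefault(name, {}).update(metadata)
        self.edges.setdefault(name, [])

    def add_edge(self, src, dst, relation):
        self.add_node(src)
        self.add_node(dst)
        self.edges[src].append((dst, relation))

    def neighbours(self, name, relation=None):
        return [n for n, r in self.edges.get(name, [])
                if relation is None or r == relation]
```

For example, `kg.add_edge("curl", "libssl", "depends-on")` records a dependency edge that later orchestration queries can traverse by relation type.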
According to the embodiment of the present invention, the Change Detection & Delta Engine module applies snapshot hashing and file system watchers to detect modifications within the computing environment. Such modifications include installation or removal of packages, binary updates, and alterations in script files. A novel delta propagation strategy is employed, wherein only affected graph regions are updated rather than re-generating the entire graph, thereby improving efficiency. Furthermore, redundancy in change records is eliminated by compressing them into vectorized change representations.
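The snapshot-hashing step may be sketched as follows; for brevity the sketch hashes an in-memory `{path: bytes}` map rather than walking a real file system, and watcher integration is assumed:

```python
import hashlib

def snapshot(files):
    """Map each path to a content hash; `files` is {path: bytes}. A real
    engine would walk the file system and rely on watchers to trigger
    re-hashing only of touched paths."""
    return {p: hashlib.sha256(data).hexdigest() for p, data in files.items()}

def delta(old, new):
    """Compare two snapshots and report only the affected entries, so the
    knowledge graph can be updated incrementally rather than rebuilt."""
    return {
        "added":    [p for p in new if p not in old],
        "removed":  [p for p in old if p not in new],
        "modified": [p for p in new if p in old and new[p] != old[p]],
    }
```

Only the paths named in the delta need their corresponding graph regions revisited, which is the core of the delta propagation strategy described above.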
According to the embodiment of the present invention, the Semantic Intent Analyzer module parses the user input to extract task intent. The module provides for mapping user-provided natural language input into actionable task representations. The analyzer leverages fine-tuned transformer-based machine learning models to classify inputs into intent categories, such as data compression, simulation execution, or network measurement. Contextual reasoning is applied to link the detected intent to relevant nodes within the Knowledge Graph. For example, a user input such as “check internet speed” may be automatically mapped to tools such as speedtest-cli, curl, or browser-based portals.
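The intent-to-tool mapping can be illustrated with a simple token-overlap similarity standing in for the transformer-based model described above; the tool names and descriptions are illustrative:

```python
def jaccard(a, b):
    """Token-overlap similarity; a stand-in for the embedding-based
    semantic similarity a fine-tuned transformer would provide."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def match_intent(prompt, tool_descriptions):
    """Map a natural-language prompt to the best-matching graph nodes,
    ordered by similarity, dropping tools with no overlap at all."""
    scored = [(jaccard(prompt, desc), tool)
              for tool, desc in tool_descriptions.items()]
    scored.sort(reverse=True)
    return [tool for score, tool in scored if score > 0]

tools = {
    "speedtest-cli": "measure internet connection speed from the command line",
    "gzip": "compress files for storage",
}
```

With these descriptions, a prompt like "check my internet speed" overlaps only with the speedtest-cli entry, mirroring the example given in the text.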
According to the embodiment of the present invention, the Tool Orchestration Engine module queries the Knowledge Graph to identify optimal execution endpoints. Candidate tools are scored based on a composite metric incorporating semantic intent matching, historical usage weight, and recency of invocation. The engine further generates invocation schemas adapted to the tool modality, including command-line wrappers, API contracts, or macro-driven graphical automation. A unique redundancy-avoidance strategy is employed wherein tools belonging to similar functional families are clustered, and duplicate endpoints are suppressed.
According to the embodiment of the present invention, the Large Language Model Tool Adapter module transforms the selected tool invocation into a format directly consumable by large language models. This includes the generation of structured schemas comprising function identifiers, argument definitions, descriptive metadata, output parsing instructions, and invocation strategies, which may include direct execution, remote procedure call, HTTP invocation, or event triggering. The formatted schema is transmitted to the large language model alongside the user prompt, thereby enabling automated, context-aware orchestration of computing tools.
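A sketch of the adapter output, assuming the widely used OpenAI-style function-definition envelope as the target format (the envelope, argument shapes, and tool name below are illustrative; any function-calling or LangChain-compatible format could be emitted the same way):

```python
import json

def adapt_tool(name, description, args):
    """Render a selected tool as a structured function definition with a
    function identifier, argument definitions, and type constraints.
    `args` maps argument name -> (json_type, description)."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": {
                    a: {"type": t, "description": d}
                    for a, (t, d) in args.items()
                },
                "required": list(args),
            },
        },
    }
```

The resulting dictionary serializes directly to JSON for transmission to the model alongside the user prompt; output-parsing instructions and the invocation strategy would be attached as additional fields in the same envelope.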
According to the embodiment of the present invention, as described in FIG.2, the method for discovery, monitoring, and utilization of endpoint ecosystems for dynamic tool generation in LLM-based inference comprises the steps of
• initializing a system environment to prepare components for endpoint discovery and telemetry collection;
• executing the Endpoint Discovery Engine to identify and enumerate executables, services, packages, and extensions within a computing environment using static inspection of file systems, registries, and manifests, and dynamic monitoring of process creation, installation events, and plugin loads;
• aggregating telemetry data by means of the Activity Telemetry Aggregator to capture runtime signals including process activity, resource utilization, execution duration, and parent–child process relationships from operating system logs and monitoring interfaces;
• profiling web requests by means of Web Request Profiler to intercept outbound traffic, associate such traffic with originating processes, and extract request-level metadata including endpoints, authentication headers, payload schemas, and timing information;
• constructing the Knowledge Graph comprising nodes representing applications, tools, scripts, or web endpoints and edges representing dependencies, co-usage, or functional similarity, the graph further enriched with metadata, temporal overlays, and semantic tags;
• detecting changes in the environment by means of the Change Detection and Delta Engine, including installation, removal, update, or modification of tools, and propagating incremental updates to the knowledge graph using snapshot hashing and delta compression;
• analysing semantic intent by means of Semantic Intent Analyzer to receive natural language input, extract task intent using machine learning models, and map said intent to candidate nodes of the knowledge graph;
• orchestrating tools by means of the Tool Orchestration Engine to query the knowledge graph, identify candidate tools corresponding to the detected intent, rank said tools using a scoring function comprising similarity, usage weight, availability, and graph centrality, and generate an execution schema;
• adapting tool schemas by means of the Large Language Model Tool Adapter to convert the execution schema into a structured representation comprising function name, arguments, output format, and invocation strategy, suitable for consumption by a large language model;
• providing the large language model with the user prompt and adapted tool schema to enable generation of execution commands;
• executing the task by invoking the selected tool or endpoint according to the generated execution command; and
• returning a response to the user corresponding to the execution result.
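The steps above can be summarized as one pass of a minimal pipeline sketch; the `graph`, `llm`, and `executor` objects and their method names are hypothetical stand-ins for the modules enumerated in the method, not a prescribed interface:

```python
def run_inference_cycle(prompt, graph, llm, executor):
    """One pass of the method: intent extraction, knowledge-graph lookup,
    ranking, schema adaptation, LLM-driven command generation, and
    execution of the selected tool."""
    intent = graph.analyze_intent(prompt)           # Semantic Intent Analyzer
    candidates = graph.candidates(intent)           # Knowledge Graph query
    ranked = sorted(candidates, key=graph.score, reverse=True)
    schema = graph.adapt(ranked[0])                 # LLM Tool Adapter
    command = llm.generate(prompt, tools=[schema])  # LLM emits execution command
    result = executor.invoke(command)               # run the selected endpoint
    return result                                   # response back to the user
```

Discovery, telemetry aggregation, and change detection run continuously in the background so that `graph` is already current when a prompt arrives.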
According to the embodiment of the present invention, as described in FIG. 3, the system employs a multi-layer graph data structure to represent and maintain the evolving ecosystem of applications, tools, and user interactions. The graph comprises three distinct layers:
• Core Functional Graph – This layer represents the baseline functional relationships between applications, executables, packages, and web endpoints. Edges capture dependencies, plugin relationships, and co-usage patterns;
• Temporal Graph Overlay – This layer maintains temporal metadata describing how the functional graph evolves over time. Changes such as tool installation, binary updates, or configuration alterations are captured in a time-stamped overlay, enabling longitudinal analysis of environment drift;
• User Affinity Graph – This layer encodes personalized interaction data, including user preferences, frequency of usage, and contextual associations. It refines tool selection by incorporating behavioral affinity scores, thereby aligning recommendations with individual usage patterns.
The three-layer approach ensures that the graph maintains both structural integrity and contextual richness, supporting efficient query execution for downstream orchestration.
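The three layers may be held in one structure, sketched below with illustrative field shapes (edge lists, a time-stamped change log, and per-user affinity scores; none of these shapes are prescribed by the embodiment):

```python
from dataclasses import dataclass, field

@dataclass
class MultiLayerGraph:
    """Sketch of the three-layer structure: a core functional layer, a
    time-stamped overlay of environment changes, and a per-user
    affinity layer."""
    core_edges: dict = field(default_factory=dict)    # node -> [(node, relation)]
    temporal_log: list = field(default_factory=list)  # (timestamp, node, event)
    user_affinity: dict = field(default_factory=dict) # (user, node) -> score

    def record_change(self, ts, node, event):
        """Temporal overlay: append a time-stamped environment change."""
        self.temporal_log.append((ts, node, event))

    def affinity(self, user, node):
        """User layer: behavioural affinity score, defaulting to zero."""
        return self.user_affinity.get((user, node), 0.0)
```

A query for tool candidates would traverse `core_edges`, bias the ranking by `affinity`, and consult `temporal_log` to discount tools recently removed or altered.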
According to the embodiment of the present invention, to identify associations between tasks and processes, the system applies a weighted bipartite graph model, wherein one partition represents processes and the other represents user or system tasks. On this bipartite graph, the system applies spectral clustering techniques to partition nodes into clusters representing strong associations. For enabling interoperability with large language models, the system incorporates a mechanism for dynamically generating tool schemas. For each candidate endpoint identified in the Knowledge Graph, the following operations are performed:
• Extract Interface – Metadata and signatures associated with the endpoint are collected from manifests, documentation, or binary introspection;
• Parse Usage Logs – Historical usage logs are analyzed to capture argument patterns, invocation styles, and typical parameter values;
• Infer Argument Types – Machine learning heuristics are applied to determine the expected data types (e.g., integer, string, file path) for each argument; and
• Generate JSON Schema – Based on the above information, a structured schema is produced in JSON format, encapsulating function name, arguments, type constraints, and expected output descriptors.
This dynamic schema generation allows the system to continuously adapt to newly discovered endpoints without requiring manual integration. The system then applies a composite scoring function to rank candidate tools against extracted user intent. The score for a given tool is defined as:
Score(Tool)=SIM(Intent, Tool.Description)×UsageScore×Availability×GraphCentrality
Where:
• SIM(Intent, Tool.Description) represents the semantic similarity between the user’s expressed intent and the tool’s descriptive metadata;
• UsageScore reflects historical usage frequency and success rate;
• Availability denotes current accessibility of the tool within the system environment; and
• GraphCentrality represents the structural importance of the tool within the Knowledge Graph, emphasizing tools that are highly connected and widely referenced.
This scoring function ensures that tool selection balances semantic accuracy, practical availability, and structural relevance within the environment.
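The multiplicative form of the score can be sketched directly; note that because the factors multiply, an unavailable tool (Availability = 0) is eliminated outright, and a weak semantic match cannot be rescued by high usage alone:

```python
def score_tool(sim, usage_score, available, centrality):
    """Composite score per the formula above:
    Score = SIM x UsageScore x Availability x GraphCentrality,
    with Availability treated as a 0/1 gate."""
    return sim * usage_score * (1.0 if available else 0.0) * centrality
```

For example, a tool with similarity 0.8, usage score 0.5, currently available, and centrality 0.9 scores 0.36, while the same tool scores 0 the moment it becomes unavailable.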
According to the embodiment of the present invention, the system of the present invention is applicable across a wide variety of computing environments and user scenarios. Representative use cases include, but are not limited to, the following:
• Developer Assistance – A developer-facing agent may employ the system to automatically select and configure local test tools based on project type and context;
• IT Administration – An information technology assistant may invoke critical utilities such as virtual private networks (VPN), synchronization services, or authentication mechanisms in response to user commands or policy triggers;
• Knowledge Workflows – A knowledge worker may employ the system to coordinate research pipelines consisting of multiple scripts, data processing tools, and visualization utilities. The system automatically discovers, chains, and executes the appropriate resources, thereby reducing manual integration effort; and
• LLM-Enhanced Task Runners – The system integrates with large language models to provide enhanced task execution capabilities.
The present invention offers several distinct advantages over conventional approaches:
• Unlike static registries that require manual curation and periodic updates, the disclosed system maintains a self-updating registry of executables, services, and extensions, derived through continuous endpoint discovery.
• The system incorporates delta monitoring and temporal graph overlays to maintain an up-to-date representation of the computing environment. This enables immediate recognition of newly installed tools, updated binaries, or removed services.
• The invention uniquely integrates discovery across operating system artifacts, web endpoints, and package ecosystems. This holistic coverage ensures that both native applications and lightweight, web-based utilities are included in the orchestration process.
• Tool selection and orchestration are performed through a knowledge graph that captures functional relationships, dependencies, and user affinities. This graph-based representation allows for context-aware reasoning and optimized toolchain formation.
• Through dynamic schema generation and LLM tool adapters, the system is capable of synthesizing new, executable tool definitions in real time, thereby reducing integration overhead and expanding applicability to novel endpoints.
Claims:
We claim,
1. A system and method for discovery, monitoring, and utilization of endpoint ecosystems for dynamic tool generation in LLM-based inference
characterised in that
the system dynamically maps relationships and monitors changes to intelligently determine and generate the most appropriate endpoints as tools for task execution by a local or remote large language model;
the system comprises an input unit, a processing unit, and an output unit, wherein the processing unit further comprises an Endpoint Discovery Engine module, an Activity Telemetry Aggregator module, a Web Request Profiler module, a Knowledge Graph Constructor module, a Change Detection and Delta Engine module, a Semantic Intent Analyzer module, a Tool Orchestration Engine module, and an LLM Tool Adapter module;
the method for discovery, monitoring, and utilization of endpoint ecosystems for dynamic tool generation in LLM-based inference comprises the steps of
• initializing a system environment to prepare components for endpoint discovery and telemetry collection;
• executing the Endpoint Discovery Engine to identify and enumerate executables, services, packages, and extensions within a computing environment using static inspection of file systems, registries, and manifests, and dynamic monitoring of process creation, installation events, and plugin loads;
• aggregating telemetry data by means of the Activity Telemetry Aggregator to capture runtime signals including process activity, resource utilization, execution duration, and parent–child process relationships from operating system logs and monitoring interfaces;
• profiling web requests by means of Web Request Profiler to intercept outbound traffic, associate such traffic with originating processes, and extract request-level metadata including endpoints, authentication headers, payload schemas, and timing information;
• constructing the Knowledge Graph comprising nodes representing applications, tools, scripts, or web endpoints and edges representing dependencies, co-usage, or functional similarity, the graph further enriched with metadata, temporal overlays, and semantic tags;
• detecting changes in the environment by means of the Change Detection and Delta Engine, including installation, removal, update, or modification of tools, and propagating incremental updates to the knowledge graph using snapshot hashing and delta compression;
• analysing semantic intent by means of Semantic Intent Analyzer to receive natural language input, extract task intent using machine learning models, and map said intent to candidate nodes of the knowledge graph;
• orchestrating tools by means of the Tool Orchestration Engine to query the knowledge graph, identify candidate tools corresponding to the detected intent, rank said tools using a scoring function comprising similarity, usage weight, availability, and graph centrality, and generate an execution schema;
• adapting tool schemas by means of the Large Language Model Tool Adapter to convert the execution schema into a structured representation comprising function name, arguments, output format, and invocation strategy, suitable for consumption by a large language model;
• providing the large language model with the user prompt and adapted tool schema to enable generation of execution commands;
• executing the task by invoking the selected tool or endpoint according to the generated execution command; and
• returning a response to the user corresponding to the execution result.
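The overall control flow recited in the steps of claim 1 above may be illustrated by the following minimal sketch. All function names, the stubbed endpoint list, and the keyword-based intent matching are illustrative placeholders, not the claimed implementation (which, per claim 7, uses fine-tuned transformer models).

```python
# Illustrative sketch of the claimed method's control flow.
# Endpoint data and the intent classifier are stand-in stubs.

def discover_endpoints():
    # Step: Endpoint Discovery Engine — enumerate known tools (stubbed).
    return [{"name": "gzip", "kind": "cli", "tags": ["data compression"]},
            {"name": "curl", "kind": "cli", "tags": ["network measurement"]}]

def extract_intent(prompt):
    # Step: Semantic Intent Analyzer — naive keyword matching stands in
    # for the fine-tuned transformer classifier of claim 7.
    if "compress" in prompt.lower():
        return "data compression"
    return "network measurement"

def orchestrate(intent, endpoints):
    # Step: Tool Orchestration Engine — select an endpoint whose
    # semantic tags match the detected intent.
    for ep in endpoints:
        if intent in ep["tags"]:
            return ep
    return None

def run_pipeline(prompt):
    endpoints = discover_endpoints()
    intent = extract_intent(prompt)
    tool = orchestrate(intent, endpoints)
    return {"intent": intent, "tool": tool["name"] if tool else None}

print(run_pipeline("please compress these log files"))
# → {'intent': 'data compression', 'tool': 'gzip'}
```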
2. The system and method as claimed in claim 1, wherein Endpoint Discovery Engine module is configured to identify and catalog available executables, services, and extensions across diverse operating environments and it employs native operating system APIs, command-line utilities, and package management frameworks to enumerate installed artifacts.
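As a simplified illustration of the static-inspection pass of claim 2, the sketch below enumerates executable files on the shell search path. Real embodiments would additionally query native OS APIs and package management frameworks; this Python fragment is an assumption-laden stand-in, not the disclosed engine.

```python
import os

def enumerate_path_executables(path_env=None):
    """List executable files reachable on the PATH.

    A simplified stand-in for the Endpoint Discovery Engine's static
    inspection of file systems (claim 2); package managers and dynamic
    process monitoring are out of scope for this sketch."""
    path_env = path_env if path_env is not None else os.environ.get("PATH", "")
    found = {}
    for directory in path_env.split(os.pathsep):
        if not os.path.isdir(directory):
            continue
        for entry in os.listdir(directory):
            full = os.path.join(directory, entry)
            if os.path.isfile(full) and os.access(full, os.X_OK):
                found.setdefault(entry, full)  # first hit wins, like shell lookup
    return found
```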
3. The system and method as claimed in claim 1, wherein the Activity Telemetry Aggregator module is provided for capturing runtime behavioral data from diverse sources and generates time-weighted rolling graphs, which distinctly capture short-term and long-term behavior patterns, thereby providing both transient and persistent operational signatures.
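One common way to realize the time-weighted rolling view of claim 3 is exponential decay over invocation timestamps, where the half-life separates the transient from the persistent signature. The formula and default parameters below are an illustrative assumption, not the claimed weighting scheme.

```python
import math

def decayed_usage_weight(events, now, half_life):
    """Time-weighted usage score: recent invocations count more.

    `events` is a list of invocation timestamps (seconds). A small
    half-life yields the short-term (transient) signature; a large one
    yields the long-term (persistent) signature. Illustrative only."""
    lam = math.log(2) / half_life          # decay rate from the half-life
    return sum(math.exp(-lam * (now - t)) for t in events)

events = [0, 90, 100]                      # three invocations
short_term = decayed_usage_weight(events, now=100, half_life=60)
long_term = decayed_usage_weight(events, now=100, half_life=3600)
# The old invocation at t=0 barely contributes to the short-term score,
# so short_term < long_term for the same event history.
```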
4. The system and method as claimed in claim 1, wherein the Web Request Profiler module is configured to intercept outbound network requests, associate them with their originating processes, capture request-level metadata, including REST endpoints, authentication headers, payload schemas, and timing information, and employ multiple inspection techniques.
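The metadata-extraction step of claim 4 can be sketched as below. The flat record format is a hypothetical output of some interception layer; the actual capture mechanisms (proxying, OS hooks, and other inspection techniques) are outside the scope of this fragment.

```python
def profile_request(record):
    """Extract request-level metadata (claim 4) from a captured record.

    `record` is a hypothetical dict produced upstream by an
    interception layer; its field names are assumptions."""
    auth = record.get("headers", {}).get("Authorization", "")
    return {
        "process": record["pid"],                         # originating process
        "endpoint": record["method"] + " " + record["url"],
        "auth_scheme": auth.split(" ")[0] or None,        # e.g. 'Bearer'
        "latency_ms": record["t_end"] - record["t_start"],
    }

sample = {"pid": 4312, "method": "POST",
          "url": "https://api.example.com/v1/jobs",
          "headers": {"Authorization": "Bearer abc123"},
          "t_start": 100, "t_end": 240}
meta = profile_request(sample)
```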
5. The system and method as claimed in claim 1, wherein the Knowledge Graph Constructor module builds a Knowledge Graph in which nodes represent applications, tools, scripts, web endpoints, and packages, and edges represent relationships such as dependency, co-usage, functional similarity, and plugin association, and wherein the graph is enriched with metadata, including version identifiers, operating system compatibility, access rights, and frequency of usage.
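A toy node/edge store mirroring the structure of claim 5 follows; the class name, storage layout, and sample tools are illustrative assumptions rather than the disclosed constructor.

```python
class KnowledgeGraph:
    """Minimal node/edge store in the spirit of claim 5.

    Nodes are tools/apps/endpoints with metadata; edges carry a
    relationship type such as 'dependency' or 'co-usage'."""

    def __init__(self):
        self.nodes = {}   # name -> metadata dict
        self.edges = []   # (src, dst, relation) triples

    def add_node(self, name, **metadata):
        self.nodes[name] = metadata

    def add_edge(self, src, dst, relation):
        self.edges.append((src, dst, relation))

    def neighbours(self, name, relation=None):
        # Outgoing neighbours, optionally filtered by relation type.
        return [d for s, d, r in self.edges
                if s == name and (relation is None or r == relation)]

kg = KnowledgeGraph()
kg.add_node("ffmpeg", version="6.1", os="linux", usage_freq=42)
kg.add_node("yt-dlp", version="2024.1", os="linux", usage_freq=17)
kg.add_edge("yt-dlp", "ffmpeg", "dependency")
```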
6. The system and method as claimed in claim 1, wherein the Change Detection and Delta Engine module applies snapshot hashing and file-system watchers to detect modifications within the computing environment, such as installation or removal of packages, binary updates, and alterations in script files, and employs a novel delta propagation strategy wherein only affected graph regions are updated rather than regenerating the entire graph, thereby improving efficiency.
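The snapshot-hashing and delta step of claim 6 can be sketched as follows. Artifacts are represented here as an in-memory path-to-bytes mapping for self-containment; a real engine would hash files on disk and feed the resulting delta to incremental graph updates.

```python
import hashlib

def snapshot(files):
    """Hash each artifact; (path, digest) pairs form one snapshot."""
    return {path: hashlib.sha256(content).hexdigest()
            for path, content in files.items()}

def delta(old, new):
    """Classify changes between two snapshots (claim 6's delta step):
    only the returned paths need re-processing in the graph."""
    added = [p for p in new if p not in old]
    removed = [p for p in old if p not in new]
    modified = [p for p in new if p in old and new[p] != old[p]]
    return {"added": added, "removed": removed, "modified": modified}

before = snapshot({"/usr/bin/tool": b"v1", "/opt/script.py": b"print(1)"})
after = snapshot({"/usr/bin/tool": b"v2", "/opt/new.sh": b"echo hi"})
changes = delta(before, after)
```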
7. The system and method as claimed in claim 1, wherein the Semantic Intent Analyzer module parses user input to extract task intent, mapping user-provided natural language input into actionable task representations by leveraging fine-tuned transformer-based machine learning models to classify inputs into intent categories, such as data compression, simulation execution, or network measurement.
8. The system and method as claimed in claim 1, wherein the Tool Orchestration Engine module queries the Knowledge Graph to identify optimal execution endpoints; candidate tools are scored based on a composite metric incorporating semantic intent matching, historical usage weight, and recency of invocation; and the engine further generates invocation schemas adapted to the tool modality, including command-line wrappers, API contracts, or macro-driven graphical automation, such that tools belonging to similar functional families are clustered and duplicate endpoints are suppressed.
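The composite ranking of claims 1 and 8 can be sketched as a weighted blend of the recited factors. The factor names follow the claim language (similarity, usage weight, availability, graph centrality); the linear combination and the default weights are illustrative assumptions, not the claimed scoring function.

```python
def score_tool(candidate, weights=None):
    """Composite ranking metric sketch (claims 1 and 8).

    Each factor is assumed pre-normalized to [0, 1]; the default
    weights are arbitrary illustrative values."""
    w = weights or {"similarity": 0.4, "usage": 0.3,
                    "availability": 0.2, "centrality": 0.1}
    return (w["similarity"] * candidate["similarity"]
            + w["usage"] * candidate["usage"]
            + w["availability"] * candidate["availability"]
            + w["centrality"] * candidate["centrality"])

candidates = [
    {"name": "7zip", "similarity": 0.9, "usage": 0.2,
     "availability": 1.0, "centrality": 0.3},
    {"name": "gzip", "similarity": 0.8, "usage": 0.9,
     "availability": 1.0, "centrality": 0.6},
]
best = max(candidates, key=score_tool)  # highest composite score wins
```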
9. The system and method as claimed in claim 1, wherein the Large Language Model Tool Adapter module transforms the selected tool invocation into a format directly consumable by large language models, including the generation of structured schemas comprising function identifiers, argument definitions, descriptive metadata, output parsing instructions, and invocation strategies, which may include direct execution, remote procedure call, HTTP invocation, or event triggering.
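A schema-generation step in the spirit of claim 9 is sketched below. The emitted JSON shape loosely follows widely used LLM function-calling conventions (of the kind mentioned in the background); the exact schema of the disclosed adapter, and the `adapt_tool` helper itself, are assumptions for illustration.

```python
import json

def adapt_tool(name, description, params, strategy="direct_execution"):
    """Emit a structured tool definition (claim 9): function identifier,
    argument definitions, metadata, and an invocation strategy."""
    return {
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": {p: {"type": t, "description": d}
                               for p, (t, d) in params.items()},
                "required": list(params),
            },
        },
        "invocation_strategy": strategy,  # e.g. RPC, HTTP, event trigger
    }

schema = adapt_tool(
    "compress_files",
    "Compress a list of files with gzip",
    {"paths": ("array", "Files to compress"),
     "level": ("integer", "Compression level 1-9")},
)
print(json.dumps(schema, indent=2))
```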
10. The system and method as claimed in claim 1, wherein the system employs a multi-layer graph data structure to represent and maintain the evolving ecosystem of applications, tools, and user interactions, the graph comprising three distinct layers, namely a Core Functional Graph, a Temporal Graph Overlay, and a User Affinity Graph, ensuring that the graph maintains both structural integrity and contextual richness and supports efficient query execution for downstream orchestration.
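How the three layers of claim 10 might be blended at query time is sketched below, with each layer reduced to a per-tool relevance score. The equal weighting and the example scores are illustrative assumptions; the disclosed system's layer semantics are richer than scalar scores.

```python
def layered_score(tool, core, temporal, affinity):
    """Blend the three graph layers of claim 10 when ranking a tool:
    core functional relevance, temporal (recency) overlay, and user
    affinity. Equal weighting is an illustrative assumption."""
    return (core.get(tool, 0.0)
            + temporal.get(tool, 0.0)
            + affinity.get(tool, 0.0)) / 3.0

# Example per-layer scores for two document-conversion tools.
core = {"pandoc": 0.9, "libreoffice": 0.7}       # functional fit
temporal = {"pandoc": 0.2, "libreoffice": 0.8}   # libreoffice used recently
affinity = {"pandoc": 0.9, "libreoffice": 0.4}   # user prefers pandoc

ranked = sorted(core,
                key=lambda t: layered_score(t, core, temporal, affinity),
                reverse=True)
```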
| # | Name | Date |
|---|---|---|
| 1 | 202521082259-STATEMENT OF UNDERTAKING (FORM 3) [29-08-2025(online)].pdf | 2025-08-29 |
| 2 | 202521082259-POWER OF AUTHORITY [29-08-2025(online)].pdf | 2025-08-29 |
| 3 | 202521082259-FORM 1 [29-08-2025(online)].pdf | 2025-08-29 |
| 4 | 202521082259-FIGURE OF ABSTRACT [29-08-2025(online)].pdf | 2025-08-29 |
| 5 | 202521082259-DRAWINGS [29-08-2025(online)].pdf | 2025-08-29 |
| 6 | 202521082259-DECLARATION OF INVENTORSHIP (FORM 5) [29-08-2025(online)].pdf | 2025-08-29 |
| 7 | 202521082259-COMPLETE SPECIFICATION [29-08-2025(online)].pdf | 2025-08-29 |
| 8 | Abstract.jpg | 2025-09-20 |
| 9 | 202521082259-FORM-9 [26-09-2025(online)].pdf | 2025-09-26 |
| 10 | 202521082259-FORM 18 [01-10-2025(online)].pdf | 2025-10-01 |