
System And Method For Workflow Orchestration Using Tools, Model Compute Platform And Large Language Models

Abstract: The present invention discloses a system (10) and method for workflow orchestration using tools, model-compute-platforms (MCPs), and large language models (LLMs); wherein the system comprises an input unit (1), a processing unit (2), and an output unit (3). The processing unit (2) includes multiple components: an input handler (21) to parse natural language queries, a historical context engine (22) to infer domain and intent using prior data, a subquery generator (23) to decompose queries, a workflow generator (24) to generate stepwise logic, and a task planner and enricher (25) to enrich the steps with metadata. A graph orchestrator (26) builds a directed acyclic graph (DAG), while a pathfinder module (27) provides fallback paths for failure recovery. The execution engine (28) performs node-by-node execution using LLMs, tools, and MCPs. Results are compiled and formatted by a result compiler (29) and presented through the output unit (3). The system enables adaptive, robust, and intelligent multi-step workflow execution with domain-specific reasoning.


Patent Information

Application #
Filing Date
11 July 2025
Publication Number
41/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

Persistent Systems
Bhageerath, 402, Senapati Bapat Rd, Shivaji Cooperative Housing Society, Gokhale Nagar, Pune - 411016, Maharashtra, India.

Inventors

1. Mr. Nitish Shrivastava
10764 Farallone Dr, Cupertino, CA 95014-4453, United States.
2. Mr. Pradeep Sharma
20200 Lucille Ave Apt 62 Cupertino CA 95014, United States.

Specification

Description:FIELD OF INVENTION
The present invention relates to software development, automation and workflow orchestration. More specifically, it pertains to a system and method for workflow orchestration using tools, a model-compute-platform (MCP) and large language models (LLMs).

BACKGROUND OF THE INVENTION
Workflow orchestration is crucial for improving efficiency, reducing errors and enhancing collaboration across various processes, particularly in complex and data-intensive environments. Rather than merely automating individual tasks, workflow orchestration creates a connected ecosystem in which these automated tasks interact efficiently, follow a logical sequence and integrate with other systems to achieve an end-to-end business process. By coordinating tasks such as data processing, notifications, approvals and system updates, successful workflow orchestration reduces errors and optimizes operations.
Conventional workflow orchestration systems rely on predefined rule-based logic and fixed tool integrations, often failing to adapt dynamically to evolving queries and contexts. Moreover, they lack the ability to reason through abstract, high-level queries or to use historical context to generate personalized and optimized workflows. Recent advancements in LLMs have shown great promise in understanding and generating human-like instructions, but a gap remains in translating high-level user intent into actionable orchestration systems that can leverage LLM reasoning, tools and compute resources in tandem. Moreover, such systems face several challenges, primarily revolving around integration complexity, scalability, security and change management. Integrating disparate systems and data silos, managing the complexities of scaling workflows, addressing security concerns, and overcoming resistance to change within an organization are key hurdles.
PRIOR ART
202541015666 discloses a system and method for automating workflows using artificial intelligence. The system comprises an inputting module to collect and aggregate user details; an organization user management module to process user details by assigning roles and defining access permissions; a task trigger module to detect and receive task notifications from an externally or internally triggered task, extract task-specific content from the notification and generate structured task data; a prompt creation module to process the structured task data and generate an optimized AI prompt; a localized organization-specific large language model (LOS-LLM) to process the optimized AI prompt and generate a contextually relevant AI response; a data storage to store the AI response in at least one database; and a data retrieval module to process a user query received via a user interface and present the retrieved data to the user via the user interface.
US9870295B2 discloses a system that includes a processor and a non-transitory computer-readable medium comprising instructions executable by the processor to cause the system to perform a method. The method comprises receiving a first job to execute and executing the first job. A plurality of data associated with the first job is determined, comprising data associated with a second job executed immediately prior to the first job, a third job executed immediately after the first job, a determination of whether the first job failed or executed successfully, and a type of data associated with the first job. The determined plurality of data is stored.
Although the prior arts disclose a system and method for automating workflows using artificial intelligence, and a processor with a non-transitory computer-readable medium, they have notable shortcomings. The inputting module of the former is configured to collect and aggregate user details only from sources such as student portals, faculty inputs, and institutional repositories, while in the latter the determination that an analytic job has failed may not be immediate owing to the time needed to process large datasets. The user who submitted the workflow may therefore be required to rerun certain portions of the workflow when a condition occurs that the workflow cannot accommodate (e.g., a fault condition). Rerunning portions of the workflow may delay completion and, when a large dataset is being analyzed, the delays may amount to days. Reducing delays in the execution of a workflow of analytic jobs is therefore desirable.
Owing to the aforementioned shortcomings of the available systems, there is a need for an optimized workflow orchestration system such as that of the present invention, which relates to the field of generative artificial intelligence, automation, and workflow orchestration. Specifically, it pertains to a novel system and method for dynamically generating, orchestrating, and executing workflows using Large Language Models (LLMs), tools, and a Model-Compute-Platform (MCP), driven by user queries and historical data.

DEFINITIONS:
The expression “system” used hereinafter in this specification refers to an ecosystem comprising, but not limited to, a system for workflow orchestration with input and output devices, a processing unit, a plurality of mobile devices, and a mobile device-based application. It is extended to computing systems like mobile phones, laptops, computers, PCs, and other digital computing devices.
The expression “input unit” used hereinafter in this specification refers to, but is not limited to, mobile, laptops, computers, PCs, keyboards, mouse, pen drives or drives.
The expression “output unit” used hereinafter in this specification refers to, but is not limited to, an onboard output device, a user interface (UI), a display unit, a local display, a screen, a dashboard, or a visualization platform enabling the user to visualize the graphs provided as output by the system.
The expression “processing unit” refers to, but is not limited to, a processor of at least one computing device that optimizes the system, and acts as the functional unit of the system.
The expression “model-compute platform” (MCP) used hereinafter in this specification refers to a framework that provides the required infrastructure and tools to handle a wide range of workloads, from basic apps to complicated computational processes, and can be deployed on-premises, in the cloud, or at the edge. By underpinning software processes, compute platforms help businesses streamline and adapt.
The expression “directed acyclic graph” (DAG) used hereinafter in this specification refers to a type of graph data structure where nodes (or vertices) are connected by directed edges (arrows), and there are no cycles which means the user cannot end up at the same node from where it has started.
The expression “orchestration” refers to the automated coordination and management of multiple processes, systems, and services to execute a larger workflow or process. It involves streamlining and optimizing the execution of repeatable tasks across different systems, often involving complex workflows and dependencies.
The expression “Large Language Models” or “LLMs” used hereinafter in this specification refers to systems that use natural language understanding to interpret and generate text. In this system, they help interpret queries, generate workflow steps, and determine fallback paths.

OBJECTS OF THE INVENTION:
The primary object of the present invention is to provide a system and method for workflow orchestration using tools, model-compute-platform (MCP) and large language model.
Another object of the present invention is to provide a system and method that enables dynamic decomposition of user queries into subqueries.
Yet another object of the present invention is to provide a system and method that enables contextual enrichment via historical query data.
Yet another object of the present invention is to provide a system and method that enables continuous learning and optimization via historical enrichment.
Yet another object of the present invention is to provide a system and method that enables recursive orchestration of workflow steps and sub-steps using LLMs, tools and model-compute-platform (MCP).
Yet another object of the present invention is to provide a system and method that enables graph-based orchestration and traversal of execution paths including failovers.
Yet another object of the present invention is to provide a system and method that enables comprehensive prompt construction using domain and intent to guide LLM interaction.
Yet another object of the present invention is to provide a system and method that enables execution monitoring and presentation of final results in user-specified styles.
Yet another object of the invention is to provide an adaptive and modular system and method that improves output quality, enabling the system to produce highly personalized, reliable, and accurate results.

SUMMARY
Before the present invention is described, it is to be understood that the present invention is not limited to specific methodologies and materials described, as these may vary as per the person skilled in the art. It is also to be understood that the terminology used in the description is for the purpose of describing the particular embodiments only and is not intended to limit the scope of the present invention.
The present invention discloses a system and method for workflow orchestration using tools, model-compute-platform (MCP) and large language models (LLMs); wherein the system comprises an input unit (1), a processing unit (2), and an output unit (3). The processing unit (2) includes an input handler (21), a historical context engine (22), a subquery generator (23), a workflow generator (24), a task planner and enricher (25), a graph orchestrator (26), a pathfinder module (27), an execution engine (28), and a result compiler (29). These components work in coordination to process user queries, orchestrate workflows using LLMs, tools, and model-compute-platforms (MCPs), and deliver results.
The method begins with the user submitting a natural language query via the input unit. The input handler (21) parses the query, and the historical context engine (22) retrieves past data to infer domain and intent. The subquery generator (23) decomposes the query into subqueries, and the workflow generator (24) uses an LLM to define an initial logical sequence of steps. The task planner and enricher (25) adds metadata to enhance context, and the graph orchestrator (26) constructs a directed acyclic graph (DAG) of these steps. The execution engine (28) executes the workflow node-by-node using LLMs, tools, and MCPs, while the pathfinder module (27) identifies fallback paths in case of failures. The result compiler (29) aggregates and formats the results, which are delivered to the user via the output unit (3).
The system is adaptive, learning from user behavior and historical queries to deliver context-aware results. It supports complex, multi-step reasoning and action planning, and is modular and extensible to accommodate new tools, LLMs, and domains. The pathfinder module enhances fault tolerance through LLM-driven fallback strategies. Continuous enrichment from historical data improves output quality, enabling the system to produce highly personalized, reliable, and accurate results.

BRIEF DESCRIPTION OF DRAWINGS
A complete understanding of the present invention may be gained by reference to the following detailed description, which is to be taken in conjunction with the accompanying drawings. The accompanying drawings, which are incorporated into and constitute a part of the specification, illustrate one or more embodiments of the present invention and, together with the detailed description, serve to explain the principles and implementations of the invention.
FIG. 1. illustrates a schematic representation of the structural and functional components of the system.
FIG. 2. illustrates the stepwise method employed by the system for workflow orchestration.

DETAILED DESCRIPTION OF INVENTION:
Before the present invention is described, it is to be understood that this invention is not limited to methodologies described, as these may vary as per the person skilled in the art. It is also to be understood that the terminology used in the description is for the purpose of describing the particular embodiments only and is not intended to limit the scope of the present invention. Throughout this specification, the word “comprise”, or variations such as “comprises” or “comprising”, will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps. The use of the expression “at least” or “at least one” suggests the use of one or more elements or ingredients or quantities, as the use may be in the embodiment of the invention to achieve one or more of the desired objects or results. Various embodiments of the present invention are described below. It is, however, noted that the present invention is not limited to these embodiments, but rather the intention is that modifications that are apparent are also included.
The present invention describes a system and method for workflow orchestration using tools, model-compute-platform (MCP) and large language models (LLMs). The system (10) comprises an input unit (1), a processing unit (2) further comprising an input handler (21), a historical context engine (22), a subquery generator (23), a workflow generator (24), a task planner and enricher (25), a graph orchestrator (26), a pathfinder module (27), an execution engine (28) and a result compiler (29), and an output unit (3); where all the structural and functional components work in co-ordination to employ a method for workflow orchestration.
In an embodiment of the invention, the input handler (21) accepts natural language queries from the user provided using at least one input unit, and converts the queries into structured representations for further processing to the historical context engine (22). The historical context engine (22) maintains a repository of past queries, user metadata, domains, and intents; uses embeddings and clustering to retrieve and incorporate contextually similar queries and helps in domain and intent inference.
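By way of illustration, the embedding-based retrieval performed by the historical context engine (22) can be sketched as follows. This is a minimal sketch under stated assumptions: the `retrieve_similar` helper, the hand-written vectors, and the metadata fields are hypothetical, and a production system would use learned embeddings and a vector index rather than a linear scan.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def retrieve_similar(query_vec, history, k=3):
    """Return metadata of the k past queries most similar to the new query.
    'history' is a list of (embedding, metadata) pairs; the metadata would
    carry the domain and intent inferred for the earlier query."""
    scored = sorted(history, key=lambda item: cosine(query_vec, item[0]),
                    reverse=True)
    return [meta for _, meta in scored[:k]]
```

The retrieved metadata then seeds domain and intent inference for the new query.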
In a next embodiment of the invention, the subquery generator (23) splits or decomposes the primary user query into logically distinct subqueries using LLMs, where the subqueries are used to seed the orchestration graph. For example, when a request or task is entered by the user, the subquery generator decomposes it recursively into sub-steps using a specific agent such as “think”, creates a plan and reasoning trail for each sub-step, delegates execution to tool-integrated agents such as CodeAgent or SearchAgent, and then uses a validator agent to verify task completion. If the validation fails, the system allows the user to retry or re-plan, and maintains a memory or context between the agents. Thus, it uses an LLM to assist at each step, reducing the task to a unit level that any LLM understands and thereby maintaining higher accuracy.
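The recursive decomposition described above can be sketched as follows. The `plans` table is a hypothetical stand-in for what the “think” agent (an LLM call in the actual system) would return; a task with no sub-steps is treated as atomic.

```python
def think(task):
    """Stand-in for the LLM 'think' agent: returns sub-steps for a task,
    or an empty list when the task is already atomic. A real system would
    prompt an LLM here instead of consulting a fixed table."""
    plans = {
        "generate report": ["fetch data", "summarize data"],
        "fetch data": [],
        "summarize data": [],
    }
    return plans.get(task, [])

def decompose(task):
    """Recursively expand a task into a flat list of atomic sub-steps,
    each small enough for a single LLM or tool call."""
    subs = think(task)
    if not subs:
        return [task]
    result = []
    for sub in subs:
        result.extend(decompose(sub))
    return result
```

The flat list of atomic sub-steps is what seeds the orchestration graph.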
In yet a next embodiment, the workflow generator (24) employs a large language model (LLM) to formulate an initial logical sequence of steps to answer the query, such that each step contains a description, dependencies, expected inputs, and outputs. The task planner and enricher (25) is configured to enrich each step of the sequence with metadata including, but not limited to, domain, intent, format, and prior success patterns, thereby enabling the LLMs to generate comprehensive prompts with historical and domain-specific nuances; the system fuses domain, intent, and historical success into advanced prompts for high-impact results.
In yet a next embodiment of the invention, the graph orchestrator (26) constructs a directed acyclic graph (DAG) from the sequence of steps and its sub-steps where the nodes represent steps and edges define dependencies and conditional paths that depict success or failure. Further, the pathfinder module (27) uses LLMs to explore alternate branches and backtracking strategies and determines alternate paths for failure recovery using LLMs.
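A minimal sketch of such a graph follows, with node names chosen purely for illustration. Each node records its on-success successors and the fallback branch the pathfinder module (27) would take on failure; Kahn's algorithm over the success edges verifies that the graph is acyclic.

```python
from collections import deque

# Hypothetical workflow DAG: each node maps to the successors taken on
# success and the fallback branch taken on failure.
dag = {
    "parse":   {"success": ["plan"],    "failure": []},
    "plan":    {"success": ["execute"], "failure": ["replan"]},
    "replan":  {"success": ["execute"], "failure": []},
    "execute": {"success": [],          "failure": ["replan"]},
}

def is_acyclic(graph):
    """Kahn's algorithm over the success edges: the graph is a DAG iff
    every node appears in a topological order."""
    indegree = {n: 0 for n in graph}
    for node in graph:
        for nxt in graph[node]["success"]:
            indegree[nxt] += 1
    queue = deque(n for n in graph if indegree[n] == 0)
    seen = 0
    while queue:
        node = queue.popleft()
        seen += 1
        for nxt in graph[node]["success"]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                queue.append(nxt)
    return seen == len(graph)

def next_nodes(graph, node, succeeded):
    """Follow the success edge normally; on failure, take the fallback
    branch recorded for this node."""
    return graph[node]["success" if succeeded else "failure"]
```

In the full system the fallback branches would be proposed by an LLM rather than fixed in advance.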
In yet a next embodiment, the execution engine (28) traverses the graph node-by-node, using LLMs, tools, and MCP interfaces at each node to execute the required task. Once the tasks are executed, the execution engine (28) stores the results after verification and passes them downstream to the result compiler (29), which compiles the results once the graph is fully traversed, using style, format, and visualization templates, and delivers the final result in a user-friendly format using at least one output unit (3).
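Node-by-node traversal in dependency order can be sketched as below. The `steps` and `handlers` structures are hypothetical: each handler stands in for an LLM, tool, or MCP call, receiving the results of all previously executed steps.

```python
def run_workflow(steps, handlers):
    """Execute steps in dependency order. 'steps' maps a step name to the
    list of steps it depends on; 'handlers' maps a step name to a callable
    that receives the dict of prior results (a stand-in for an LLM, tool,
    or MCP invocation). Accumulated results feed the result compiler."""
    results = {}
    remaining = dict(steps)
    while remaining:
        # A step is ready once every dependency has produced a result.
        ready = [s for s, deps in remaining.items()
                 if all(d in results for d in deps)]
        if not ready:
            raise RuntimeError("cyclic or unsatisfiable dependencies")
        for step in ready:
            results[step] = handlers[step](results)
            del remaining[step]
    return results
```

In the actual system a failed handler would trigger the pathfinder's fallback path instead of raising.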
In a preferred embodiment of the invention, the method for workflow orchestration includes the steps as follows:
 Providing a natural language input query by the user using an input unit (1),
 accepting and parsing the primary user query by the input handler (21),
 retrieving historical queries to infer domain and intent by the historical context engine (22),
 decomposing the primary query and generating sub-queries by the subquery generator (23),
 employing a large language model (LLM) to formulate an initial logical sequence of steps to answer the query by the workflow generator (24),
 enriching each step with metadata and expanding the predefined steps into sub-steps by the task planner and enricher (25),
 constructing a directed acyclic graph (DAG) from the sequence of steps and its sub-steps by the graph orchestrator (26) such that the nodes represent steps and edges define dependencies and conditional paths,
 executing the required task by traversing the graph node-by-node using LLMs, tools, and MCP interfaces at each node, and
 storing the results after traversing all steps successfully using the execution engine (28) and passing the results downstream,
 finding a fallback path using LLMs to explore alternate branches and backtracking strategies, thereby determining alternate paths for failure recovery and enabling the system to revert the workflow and generate another prompt,
 aggregating the results by the result compiler (29) using style, format, and visualization templates,
 delivering the final result to the user in a user-friendly format using at least one output unit (3).
According to another embodiment of the invention, the present system and method provide significant advantages: the system is adaptive to user behavior and preferences over time and is capable of complex multi-step reasoning and action planning. It is modular and extensible with new tools, LLMs, and domains, and is robust against failure through LLM-driven pathfinding. Further, the system and method are configured for continuous learning and optimization via historical enrichment, enabling the system to produce highly tailored, high-quality results.
While considerable emphasis has been placed herein on the specific elements of the preferred embodiment, it will be appreciated that many alterations can be made and that many modifications can be made in preferred embodiment without departing from the principles of the invention. These and other changes in the preferred embodiments of the invention will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.
CLAIMS:
We claim,
1. A system and method for workflow orchestration using tools, model-compute-platform (MCP) and large language models (LLMs);
wherein the system (10) comprises an input unit (1), a processing unit (2) further comprising an input handler (21), a historical context engine (22), a subquery generator (23), a workflow generator (24), a task planner and enricher (25), a graph orchestrator (26), a pathfinder module (27), an execution engine (28) and a result compiler (29), and an output unit (3); where all the structural and functional components work in co-ordination to employ a method for workflow orchestration;
characterised in that:
the method for workflow orchestration includes the steps as follows;
 providing a natural language input query by the user using an input unit (1),
 accepting and parsing the primary user query by the input handler (21),
 retrieving historical queries to infer domain and intent by the historical context engine (22),
 decomposing the primary query and generating sub-queries by the subquery generator (23),
 employing a large language model (LLM) to formulate an initial logical sequence of steps to answer the query by the workflow generator (24),
 enriching each step with metadata and expanding the predefined steps into sub-steps by the task planner and enricher (25),
 constructing a directed acyclic graph (DAG) from the sequence of steps and its sub-steps by the graph orchestrator (26) such that the nodes represent steps and edges define dependencies and conditional paths,
 executing the required task by traversing the graph node-by-node using LLMs, tools, and MCP interfaces at each node, and
 storing the results after traversing all steps successfully using the execution engine (28) and passing the results downstream,
 finding a fallback path using LLMs to explore alternate branches and backtracking strategies, thereby determining alternate paths for failure recovery and enabling the system to revert the workflow and generate another prompt,
 aggregating the results by the result compiler (29) using style, format, and visualization templates,
 delivering the final result to the user in a user-friendly format using at least one output unit (3).

2. The system and method as claimed in claim 1, wherein the input handler (21) accepts queries in the form of natural language from the user, provided using an input unit, and converts the queries into structured representations for further processing.

3. The system and method as claimed in claim 1, wherein the historical context engine (22) maintains a repository of past queries, user metadata, domains, and intents; uses embeddings and clustering to retrieve and incorporate contextually similar queries and helps in domain and intent inference.

4. The system and method as claimed in claim 1, wherein the subquery generator (23) splits or decomposes the primary user query into logically distinct subqueries using LLMs, where the subqueries are used to seed the orchestration graph.

5. The system and method as claimed in claim 1, wherein the workflow generator (24) employs a large language model (LLM) to formulate an initial logical sequence of steps to answer the query, where each step contains a description, dependencies, expected inputs, and outputs.

6. The system as claimed in claim 1, wherein the task planner and enricher (25) enable the system to enrich each step of the sequence with metadata including domain, intent, format, prior success patterns, thereby enabling the LLMs to generate comprehensive prompts with historical and domain-specific nuances.

7. The system and method as claimed in claim 1, wherein the graph orchestrator (26) constructs a directed acyclic graph (DAG) from the sequence of steps and its sub-steps such that the nodes represent steps and edges define dependencies and conditional paths that depict success or failure.

8. The system and method as claimed in claim 1, wherein the pathfinder module (27) uses LLMs to explore alternate branches and backtracking strategies and determines alternate paths for failure recovery using LLMs.

9. The system and method as claimed in claim 1, wherein the execution engine (28) traverses the graph node-by-node; such that the system uses LLMs, tools, and MCP interfaces at each node, to execute the required task; stores the results after verification; and passes to the result compiler (29) configured to compile the results in a user-friendly format using style, format, and visualization templates.

10. The system and method as claimed in claim 1, which is adaptive to user behavior and preferences over time; capable of complex multi-step reasoning and action planning; modular and extensible with new tools, LLMs, and domains; robust against failure through LLM-driven pathfinding; configured for continuous learning and optimization via historical enrichment; and produces highly tailored, high-quality results.

Dated this 11th day of July, 2025.

Documents

Application Documents

# Name Date
1 202521066231-STATEMENT OF UNDERTAKING (FORM 3) [11-07-2025(online)].pdf 2025-07-11
2 202521066231-POWER OF AUTHORITY [11-07-2025(online)].pdf 2025-07-11
3 202521066231-FORM 1 [11-07-2025(online)].pdf 2025-07-11
4 202521066231-FIGURE OF ABSTRACT [11-07-2025(online)].pdf 2025-07-11
5 202521066231-DRAWINGS [11-07-2025(online)].pdf 2025-07-11
6 202521066231-DECLARATION OF INVENTORSHIP (FORM 5) [11-07-2025(online)].pdf 2025-07-11
7 202521066231-COMPLETE SPECIFICATION [11-07-2025(online)].pdf 2025-07-11
8 202521066231-FORM-9 [26-09-2025(online)].pdf 2025-09-26
9 202521066231-FORM 18 [01-10-2025(online)].pdf 2025-10-01
10 Abstract.jpg 2025-10-08