
Method And System For Service Configuration And Jobflow Execution For Analytical Adaptor

Abstract: Interoperability of systems working in different technology areas is an area of concern from an application point of view, as the execution of an application may require the use of systems on different platforms. The disclosure herein generally relates to analytical adaptors, and, more particularly, to a method and system for service configuration and job flow execution for an analytical adaptor. The job flow creation and execution performed by the system, for a given input, supports interpretation of the processes at a generic level, irrespective of the format of the input data, thereby allowing the system to act as an adaptor supporting interfacing between different systems on different technology platforms.


Patent Information

Application #
202121039639
Filing Date
01 September 2021
Publication Number
09/2023
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
kcopatents@khaitanco.com
Parent Application

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point Mumbai Maharashtra India 400021

Inventors

1. KULKARNI, Aniket
Tata Consultancy Services Limited Commerzone Building No 7, Samrat Ashok Path, Yerwada, Pune Maharashtra India 411006
2. HOSUDURG, Anantha Desik Puranam
Tata Consultancy Services Limited Deccan Park, Plot No 1, Survey No. 64/2, Software Units Layout, Serilingampally Mandal, Madhapur, Hyderabad Telangana India 500081
3. ROY, Ashim
Tata Consultancy Services Limited Commerzone Building No 7, Samrat Ashok Path, Yerwada, Pune Maharashtra India 411006
4. DODDA, Surekha
Tata Consultancy Services Limited Chennai One SEZ unit, IG3 Infrastructure Services Ltd, 200 FT Thoraipakkam - Pallavaram Ring Road, Chennai Tamil Nadu India 600096
5. PAUL, Spondita
Tata Consultancy Services Limited -9-62, 6th Floor, Khan Latif Khan Estate, Fateh Maidan Road, Hyderabad Telangana India 500001
6. PATWARDHAN, Nikhil
Tata Consultancy Services Limited Commerzone Building No 7, Samrat Ashok Path, Yerwada, Pune Maharashtra India 411006
7. NALLAMREDDY, Venkata Bala Tripura Sundari
Tata Consultancy Services Limited Deccan Park, Plot No 1, Survey No. 64/2, Software Units Layout, Serilingampally Mandal, Madhapur, Hyderabad Telangana India 500081

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION (See Section 10 and Rule 13)
Title of invention:
METHOD AND SYSTEM FOR SERVICE CONFIGURATION AND JOBFLOW EXECUTION FOR ANALYTICAL ADAPTOR
Applicant
Tata Consultancy Services Limited A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman point, Mumbai 400021,
Maharashtra, India
Preamble to the description
The following specification particularly describes the invention and the manner in which it is to be performed.

TECHNICAL FIELD
[001] The disclosure herein generally relates to analytical adaptors, and, more particularly, to a method and system for service configuration and job flow execution for analytical adaptor.
BACKGROUND
[002] Execution of an application may require handling data processing related to different technology areas. Various data processing systems are available to handle data processing in the different technology areas. However, due to the differences these systems may have in terms of processes, data formats, and so on, it is not possible to interface these systems directly, and this in turn causes convenience and productivity issues. For example, the data types being used by two systems may be different. As a result, even though these systems are sequentially placed, the output of one system may not match the data format required as input by the next system.
SUMMARY
[003] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one embodiment, a system for service configuration is provided. The system includes one or more hardware processors, an I/O interface, and a memory storing a plurality of instructions. The plurality of instructions when executed, cause the one or more hardware processors to receive values of a plurality of analytical parameters, as input data. The system then populates a plurality of tables using the input data. The system further determines relationship among the plurality of analytical parameters, by applying an analytical algorithm on the plurality of analytical parameters. The system then establishes relationship among the plurality of tables, based on the determined relationship among the plurality of analytical parameters. Further the system constructs a template script for the input data and invokes the template script through a plurality of parameters. The system then performs a job flow configuration for the input data, wherein performing the job flow configuration comprises generating a job flow for the input data. Further the system executes the job flow to invoke a corresponding analytical tool and at least one data model matching the analytical tool, and results of execution of the job flow are generated.
[004] In another aspect, a processor implemented method of service configuration is provided. In this method, initially values of a plurality of analytical parameters are received as input data, via one or more hardware processors. Further, a plurality of tables are populated using the input data, via the one or more hardware processors. Further, relationship among the plurality of analytical parameters are determined by applying an analytical algorithm on the plurality of analytical parameters, via the one or more hardware processors. Further, a relationship among the plurality of tables is established based on the determined relationship among the plurality of analytical parameters, via the one or more hardware processors. Further, a template script for the input data is constructed, which is then invoked through a plurality of parameters, via the one or more hardware processors. Further, a job flow configuration is performed for the input data, wherein performing the job flow configuration comprises generating a job flow for the input data, via the one or more hardware processors. The job flow is then executed by invoking a corresponding analytical tool and at least one data model matching the analytical tool, via the one or more hardware processors, and accordingly results of execution of the job flow are generated.
[005] In yet another aspect, a non-transitory computer readable medium for service configuration is provided. The non-transitory computer readable medium includes a plurality of instructions which when executed, cause one or more hardware processors to perform the following actions for the service configuration. Initially values of a plurality of analytical parameters are received as input data, via one or more hardware processors. Further, a plurality of tables are populated using the input data, via the one or more hardware processors. Further, relationship among the plurality of analytical parameters are determined by applying an analytical algorithm on the plurality of analytical parameters, via the one or more hardware processors. Further, a relationship among the plurality of tables is established based on the determined relationship among the plurality of analytical parameters, via the one or more hardware processors. Further, a template script for the input data is constructed, which is then invoked through a plurality of parameters, via the one or more hardware processors. Further, a job flow configuration is performed for the input data, wherein performing the job flow configuration comprises generating a job flow for the input data, via the one or more hardware processors. The job flow is then executed by invoking a corresponding analytical tool and at least one data model matching the analytical tool, via the one or more hardware processors, and accordingly results of execution of the job flow are generated.
[006] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[007] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
[008] FIG. 1 illustrates a block diagram of a system for service configuration according to some embodiments of the present disclosure.
[009] FIGS. 2A and 2B (collectively referred to as FIG. 2) is a flow diagram depicting steps involved in the process of performing the service configuration using the system of FIG. 1, according to some embodiments of the present disclosure.
[010] FIG. 3 is a flow diagram depicting steps involved in the process of job flow execution by the system of FIG. 1, in accordance with some embodiments of the present disclosure.

DETAILED DESCRIPTION OF EMBODIMENTS
[011] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments.
[012] Referring now to the drawings, and more particularly to FIG. 1 through FIG. 3, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[013] FIG. 1 illustrates a block diagram of a system for service configuration according to some embodiments of the present disclosure. In an embodiment, the system 100 includes a processor(s) 104, communication interface device(s), alternatively referred to as input/output (I/O) interface(s) 106, and one or more data storage devices or a memory 102 operatively coupled to the processor(s) 104. The system 100 with one or more hardware processors is configured to execute functions of one or more functional blocks of the system 100.
[014] Referring to the components of system 100, in an embodiment, the processor(s) 104 can be one or more hardware processors 104. In an embodiment, the one or more hardware processors 104 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the one or more hardware processors 104 are configured to fetch and execute computer-readable instructions stored in the memory 102. In an embodiment, the system 100 can be implemented in a variety of computing systems including laptop computers, notebooks, hand-held devices such as mobile phones, workstations, mainframe computers, servers, and the like.
[015] The I/O interface(s) 106 can include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface to display the generated results, and the like, and can facilitate multiple communications within a wide variety of networks N/W and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, and the like. In an embodiment, the I/O interface(s) 106 can include one or more ports for connecting to a number of external devices or to another server or devices.
[016] The memory 102 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[017] Further, the memory 102 includes a database 108 that stores all data associated with the service configuration being performed by the system 100. For example, information on the analytical parameters collected as input, populated tables, determined relations among the analytical parameters, determined relationship among the tables, constructed template script, generated job flow, results generated by executing the job flow and so on may be stored in the database 108, at least temporarily. Also, a plurality of executable instructions (which form the non-transitory computer readable medium) are stored in the database 108, such that the executable instructions when executed by the one or more hardware processors, cause the service configuration. Functions of the components of the system 100 are explained in conjunction with FIG. 2 and FIG. 3.
[018] FIGS. 2A and 2B (collectively referred to as FIG. 2) is a flow diagram depicting steps involved in the process of performing the service configuration using the system of FIG. 1, according to some embodiments of the present disclosure.

[019] The system 100 may be configured to serve as an adaptor between two different test systems working on different technologies, to facilitate training of analytical tools in the test systems. The adaptor allows interpretation of technologies used by the test systems at a generic level, thereby allowing interoperability between the test systems. The flow diagram of method 200 is executed by the system 100 of FIG. 1, in accordance with embodiments of the present disclosure. In an embodiment, the system 100 comprises one or more data storage devices or the memory 102 operatively coupled to the processor(s) 104 and is configured to store instructions for execution of steps of the method 200 by the processor(s) or one or more hardware processors 104. The steps of the method 200 of the present disclosure will now be explained with reference to the components or blocks of the system 100 as depicted in FIG. 1 and the steps of the flow diagrams as depicted in FIG. 2 and FIG. 3. Although process steps, method steps, techniques or the like may be described in a sequential order, such processes, methods, and techniques may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.
[020] The system 100 may be interfaced with the test systems using one or more suitable channels/interfaces provided by the I/O interface 106. In order to perform the service configuration, at step 202, the system 100 collects/receives values of a plurality of analytical parameters as input, from the test system. For example, if the analytical parameter is ‘Analytics Engine Type’, then the values may be ‘R’ and ‘Python’. The different analytical parameters and corresponding values may form a process configuration. The process configuration may form part of a ‘requirement’ for the service configuration.
[021] Further, at step 204, the system 100 populates a plurality of tables using the input data, such that process level data for the service configuration is captured in the tables. The system 100 may use any suitable technique, such as but not limited to Natural Language Processing (NLP), to extract the details from the input data, which can then be stored in the different tables. Based on the type of input data, appropriate data processing techniques may be used and the system 100 may be configured accordingly. Examples of the tables are given below:
a. Process master table
The process master table is used for storing information on various processes, and acts as a metadata table for process configuration. The system 100 may use this information to further create instances of the processes during job flow configuration.

Process_description | Process_display_name | Process_type | Service_URL_pattern
Python/R adaptor | Analytics Engine | Analytics Engine | URL
Table. 1

b. Process_parameter_configuration_master Table
The system 100 may use this table to store information about various parameters (that can be configured at process level). The table contains different information on various aspects of each analytical parameter, in different columns. For example, the table has columns such as 'parameter key', 'display name', 'parameter ui', and their values. For example, in Table. 2, the analytical parameter considered is 'AnalyticsEngine', and it can contain two values, 'Python' and 'R'. The param UI value (42) represents a combo box UI component.

displayname | Paramkey | paramname | paramvalue | Paramui
Analytics Engine | keyAnalyticsEngine | Analytics Engine | Python, R | 42
Table. 2
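As a purely illustrative sketch (the disclosure does not specify a storage engine or DDL), the process level tables of Table. 1 and Table. 2 might be created and populated along the following lines; the use of SQLite and the exact column types are assumptions.

import sqlite3

# Hedged sketch: create and populate the process level tables of Table. 1 and
# Table. 2. SQLite and the column types are assumptions; the disclosure only
# names the tables and their columns.
conn = sqlite3.connect("adaptor_config.db")
cur = conn.cursor()

cur.execute("""CREATE TABLE IF NOT EXISTS process_master (
    process_description TEXT, process_display_name TEXT,
    process_type TEXT, service_url_pattern TEXT)""")
cur.execute("""CREATE TABLE IF NOT EXISTS process_parameter_configuration_master (
    displayname TEXT, paramkey TEXT, paramname TEXT,
    paramvalue TEXT, paramui INTEGER)""")

# Row of Table. 1: the Analytics Engine process (Python/R adaptor).
cur.execute("INSERT INTO process_master VALUES (?, ?, ?, ?)",
            ("Python/R adaptor", "Analytics Engine", "Analytics Engine", "URL"))
# Row of Table. 2: the 'AnalyticsEngine' parameter with values Python and R,
# rendered as a combo box (param UI code 42).
cur.execute("INSERT INTO process_parameter_configuration_master VALUES (?, ?, ?, ?, ?)",
            ("Analytics Engine", "keyAnalyticsEngine", "Analytics Engine", "Python, R", 42))
conn.commit()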

[022] Similar to how various process level information is captured in different tables in step 204, the system 100 also captures information on one or more rules in appropriate tables. In an embodiment, information on the one or more rules may be collected as input by the system 100. Such rules define/determine how the job flow configuration is generated and executed. Some examples of tables that may be used to store the rule information are:
a. Rule_master table
This table may be used by the system 100 to store metadata information on the various rules. The system 100 may create instances of such rules during the configuration of the job flow. For example, as in Table. 3, data such as but not limited to display_name, rule_description, rule_name, and level_datatype may be stored.

Displayname | Ruledescription | Rulename | Leveldatatype
Logistic Regression | Logistic Regression rule | Logistic Regression | 12
Table. 3
b. Process_param_configuration_Table
The system 100 uses the process_param_configuration_master table, similar to the one used for storing the process related information, for storing the parameters of a rule. This is depicted in Table. 4. Table. 4 contains columns 'param name', 'param key', and so on. For the rule parameter 'InputColumn', the param UI value is 30, which indicates a combo box for selection of columns. The param value 'Multi Select' indicates that multiple columns can be selected during configuration of the job flow.

displayname | Paramkey | paramname | Paramvalue | Paramui
Input columns | keyInputColumn | InputColumns | Multi Select | 30
Output column | keyOutputColumn | OutputColumn | Single Select | 30
Model File Name | keyModelFilename | ModelFileName | | 53
Table. 4
[023] Further, at step 206, the system 100 applies the analytical algorithm (which may be logistic regression, analytical regression, and so on, as configured with the system 100) on the process data and the rule data captured in the tables. By applying the analytical algorithm, the system 100 determines the relationship among the different analytical parameters captured in the tables. For example, specific rules may be associated with specific processes. Such relationship/dependency (even at parameter level) is determined at step 206. Then, at step 208, based on the determined relationship among the analytical parameters, the system 100 determines the relationship among the different tables. For example, based on the relationship between a process and a rule determined at step 206, the system 100 determines that the process master table and the rule master table of the process and the rule are connected, and accordingly a mapping between the tables is determined and established. An example is given below in Table. 5:

processmaster
  id                   int
  process_id           varchar
  process_type         varchar
  processdescrption    varchar
  service_url_pattern  varchar

process_param_configuration_master
  param_id       int
  process_id     int
  rule_id        int
  paramname      varchar
  displayname    varchar
  paramcatatype  varchar
  paramui        varchar
  paramvalue     varchar
  paramkey       varchar

rulemaster
  rulename        varchar
  id              int
  process_type    varchar
  columndatatype  varchar

Table. 5
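As one hedged illustration of how this mapping could be realized (the disclosure does not prescribe a specific database or query language), the schema of Table. 5 can be expressed in SQLite and the parameter level relationship resolved back to its process and rule with a join; the sample rows below are invented for the sketch.

import sqlite3

# Hedged sketch of step 208: materialize the Table. 5 schema and resolve the
# parameter level relationship back to the process and rule master tables.
# SQLite, the DDL, and the sample rows are assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE processmaster (
        id INTEGER, process_id TEXT, process_type TEXT,
        processdescrption TEXT, service_url_pattern TEXT);
    CREATE TABLE rulemaster (
        rulename TEXT, id INTEGER, process_type TEXT, columndatatype TEXT);
    CREATE TABLE process_param_configuration_master (
        param_id INTEGER, process_id INTEGER, rule_id INTEGER,
        paramname TEXT, displayname TEXT, paramcatatype TEXT,
        paramui TEXT, paramvalue TEXT, paramkey TEXT);

    INSERT INTO processmaster VALUES
        (1, 'P1', 'Analytics Engine', 'Python/R adaptor', 'URL');
    INSERT INTO rulemaster VALUES
        ('Logistic Regression', 1, 'Analytics Engine', '12');
    INSERT INTO process_param_configuration_master VALUES
        (1, 1, 1, 'InputColumns', 'Input columns', 'varchar',
         '30', 'Multi Select', 'keyInputColumn');
""")

# Resolve each configured parameter to its owning process and rule.
for row in conn.execute("""
        SELECT pm.process_type, rm.rulename, c.paramname, c.paramvalue
        FROM process_param_configuration_master AS c
        JOIN processmaster AS pm ON pm.id = c.process_id
        JOIN rulemaster AS rm ON rm.id = c.rule_id"""):
    print(row)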
[024] Further, at step 210, the system 100 constructs a template script for the input data. In an embodiment, constructing the template script may involve the system 100 fetching a script pre-configured with the system 100. In an embodiment, the system 100 provides a suitable interface for a user to configure the template script with the system 100, which in turn may be stored in the database 108 and may be called by the system 100. In an embodiment, the name of the template script should be the same as the name of the created rule. This naming convention enables the system 100 to select the template script automatically for execution. The system 100 then invokes the template script. The template script, when invoked, invokes the analytical algorithm.
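Purely as an illustrative sketch (the disclosure does not give the contents of a template script), a Python template script for the 'Logistic Regression' rule, named after the rule and taking the rule parameters of Table. 4 as command line arguments, might look as follows; the use of pandas, scikit-learn, and joblib, and the flag names, are assumptions.

# LogisticRegression.py -- hypothetical template script, named after the rule.
import argparse
import joblib
import pandas as pd
from sklearn.linear_model import LogisticRegression

parser = argparse.ArgumentParser()
parser.add_argument("--input-data")       # path to the input dataset (assumed CSV)
parser.add_argument("--input-columns")    # e.g. "thall, stn, propran"
parser.add_argument("--output-column")    # e.g. "sex"
parser.add_argument("--model-file-name")  # e.g. "testPython12"
args = parser.parse_args()

data = pd.read_csv(args.input_data)
features = [c.strip() for c in args.input_columns.split(",")]

# Train the model configured by the rule and persist it under the given name.
model = LogisticRegression(max_iter=1000)
model.fit(data[features], data[args.output_column])
joblib.dump(model, args.model_file_name + ".pkl")

# Print a simple model summary; the adaptor collects this output (step 216).
print("training accuracy:", model.score(data[features], data[args.output_column]))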
[025] Further, at step 212, the system 100 generates a job flow by performing a job flow configuration for the input data. Performing the job flow configuration involves two steps: a) creating an instance of the process, and b) creating an instance of the rule. In general, the job flow defines, based on the determined relations among the parameters, the rules, the algorithms, and the output types, a flow of execution of the process, and forms an instance of the process. For example, the job flow may be in (analytical parameters -> analytical engine -> output types) format. In addition, the created instance of the rule may specify InputColumns, OutputColumns, and a ModelFilename.
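As a hedged sketch only, a job flow in this chained format could be assembled programmatically as shown below; the element structure mirrors the processflow_json of Table. 6 further down, while the helper function itself is hypothetical.

import json

def build_job_flow(flow_name, blocks):
    """Hypothetical helper: chain (label, type, subtype) blocks into the
    elementList/connectorList structure used by processflow_json (Table. 6)."""
    elements, connectors = [], []
    for i, (label, etype, subtype) in enumerate(blocks, start=1):
        elements.append({"label": label,
                         "elementId": f"DPElement{i}",
                         "elementType": etype,
                         "elementSubType": subtype})
        if i > 1:  # connect each block to the previous one
            connectors.append({"sourceElementId": f"DPElement{i - 1}",
                               "targetELementId": f"DPElement{i}"})
    return [{"elementList": elements,
             "connectorList": connectors,
             "processflowName": flow_name}]

flow = build_job_flow("jfTestPythonAdapter12", [
    ("InputRelationalDatasource", "Input", "Relational Datasource"),
    ("Analytics Engine", "Process", "Analytics Engine"),
    ("OutputRelationalDatasource", "Output", "Relational Datasource"),
])
print(json.dumps(flow, indent=2))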
[026] Further, after generating the job flow, the system 100 may store information on various parameters of the job flow (i.e., the job flow configuration) in a plurality of tables. Some examples of the tables in which the details of the job flow configuration are stored are the processflow_instance table, the process_instance table, and the step_instance table.
a. processflow_instance table
[027] The system 100 stores the structure of the job flow in the processflow_instance table. The table contains the columns 'name' (to store the job flow name) and 'processflow_json'. The column processflow_json contains elements, namely an input block (for example, RDBMS), a process block (Analytics Engine), and an output block. The table also contains information on the connections between these elements. In an embodiment, the system 100 may automatically identify and extract the process details based on the process name (i.e., Analytics Engine in this example). An example is given in Table. 6.

[
  {
    "elementList": [
      {
        "label": "InputRelationalDatasource",
        "elementId": "DPElement1",
        "elementType": "Input",
        "elementSubType": "Relational Datasource"
      },
      {
        "label": "Analytics Engine",
        "elementId": "DPElement2",
        "elementType": "Process",
        "elementSubType": "Analytics Engine"
      },
      {
        "label": "OutputRelationalDatasource",
        "elementId": "DPElement3",
        "elementType": "Output",
        "elementSubType": "Relational Datasource"
      }
    ],
    "connectorList": [
      {
        "sourceElementId": "DPElement1",
        "targetELementId": "DPElement2"
      },
      {
        "sourceElementId": "DPElement2",
        "targetELementId": "DPElement3"
      }
    ],
    "processflowName": "jfTestPythonAdapter12"
  }
]
Table. 6

b. Process_instance table
[028] The system 100 stores the instance of the process (that is, the Analytics Engine in this example) in this table. This is the instance table for the process_master table. For each process present in the process_master table, an instance of the process is created by the system 100 and is captured in the process_instance table (Table. 7). Table. 7 shows the columns, which are the (process) 'name' and 'process_parameters'. Here the process name is 'Analytics Engine'. The process instance has the parameter 'Python', which is captured using the column process_parameters.

name | Process_parameters
Analytics Engine | {"Analytics Engine": "Python"}
Analytics Engine | {"Analytics Engine": "R"}
Table. 7
c. Step_instance table
[029] The system 100 stores the instance of the rule in this table. This is the instance table for the rule_master table. For each rule present in the rule_master table, an instance of the rule can be created and stored in the step_instance table. The table (Table. 8) shows the columns, which are 'name' (of the rule) and 'step_parameters' (rule parameters). For example, here the rule name is 'Logistic Regression', and the rule instance has parameters such as InputColumns, OutputColumn, and so on. Parameter values (for example, the value "thall, stn, propran" for the parameter InputColumns) are also captured in this table.

Name | step_parameters
Logistic regression | {"InputColumns": "thall, stn, propran", "OutputColumn": "sex", "ModelFileName": "testPython12"}
Table. 8

[030] Further, at step 214, the system 100 executes the generated job flow. Executing the job flow involves invoking an analytical tool, and at least one data model matching the analytical tool. The system 100 determines the analytical tool, and the at least one data model to be invoked, based on the generated job flow. Various steps involved in the process of executing the job flow are depicted in FIG. 3.
[031] At step 302, the system 100 extracts information on the input data, a data structure, a process name, and a process instance ID, from the tables. For example, the system 100 obtains a data row from the processflow_instance table by accepting the job flow name as input. The system 100 obtains information about the input database using the input block details. The system 100 then constructs input database structures (DataSourceArray). The system 100 also locates the process name and ID of the process_instance in the repository, using the process block details.
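A hedged sketch of this extraction step is given below; it assumes the processflow_json structure of Table. 6 has already been read from the processflow_instance table, and the function name, return shape, and the stand-in for the DataSourceArray are illustrative only.

import json

# Shortened processflow_json, as it would be fetched from processflow_instance
# for the given job flow name (step 302).
processflow_json = """[{"processflowName": "jfTestPythonAdapter12",
  "elementList": [
    {"label": "InputRelationalDatasource", "elementId": "DPElement1",
     "elementType": "Input", "elementSubType": "Relational Datasource"},
    {"label": "Analytics Engine", "elementId": "DPElement2",
     "elementType": "Process", "elementSubType": "Analytics Engine"},
    {"label": "OutputRelationalDatasource", "elementId": "DPElement3",
     "elementType": "Output", "elementSubType": "Relational Datasource"}],
  "connectorList": []}]"""

def extract_blocks(flow_json):
    """Split the job flow into its input, process and output blocks."""
    flow = json.loads(flow_json)[0]
    blocks = {"Input": [], "Process": [], "Output": []}
    for element in flow["elementList"]:
        blocks[element["elementType"]].append(element)
    return flow["processflowName"], blocks

flow_name, blocks = extract_blocks(processflow_json)
data_source_array = [b["elementSubType"] for b in blocks["Input"]]  # stand-in for DataSourceArray
process_name = blocks["Process"][0]["label"]                        # "Analytics Engine"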
[032] Further, at step 304, the system 100 fetches information on the process instance from the process_instance table. In this step, the system 100 obtains a data row from the table process_instance, given the process block details as input. This data contains information on the process (that is, the Analytics Engine). It also contains the process parameters (that is, Python). The system 100 then constructs an in-memory process structure (for example, a DSProcess). The system 100 also locates the rule instance details associated with this process_instance.
[033] Further, at step 306, the system 100 fetches information on the step_instance from the step_instance table. In this step, the system 100 obtains a data row from the table step_instance, given the rule details present in the process instance. This data contains details of the rule (that is, Logistic Regression) and its parameters (that is, InputColumns and so on) configured with the system 100. The system 100 then creates an in-memory structure for the rule. This structure is then attached to the created DSProcess structure.
[034] Further, at step 308, the system 100 fetches process level information from the process_master table, given the process_id as input (which is present in the process_instance data row). This data contains the service_url details for the given process (that is, the Analytics Engine). The system 100 uses the service_url details to invoke a REST API. The relationship between the tables, which allows the system 100 to identify and extract the process level information for the process_id given as input, is depicted in Table. 9.

processinstance
  id                       int
  name                     varchar
  processmasterfk          int
  processsourcearrayfk     int
  description              varchar
  outputdatasourcearrayfk  int
  process_parameters       bytea

processmaster
  id                   int
  process_id           varchar
  process_type         varchar
  processdescription   varchar
  service_url_pattern  varchar

stepinstance
  id                 int
  name               varchar
  parentstepidfk     int
  processinstancefk  int
  step_parameters    bytea

rulemaster
  rulename        varchar
  id              int
  process_type    varchar
  columndatatype  varchar

Table. 9

[035] The process level information, the rule level information, and the instance level information together form a service contract. After constructing the service contract using the combination of the aforementioned data at step 310, the system 100, at step 312, invokes the logic of the process (i.e., the Analytics Engine) using the REST API, and results are obtained. This logic first extracts parameters from the service contract, and then invokes the analytics tool. Further, the output generated by the analytics tool is collected.
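Purely as a sketch of steps 310 and 312, the service contract could be assembled as a dictionary and posted to the service URL resolved from the service_url_pattern of process_master; the field names, the endpoint, and the assumption that the REST API accepts JSON are all hypothetical.

import requests

# Hypothetical service contract combining process, rule and instance level data
# (the disclosure does not fix its exact shape).
service_contract = {
    "process": {"name": "Analytics Engine",
                "parameters": {"Analytics Engine": "Python"}},
    "rule": {"name": "Logistic Regression",
             "parameters": {"InputColumns": "thall, stn, propran",
                            "OutputColumn": "sex",
                            "ModelFileName": "testPython12"}},
    "inputDataSources": ["Relational Datasource"],
}

# service_url would be resolved from the service_url_pattern column of
# process_master; the concrete URL below is only a placeholder.
service_url = "http://localhost:8080/analytics-engine/execute"
response = requests.post(service_url, json=service_contract, timeout=60)
results = response.json()  # output collected from the analytics tool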

[036] Extracting the parameters from the service contract, by the system 100, involves the following steps:
a. Extraction of configured Process
[037] In this step, the system 100 obtains details of the configured process (that is, the Analytics Engine) along with its parameter (that is, Python). Using this parameter, the system 100 locates the configuration file containing details (such as the python.exe executable file path and the library path) of the analytics engine.
b. Extraction of configured rule
[038] In this step, the system 100 extracts details of the configured analytics algorithm (that is, Logistic Regression). Using this rule name, the system 100 locates the corresponding template script automatically. This template script is passed as input to the analytical tool.
c. Formation of Command line arguments
[039] In this step, the system 100 extracts the rule parameters. For example, the input column names and the output column of the logistic regression algorithm are extracted. The system 100 also constructs command line arguments for these rule parameters.
d. Invocation of analytics tool in command line
[040] In this step, the system 100 invokes the analytics tool (that is, Python) in command line mode, with the (Python) script for the corresponding rule (that is, Logistic Regression) and its parameters as input. After execution, the Python analytical tool generates the required models and outputs.
e. Processing of generated output
[041] In this step, the system 100 reads the outputs (for example, a model summary) generated by the analytical tool at step 216, and the system displays the outputs on the UI.
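A hedged sketch of steps (c) and (d) above is given below: command line arguments are formed from the rule parameters and the analytics tool (Python) is invoked with the template script located for the rule. The flag names match the hypothetical template script sketched earlier, and the dataset path is invented.

import subprocess

# Rule name and parameters extracted from the service contract (steps b and c).
rule_name = "LogisticRegression"
rule_params = {"InputColumns": "thall, stn, propran",
               "OutputColumn": "sex",
               "ModelFileName": "testPython12"}

# Command line arguments formed from the rule parameters; the flag names are
# assumptions matching the illustrative template script above.
args = ["--input-data", "patients.csv",  # hypothetical input dataset path
        "--input-columns", rule_params["InputColumns"],
        "--output-column", rule_params["OutputColumn"],
        "--model-file-name", rule_params["ModelFileName"]]

# Step (d): invoke the analytics tool in command line mode with the template
# script named after the rule. "python" stands in for the executable path read
# from the engine configuration file.
completed = subprocess.run(["python", rule_name + ".py"] + args,
                           capture_output=True, text=True)
print(completed.stdout)  # e.g. model summary, displayed on the UI (step e)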
[042] The job flow creation and execution performed by the system 100, for a given input, supports interpretation of the processes at a generic level, irrespective of the format of the input data, thereby allowing the system 100 to act as an adaptor supporting interfacing between the different test systems working on the different technology platforms.

[043] In an embodiment, various steps involved in the process 200 may be performed in the same order as depicted in FIG. 2, or in any alternate order that is technically feasible. In another embodiment, one or more steps in method 200 may be omitted.
[044] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[045] The embodiment of the present disclosure herein addresses the unresolved problem of process flow configuration. The embodiment thus provides a mechanism for service configuration.
[046] It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g., any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g., hardware means like e.g., an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g., an ASIC and an FPGA, or at least one microprocessor and at least one memory with software processing components located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g., using a plurality of CPUs.

[047] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various components described herein may be implemented in other components or combinations of other components. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[048] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[049] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term "computer-readable medium" should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[050] It is intended that the disclosure and examples be considered as exemplary only, with a true scope of disclosed embodiments being indicated by the following claims.

We Claim:
1. A system (100), comprising:
one or more hardware processors (104);
an I/O interface (106); and
a memory (102) storing a plurality of instructions, wherein the plurality of instructions when executed, cause the one or more hardware processors to:
receive values of a plurality of analytical parameters, as input data;
populate a plurality of tables using the input data;
determine relationship among the plurality of analytical parameters, by applying an analytical technique on the plurality of analytical parameters;
establish relationship among the plurality of tables, based on the determined relationship among the plurality of analytical parameters;
construct a template script for the input data and invoke the template script through a plurality of parameters;
perform job flow configuration for the input data, wherein performing the job flow configuration comprises generating a job flow for the input data;
execute the job flow to invoke a corresponding analytical tool and at least one data model matching the analytical tool; and
generate results of execution of the job flow.
2. The system as claimed in claim 1, wherein the system performs the job flow configuration by creating an instance of a configured rule for an analytical engine process, comprising:
extracting information on the input data, a data structure, a process name, and a process instance ID, from the plurality of tables;
extracting information on a process associated with the job flow, from one of the plurality of tables;
extracting information on a step instance, from one of the plurality of tables;
extracting information on service URL details corresponding to the process, from a master table from among the plurality of tables;
constructing a service contract using the extracted information on the input data, the data structure, the process name, the process instance ID, the information on the process associated with the job flow, the step instance, and the information on the service URL details corresponding to the process; and
executing the service contract using one or more Application Programming Interfaces (API).
3. A processor implemented method (200), comprising:
receiving (202) values of a plurality of analytical parameters, as input data, via one or more hardware processors;
populating (204) a plurality of tables using the input data, via the one or more hardware processors;
determining (206) relationship among the plurality of analytical parameters, by applying an analytical technique on the plurality of analytical parameters, via the one or more hardware processors;
establishing (208) relationship among the plurality of tables, based on the determined relationship among the plurality of analytical parameters, via the one or more hardware processors;
constructing (210) a template script for the input data and invoking the template script through a plurality of parameters, via the one or more hardware processors;
performing (212) job flow configuration for the input data, wherein performing the job flow configuration comprises generating a job flow for the input data, via the one or more hardware processors;
executing (214) the job flow to invoke a corresponding analytical tool and at least one data model matching the analytical tool, via the one or more hardware processors; and
generating (216) results of execution of the job flow, via the one or more hardware processors.
4. The method as claimed in claim 3, wherein performing the job flow configuration comprises creating an instance of a configured rule for an analytical engine process, by:
extracting (302) information on the input data, a data structure, a process name, and a process instance ID, from the plurality of tables;
extracting (304) information on a process associated with the job flow, from one of the plurality of tables;
extracting (306) information on a step instance, from one of the plurality of tables;
extracting (308) information on service URL details corresponding to the process, from a master table from among the plurality of tables;
constructing (310) a service contract using the extracted information on the input data, the data structure, the process name, the process instance ID, the information on the process associated with the job flow, the step instance, and the information on the service URL details corresponding to the process; and
executing (312) the service contract using one or more Application Programming Interfaces (API).

Documents

Application Documents

# Name Date
1 202121039639-STATEMENT OF UNDERTAKING (FORM 3) [01-09-2021(online)].pdf 2021-09-01
2 202121039639-REQUEST FOR EXAMINATION (FORM-18) [01-09-2021(online)].pdf 2021-09-01
3 202121039639-PROOF OF RIGHT [01-09-2021(online)].pdf 2021-09-01
4 202121039639-FORM 18 [01-09-2021(online)].pdf 2021-09-01
5 202121039639-FORM 1 [01-09-2021(online)].pdf 2021-09-01
6 202121039639-FIGURE OF ABSTRACT [01-09-2021(online)].jpg 2021-09-01
7 202121039639-DRAWINGS [01-09-2021(online)].pdf 2021-09-01
8 202121039639-DECLARATION OF INVENTORSHIP (FORM 5) [01-09-2021(online)].pdf 2021-09-01
9 202121039639-COMPLETE SPECIFICATION [01-09-2021(online)].pdf 2021-09-01
10 202121039639-FORM-26 [21-10-2021(online)].pdf 2021-10-21
11 Abstract1.jpg 2021-11-23
12 202121039639-FER.pdf 2023-10-27
13 202121039639-OTHERS [04-03-2024(online)].pdf 2024-03-04
14 202121039639-FER_SER_REPLY [04-03-2024(online)].pdf 2024-03-04
15 202121039639-DRAWING [04-03-2024(online)].pdf 2024-03-04
16 202121039639-CLAIMS [04-03-2024(online)].pdf 2024-03-04
17 202121039639-ABSTRACT [04-03-2024(online)].pdf 2024-03-04

Search Strategy

1 SearchHistory_202121039639E_21-09-2023.pdf