ABSTRACT
A SYSTEM AND METHOD FOR PROCESSING A QUERY
The present invention relates to a system (108) and a method (400) for processing a query. The system (108) is configured to perform orchestration and provisioning of an order/query received for execution, along with inventory management of the order/query. The system (108) includes a number of modules which operate as a single unit. The number of modules includes an orchestration manager (220), a provisioning manager (222), an AI/ML module (226) and an inventory module (224).
Ref. Fig. 2
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. TITLE OF THE INVENTION
A SYSTEM AND METHOD FOR PROCESSING A QUERY
2. APPLICANT(S)
NAME: JIO PLATFORMS LIMITED
NATIONALITY: INDIAN
ADDRESS: OFFICE-101, SAFFRON, NR. CENTRE POINT, PANCHWATI 5 RASTA, AMBAWADI, AHMEDABAD 380006, GUJARAT, INDIA
3. PREAMBLE TO THE DESCRIPTION
THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE NATURE OF THIS INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.
FIELD OF THE INVENTION
[0001] The present invention relates to the field of wireless communication systems, and more particularly to a system and a method for processing a query.
BACKGROUND OF THE INVENTION
[0002] Traditionally, the capabilities of an orchestration manager, a provisioning manager, and an inventory manager in a management system are limited in number. Separate systems are required for an orchestration process, a provisioning process and an inventory management process. The orchestration manager performs the orchestration process, the provisioning manager performs the provisioning process, and the inventory manager performs the inventory management process. Due to these separate systems for each of the processes, processing a query is time-consuming, which affects the throughput associated with these systems while processing the query. The separate systems also require high maintenance to provide a high throughput. Further, such systems may not be able to handle a large load.
[0003] In order to handle a large load and to overcome the above-mentioned drawbacks, an improved method and system are required for processing a query in a network.
SUMMARY OF THE INVENTION
[0004] One or more embodiments of the present disclosure provide a method and system for processing a query in a network.
[0005] In one aspect of the present invention, a system for processing a query in a network is disclosed. The system includes a transceiver configured to receive a query from a North Bound Interface (NBI). The system further includes a provisioning manager configured to perform a workflow execution pertaining to the query received from the NBI by changing a state of the query from a first state to a second state. The system further includes an orchestration manager configured to perform an orchestration on the second state of the query. The orchestration manager is further configured to maintain the orchestrated data pertaining to the query in a database. The system further includes an inventory manager configured to dynamically model the data pertaining to the query into multiple workflows based on a user-defined hierarchy.
[0006] In one embodiment, the system further includes a user interface module configured to allow a user to access historical queries stored in a database along with the workflow created thereof.
[0007] In another embodiment, the user interface module is further configured to allow the user to create a plurality of workflows.
[0008] In yet another embodiment, the user interface module is further configured to allow the user to configure a plurality of modules including at least one of an orchestration manager, a provisioning manager, an inventory module and an Artificial Intelligence/Machine Learning (AI/ML) module.
[0009] In yet another embodiment, the user interface module is further configured to fetch the query from a database and a cache data store and display the query on a display based on a user request.
[0010] In yet another embodiment, the system further includes an integration module configured to integrate the plurality of modules to provide a single Fulfillment Management System (FMS). Each module is configured to perform at least one specific task to process the query.
[0011] In another aspect of the present invention, a method of processing a query in a network is disclosed. The method includes the steps of identifying a request transmitted from a client to a server via a network protocol connection with a stream identifier. The method further includes the steps of receiving a query from a North Bound Interface (NBI). The method further includes the steps of performing a workflow execution pertaining to the query received from the NBI by changing a state of the query from a first state to a second state. The method further includes the steps of performing an orchestration on the second state of the query. The method further includes the steps of maintaining the orchestrated data pertaining to the query in a database.
[0012] In one embodiment, the processing of the query includes at least one of, insertion, deletion, updation and omission of the query.
[0013] In another embodiment, the query includes at least one of, a command and an order.
[0014] In yet another embodiment, the step of performing a workflow execution pertaining to the query received from the NBI by changing a state of the query from a first state to a second state, includes the steps of defining a relationship between a response received from a South Bound Interface (SBI) with the first state of the query and mapping the defined relationship between the response and the first state to the second state, thereby changing the state of the query from the first state to the second state.
[0015] In yet another embodiment, the orchestration performed on the second state of the query includes at least one of, arranging, scheduling and integrating tasks to optimize the workflow related to the query.
[0016] Other features and aspects of this invention will be apparent from the following description and the accompanying drawings. The features and advantages described in this summary and in the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the relevant art, in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the inventive subject matter, resort to the claims being necessary to determine such inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
[0018] FIG. 1 is an exemplary block diagram of an environment for processing a query in a network, according to one or more embodiments of the present invention;
[0019] FIG. 2 is an exemplary block diagram of the system for processing a query in a network, according to one or more embodiments of the present invention;
[0020] FIG. 3 is an exemplary flow diagram of the system of FIG. 2, according to one or more embodiments of the present invention; and
[0021] FIG. 4 is a flow diagram of a method of processing a query in a network, according to one or more embodiments of the present invention.
[0022] The foregoing shall be more apparent from the following detailed description of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[0023] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
[0024] Various modifications to the embodiments will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure, including the definitions listed herein below, is not intended to be limited to the embodiments illustrated, but is to be accorded the widest scope consistent with the principles and features described herein.
[0025] A person of ordinary skill in the art will readily ascertain that the illustrated steps detailed in the figures and here below are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0026] FIG. 1 illustrates an exemplary block diagram of an environment 100 for processing a query in a network 106, according to one or more embodiments of the present disclosure. In this regard, the environment 100 includes a User Equipment (UE) 102, a server 104, the network 106 and a system 108 communicably coupled to each other for processing a query.
[0027] As per the illustrated embodiment, and for the purpose of description and illustration, the UE 102 includes, but is not limited to, a first UE 102a, a second UE 102b, and a third UE 102c, which should nowhere be construed as limiting the scope of the present disclosure. Accordingly, in alternate embodiments, the UE 102 may include a plurality of UEs as per the requirement. For ease of reference, each of the first UE 102a, the second UE 102b, and the third UE 102c will hereinafter be collectively and individually referred to as the “User Equipment (UE) 102”.
[0028] In an embodiment, the UE 102 includes, but is not limited to, any electrical, electronic or electro-mechanical equipment, or a combination of one or more of such devices, such as virtual reality (VR) devices, augmented reality (AR) devices, a laptop, a general-purpose computer, a desktop, a personal digital assistant, a tablet computer, a mainframe computer, or any other computing device.
[0029] The environment 100 includes the server 104 accessible via the network 106. The server 104 may include, by way of example but not limitation, one or more of a standalone server, a server blade, a server rack, a bank of servers, a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, one or more processors executing code to function as a server, one or more machines performing server-side functionality as described herein, at least a portion of any of the above, or some combination thereof. In an embodiment, the server 104 may be associated with an entity that may include, but is not limited to, a vendor, a network operator, a company, an organization, a university, a lab facility, a business enterprise, a defence facility, or any other facility that provides a service.
[0030] The network 106 includes, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, or some combination thereof. The network 106 may include, but is not limited to, a Third Generation (3G), a Fourth Generation (4G), a Fifth Generation (5G), a Sixth Generation (6G), a New Radio (NR), a Narrow Band Internet of Things (NB-IoT), an Open Radio Access Network (O-RAN), and the like.
[0031] The network 106 may also include, by way of example but not limitation, at least a portion of one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, process, or a combination thereof, one or more messages, packets, signals, waves, voltage or current levels, or some combination thereof. The network 106 may further include a Voice over Internet Protocol (VoIP) network.
[0032] The environment 100 further includes the system 108 communicably coupled to the server 104 and the UE 102 via the network 106. The system 108 is configured to process the query in the network 106. As per one or more embodiments, the system 108 is adapted to be embedded within the server 104 or deployed as an individual entity. However, for the purpose of description, the system 108 is described as an integral part of the server 104, without deviating from the scope of the present disclosure.
[0033] Operational and construction features of the system 108 will be explained in detail with respect to the following figures.
[0034] FIG. 2 is an exemplary block diagram of the system 108 for processing the query in the network 106 (as shown in FIG. 1), according to one or more embodiments of the present invention.
[0035] As per the illustrated embodiment, the system 108 includes one or more processors 202, a memory 204, a user interface module 206, an integration module 208, a transceiver 210, a database 212, a cache data store 214 and a display 216. For the purpose of description and explanation, the description is set out with respect to one processor 202, which should nowhere be construed as limiting the scope of the present disclosure. In alternate embodiments, the system 108 may include more than one processor 202 as per the requirement of the network 106. The one or more processors 202, hereinafter referred to as the processor 202, may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions.
[0036] As per the illustrated embodiment, the processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 204. The memory 204 is configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory 204 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0037] In an embodiment, the user interface module 206 includes a variety of interfaces, for example, a graphical user interface, a web user interface, a Command Line Interface (CLI), and the like. The user interface module 206 facilitates communication of the system 108. In one embodiment, the user interface module 206 provides a communication pathway for one or more components of the system 108. Examples of such components include, but are not limited to, the UE 102 (as shown in FIG. 1) and the database 212.
[0038] In an alternate embodiment, the user interface module 206 is configured to allow a user to create a plurality of workflows. The user interface module 206 is further configured to allow the user to access historical queries stored in the database 212 along with the workflows created thereof.
[0039] In an embodiment, the integration module 208 is one of, but not limited to, an Application Programming Interface (API), a North Bound Interface (NBI), and a South Bound Interface (SBI). The integration module 208 is configured to integrate a plurality of modules included in the processor 202. Each module included in the processor 202 is configured to perform at least one specific task to process the query.
[0040] In an embodiment, the transceiver 210 is configured to receive a query from a North Bound Interface (NBI). The transceiver 210 is integrated within the system 108 or may be connected externally.
[0041] The database 212 and the cache data store 214 are each one of, but not limited to, a centralized database, a cloud-based database, a commercial database, an open-source database, a distributed database, an end-user database, a graphical database, a No-Structured Query Language (NoSQL) database, an object-oriented database, a personal database, an in-memory database, a document-based database, a time series database, a wide column database, a key value database, a search database, a cache database, and so forth. The foregoing examples of database types are non-limiting and may not be mutually exclusive; e.g., a database can be both commercial and cloud-based, or both relational and open-source.
[0042] In an embodiment, the display 216 is configured to display the query which is fetched from the database 212 based on a user’s request. The display 216 is implemented using at least one of, but not limited to, LCD display technology, OLED display technology, and/or other types of conventional display technology. The display 216 is integrated within the system 108 or may be a separate entity from the system 108.
[0043] As per various embodiments depicted, the system 108 that processes the query in the network 106 is a Fulfillment Management System (FMS). The FMS is configured with the capability to perform an orchestration process, a provisioning process, and an inventory management process. In order for the system 108 to process the query in the network 106, the processor 202 includes one or more modules. In one embodiment, the one or more modules include, but are not limited to, an orchestration manager 220, a provisioning manager 222, an inventory module 224 and an AI/ML module 226 communicably coupled to each other to process the query in the network 106.
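By way of a non-limiting illustration only, the following sketch in Python shows one possible way the plurality of modules described above could be composed into a single-architecture FMS. The class names, method names and data values below are assumptions of this illustration and are not prescribed by the present disclosure.

```python
# Illustrative sketch only: a single FMS composed of the four modules.
# All names and behaviors here are assumptions, not the claimed implementation.

class OrchestrationManager:          # corresponds to 220
    def orchestrate(self, query_state):
        # Arrange, schedule and integrate tasks for the given query state.
        return {"state": query_state, "tasks": ["arrange", "schedule", "integrate"]}

class ProvisioningManager:           # corresponds to 222
    def execute_workflow(self, query):
        # Move the query from its first state to its second state.
        query["state"] = "second"
        return query

class InventoryModule:               # corresponds to 224
    def model(self, query_data, hierarchy):
        # Dynamically model query data into workflows per a user-defined hierarchy.
        return {level: query_data for level in hierarchy}

class AIMLModule:                    # corresponds to 226
    pass  # Placeholder; may be integrated internally or connected externally.

class FulfillmentManagementSystem:   # corresponds to 108: one architecture, one system
    def __init__(self):
        self.orchestration = OrchestrationManager()
        self.provisioning = ProvisioningManager()
        self.inventory = InventoryModule()
        self.aiml = AIMLModule()

    def process_query(self, query, hierarchy):
        query = self.provisioning.execute_workflow(query)          # first -> second state
        orchestrated = self.orchestration.orchestrate(query["state"])
        workflows = self.inventory.model(query, hierarchy)
        return orchestrated, workflows

fms = FulfillmentManagementSystem()
print(fms.process_query({"id": "order-1", "state": "first"}, ["region", "site"]))
```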
[0044] In an embodiment, in the system 108, which is the FMS, the orchestration manager 220, the provisioning manager 222, the inventory module 224, and the AI/ML module 226 are all combined into a single architecture in order to operate as a single system. In an embodiment of the present subject matter, using a Graphical User Interface (GUI) associated with the FMS, any workflow associated with the query is orchestrated. In an embodiment, the FMS is configurable by a user using the GUI, in which the user can configure at least one of an orchestration process and a provisioning process, and thereby manage an inventory.
[0045] The orchestration manager 220, the provisioning manager 222, the inventory module 224, and the AI/ML module 226, in an exemplary embodiment, are implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor 202. In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor 202 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor may comprise a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the memory 204 may store instructions that, when executed by the processing resource, implement the processor 202. In such examples, the system 108 may comprise the memory 204 storing the instructions and the processing resource to execute the instructions, or the memory 204 may be separate but accessible to the system 108 and the processing resource. In other examples, the processor 202 may be implemented by electronic circuitry.
[0046] In one embodiment, the orchestration manager 220 of the processor 202 is configured to receive a query from a North Bound Interface (NBI). The query includes an order, a command or the like. In an alternate embodiment, the processor 202 receives an order from the NBI. Examples of the order include, but are not limited to, an insertion, a deletion, an update, and an omission. The orchestration manager 220 of the processor 202 is further configured to perform an orchestration of the query. The orchestration of the query performed by the orchestration manager 220 includes at least one of arranging, scheduling and integrating multiple tasks to optimize the workflow related to the query. For example, the orchestration manager 220 arranges and schedules multiple states of the query for executing the workflow, such that a first state of the query is executed first and thereafter a second state of the query is executed.
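As a purely illustrative sketch of the arranging and scheduling described above (the state names, helper function and printed output below are assumptions, not part of the present disclosure):

```python
from collections import deque

def orchestrate_states(query_id, states):
    """Arrange and schedule the states of a query so they execute in order.

    `states` is an ordered list such as ["first", "second"]; each state is
    executed only after the preceding state has completed (illustrative only).
    """
    schedule = deque(states)            # arrange the states in execution order
    completed = []
    while schedule:
        state = schedule.popleft()      # take the next scheduled state
        # ... the tasks making up this state would be integrated and run here ...
        completed.append(state)
    return {"query": query_id, "executed": completed}

# Example: the first state is executed first, thereafter the second state.
print(orchestrate_states("order-1", ["first", "second"]))
```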
[0047] In one embodiment, the provisioning manager 222 of the processor 202 is configured to perform a workflow execution and provisioning related to the query. A number of workflows are executed for provisioning by the provisioning manager 222.
[0048] In one embodiment, the inventory module 224 of the processor 202 is configured to maintain an inventory associated with the query. Further, using the inventory module 224, any data pertaining to the query is dynamically modeled into multiple workflows based on a user-defined hierarchy. In an exemplary embodiment, the data pertaining to the query may be Azure data. In an alternate embodiment, the inventory module 224 is configured to maintain an inventory of details associated with the execution of the order.
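The following is a minimal sketch of how data pertaining to a query might be dynamically modeled along a user-defined hierarchy; the hierarchy levels, record fields and sample values are assumed for illustration only and are not mandated by the present disclosure.

```python
def model_inventory(query_data, hierarchy):
    """Group query data into nested groupings following a user-defined hierarchy.

    `hierarchy` is an ordered list of keys, e.g. ["region", "site"]; each level
    of nesting corresponds to one level of the hierarchy (illustrative only).
    """
    if not hierarchy:
        return query_data                       # leaf level: the records themselves
    key, rest = hierarchy[0], hierarchy[1:]
    grouped = {}
    for record in query_data:
        grouped.setdefault(record.get(key, "unknown"), []).append(record)
    return {value: model_inventory(records, rest) for value, records in grouped.items()}

# Example with assumed fields and values:
records = [
    {"region": "west", "site": "A", "order": 1},
    {"region": "west", "site": "B", "order": 2},
]
print(model_inventory(records, ["region", "site"]))
```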
[0049] In one embodiment, the AI/ML module 226 is integrated within the system 108 or may be connected externally.
[0050] FIG. 3 is an exemplary flow diagram 300 of the system of FIG. 2, according to one or more embodiments of the present invention.
[0051] For the purpose of description of the exemplary embodiment as illustrated in FIG. 3, the User Equipment (UE) 102 uses a network protocol connection to communicate with the system 108. Accordingly, the UE 102 is configured to transmit a query to the system 108 via one or more interfaces.
[0052] In an embodiment, the network protocol connection is the establishment and management of communication between the UE 102 and the system 108 over the network 106 (as shown in FIG. 1) using a specific protocol or set of protocols. The network protocol connection includes, but is not limited to, Transmission Control Protocol (TCP), User Datagram Protocol (UDP), File Transfer Protocol (FTP), Hypertext Transfer Protocol (HTTP), Simple Network Management Protocol (SNMP), Internet Control Message Protocol (ICMP), Hypertext Transfer Protocol Secure (HTTPS) and Terminal Network (TELNET).
[0053] In an alternate embodiment, the one or more interfaces include at least one of, but are not limited to, an Application Programming Interface (API), a North Bound Interface (NBI) and a South Bound Interface (SBI), which integrate the UE 102 and the system 108.
[0054] In an embodiment, the UE 102 includes one or more primary processors 302, a memory 304, and a user interface module 306. In alternate embodiments, the UE 102 may include more than one primary processor 302 as per the requirement of the network 106. The one or more primary processors 302 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions.
[0055] In an embodiment, the one or more primary processors 302 are configured to fetch and execute computer-readable instructions stored in the memory 304. The memory 304 may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory 304 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0056] In an embodiment, the user interface module 306 of the UE 102 includes a variety of interfaces, for example, a graphical user interface, a web user interface, a Command Line Interface (CLI), and the like. The user interface module 306 is configured to allow a user to create a query/command/order. The query created by the user is transmitted to the processor 202 of the system 108 via an interface, for example, the NBI. Further, the user interface module 306 allows the user to create a plurality of workflows. The user interface module 306 is configured to allow the user to access historical queries stored in the database 212 along with the workflows created thereof.
[0057] In an alternate embodiment, the system 108 allows the user to create a query/command/order on the user interface module 206 of the system 108.
[0058] In an embodiment, the system 108 receives the query transmitted by the one or more primary processors 302 of the UE 102. The system 108 advantageously has the capability of utilizing the orchestration manager 220, the provisioning manager 222, the inventory module 224, and the AI/ML module 226 to function as a single system. In particular, the provisioning manager 222 of the system 108 performs a workflow execution pertaining to the query received from the UE 102 via the NBI by changing a state of the query from a first state to a second state. The orchestration manager 220 of the system 108 performs an orchestration on the second state of the query and maintains the orchestrated data pertaining to the query in the database 212. Due to the combined orchestration process, provisioning process, and inventory management in the system 108, there is no need for high maintenance of the system 108, as there are no separate systems for the orchestration process, the provisioning process, and the inventory management, respectively. Advantageously, due to the combined orchestration manager 220, provisioning manager 222, and inventory module 224, the throughput of the system 108 is increased.
[0059] The user interface module 306 of the UE 102 is further configured to allow the user to configure a plurality of modules including at least one of, the orchestration manager 220, the provisioning manager 222, the inventory module 224 and the AI/ML module 226. Based on the user’s request, the user interface module 306 fetches the details of the query from the database 212 and the cache data store 214 and displays the details of the query on the display of the UE 102.
[0060] In an alternate embodiment, the flow diagram 300 depicts operations and management of the system 108. The user interface module 206 associated with the system 108 is utilized by the user to create the query and the number of workflows. The number of workflows are executed for provisioning by the provisioning manager 222, as referred to in FIG. 2. Further, upon receiving the query for execution, the system 108 executes the query and stores details associated with the query in the database 212 and the cache data store 214. Further, based on the user request, the user interface module 206 of the system 108 fetches the details from the database 212 and the cache data store 214 and displays the details of the query on the display 216 of the system 108.
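Purely as an illustrative sketch of storing and then serving query details from the cache data store 214 and the database 212 (the lookup order, function names and sample data are assumptions; the present disclosure only states that details are stored in, and fetched from, both and displayed):

```python
# Illustrative sketch only; names and lookup order are assumptions.
cache_data_store = {}   # stand-in for the cache data store 214
database = {}           # stand-in for the database 212

def store_query_details(query_id, details):
    # Upon execution, details associated with the query are stored in both stores.
    database[query_id] = details
    cache_data_store[query_id] = details

def fetch_query_details(query_id):
    # Fetch from the cache first, falling back to the database (assumed order).
    return cache_data_store.get(query_id) or database.get(query_id)

def display_query_details(query_id):
    details = fetch_query_details(query_id)
    print(f"query {query_id}: {details}")      # stand-in for the display 216

store_query_details("order-42", {"status": "provisioned"})
display_query_details("order-42")
```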
[0061] FIG. 4 is a flow diagram of a method 400 for processing the query in the network 106, according to one or more embodiments of the present invention. For the purpose of description, the method is described with the embodiments as illustrated in FIG. 2 and should nowhere be construed as limiting the scope of the present disclosure.
[0062] At step 402, the method 400 includes the step of receiving a query from a North Bound Interface (NBI). In one embodiment, the transceiver 210 is configured to receive a query from a North Bound Interface (NBI). In an embodiment, the query is at least one of, a command and an order.
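For illustration only, a query/order received at the NBI in step 402 might be represented as below; the field names, default values and receive function are assumptions of this sketch and are not mandated by the method 400.

```python
# Hypothetical representation of a query/order received from the NBI (step 402).
# Field names and defaults are illustrative assumptions.

def receive_from_nbi(raw_payload: dict) -> dict:
    """Stand-in for the transceiver 210 receiving a query from the NBI."""
    return {
        "id": raw_payload.get("id"),
        "kind": raw_payload.get("kind", "order"),     # a command or an order
        "operation": raw_payload.get("operation"),    # e.g. insertion / deletion / updation
        "state": "first",                             # a new query starts in the first state
    }

print(receive_from_nbi({"id": "order-42", "operation": "insertion"}))
```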
[0063] At step 404, the method 400 includes the step of performing a workflow execution pertaining to the query received from the NBI by changing a state of the query from a first state to a second state. For example, the workflow has multiple states, such as the first state and the second state, and the state is changed from the first state to the second state subsequent to the completion of the execution of the first state. By executing these states in a predefined order, the workflow pertaining to the query is executed. In one embodiment, the provisioning manager 222 of the processor 202 is configured to perform the workflow execution pertaining to the query received from the NBI by defining a relationship between a response received from a South Bound Interface (SBI) and the first state of the query, and mapping the defined relationship between the response and the first state to the second state, thereby changing the state of the query from the first state to the second state and firing a particular Application Programming Interface (API) of a particular network node. For example, the user interface module 206 of the system 108 or the user interface module 306 of the UE 102 (as shown in FIG. 3) facilitates the user in defining a relationship between the first state and the second state utilizing a response code of the SBI received after the execution of at least one of the first state and the second state. The defined relationship is mapped between the first state and the second state such that, subsequent to the completion of the execution of the first state, the execution of the second state is initiated based on the defined relationship mapped between the first state and the second state.
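The following is a minimal, hypothetical sketch of the mapping described above: a user-defined relationship (transition table) from the first state and an SBI response code to the second state, after which an API call toward a network node would be fired. The table contents, response codes and function names are assumptions for illustration only.

```python
# Hypothetical sketch: user-defined relationship between states and SBI response codes.
# The transition table, codes and stand-in API call below are illustrative assumptions.

TRANSITIONS = {
    # (current state, SBI response code) -> next state
    ("first", 200): "second",
    ("first", 500): "first",     # remain in the first state on failure
}

def fire_api(state):
    # Stand-in for firing a particular API of a particular network node.
    print(f"firing API for state: {state}")

def advance_state(current_state, sbi_response_code):
    """Map the (state, SBI response) relationship to the next state of the query."""
    next_state = TRANSITIONS.get((current_state, sbi_response_code), current_state)
    if next_state != current_state:
        fire_api(next_state)     # execution of the second state is initiated
    return next_state

# Example: the first state completes with SBI response 200, so the query
# moves from the first state to the second state.
print(advance_state("first", 200))
```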
[0064] At step 406, the method 400 includes the step of performing an orchestration on the second state of the query. In one embodiment, the orchestration manager 220 of the processor 202 is configured to perform the orchestration on the second state of the query. The orchestration performed on the second state of the query includes at least one of arranging, scheduling and integrating tasks to optimize the workflow related to the query.
[0065] At step 408, the method 400 includes the step of maintaining the orchestrated data pertaining to the query in the database 212. In one embodiment, the orchestration manager 220 of the processor 202 is configured to maintain the orchestrated data pertaining to the query in the database 212.
[0066] The present invention further discloses a non-transitory computer-readable medium having stored thereon computer-readable instructions. The computer-readable instructions are executed by the processor 202. The processor 202 is configured to receive a query from a North Bound Interface (NBI). The processor 202 is further configured to perform a workflow execution pertaining to the query received from the NBI by changing a state of the query from a first state to a second state. The processor 202 is further configured to perform an orchestration on the second state of the query. The processor 202 is further configured to maintain the orchestrated data pertaining to the query in a database 212.
[0067] A person of ordinary skill in the art will readily ascertain that the illustrated embodiments and steps in description and drawings (FIG.1-4) are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0068] The present disclosure provides a technical advancement in the form of the Fulfillment Management System (FMS), which has the capabilities of an orchestration manager, a provisioning manager, an AI/ML module, and an inventory manager in the same architecture and operates as a single system. Further, the throughput of the FMS is increased, and because the FMS is based on a single application, it requires lower maintenance.
[0069] The present invention offers multiple advantages over the prior art, and the above-listed are a few examples to emphasize some of the advantageous features. The listed advantages are to be read in a non-limiting manner.
REFERENCE NUMERALS
[0070] Environment- 100
[0071] User Equipment (UE)- 102
[0072] Server- 104
[0073] Network- 106
[0074] System -108
[0075] Processor- 202
[0076] Memory- 204
[0077] User Interface module- 206
[0078] Integration module- 208
[0079] Transceiver- 210
[0080] Database- 212
[0081] Cache data store- 214
[0082] Display- 216
[0083] Orchestration manager- 220
[0084] Provisioning manager - 222
[0085] Inventory module- 224
[0086] AI/ML module- 226
[0087] Primary processor- 302
[0088] Memory- 304
[0089] User Interface Module- 306
CLAIMS
We Claim:
1. A method (400) for processing a query, the method (400) comprises the steps of:
receiving (402), by one or more processors (202), a query from a North Bound Interface (NBI);
performing (404), by the one or more processors (202), a workflow execution pertaining to the query received from the NBI by changing a state of the query from a first state to a second state;
performing (406), by the one or more processors (202), an orchestration on the second state of the query; and
maintaining (408), by the one or more processors (202), the orchestrated data pertaining to the query in a database (212).
2. The method (400) as claimed in claim 1, wherein processing of the query includes at least one of, insertion, deletion, updation and omission of the query.
3. The method (400) as claimed in claim 1, wherein the query includes at least one of, a command and an order.
4. The method (400) as claimed in claim 1, wherein the step of, performing, a workflow execution pertaining to the query received from the NBI by changing a state of the query from a first state to a second state, includes the steps of:
defining, by the one or more processors (202), a relationship between a response received from a South Bound Interface (SBI) with the first state of the query;
mapping, by the one or more processors (202), the defined relationship between the response and the first state to the second state, thereby changing the state of the query from the first state to the second state.
5. The method (400) as claimed in claim 1, wherein the orchestration performed on the second state of the query includes at least one of, arranging, scheduling and integrating tasks to optimize the workflow related to the query.
6. A system (108) for processing a query, the system (108) comprising:
a transceiver (210) configured to, receive, a query from a North Bound Interface (NBI);
a provisioning manager (222) configured to, perform, a workflow execution pertaining to the query received from the NBI by changing a state of the query from a first state to a second state;
an orchestration manager (220) configured to:
perform, an orchestration on the second state of the query; and
maintain, the orchestrated data pertaining to the query in a database (212).
7. The system (108) as claimed in claim 6, wherein the system (108) further comprises a user interface module (206) configured to, allow, a user to access historical queries stored in the database (212) along with the workflow created thereof.
8. The system (108) as claimed in claim 6, wherein the user interface module (206) is further configured to allow the user to create a plurality of workflows.
9. The system (108) as claimed in claim 6, wherein the user interface module (206) is further configured to, allow the user to configure a plurality of modules including at least one of, the orchestration manager (220), the provisioning manager (222), an inventory module (224) and an Artificial Intelligence/Machine Learning (AI/ML) module (226).
10. The system (108) as claimed in claim 9, wherein an integration module (208) of the system (108) is further configured to, integrate the plurality of modules to provide a single Fulfillment Management System (FMS), wherein each module is configured to perform at least one specific task to process the query.
11. The system (108) as claimed in claim 6, wherein the system (108) is further configured to, fetch the query from the database (212) and a cache data store (214) and display the query on a display (216) based on a user request.
12. The system (108) as claimed in claim 6, wherein the system (108) is further configured to, dynamically model the data pertaining to the query into multiple workflows based on a user defined hierarchy.
13. A User Equipment (UE) (102), comprising:
one or more primary processors (302) communicatively coupled to one or more processors (202), the one or more primary processors (302) coupled with a memory (304), wherein said memory (304) stores instructions which when executed by the one or more primary processors (302) causes the UE (102) to:
transmit, created one or more workflows to the one or more processors (202) in order to process a query; and
wherein the one or more processors (202) is configured to perform the steps as claimed in claim 1.