
Method And System For Offset Provisioning Of Records

Abstract: The present invention relates to a system (108) and a method (400) for offset provisioning of records in a network (106). The method (400) includes the steps of: fetching relevant data pertaining to the records from a database (216) based on one or more report templates and a first offset number provided by a user during a live report execution session; transmitting the one or more report templates to a trained model (218) for analysis after the completion of the live report execution session; determining, utilizing the trained model (218), an updated second optimal offset number for each of the one or more report templates based on the analysis; and thereafter replacing the first offset number provided by the user with the updated second optimal offset number. By incorporating offset provisioning, the invention offers an efficient and reliable solution for generating accurate and timely live reports in real-time applications. Ref. Fig. 2


Patent Information

Application #
Filing Date
19 July 2023
Publication Number
04/2025
Publication Type
INA
Invention Field
ELECTRONICS
Status
Email
Parent Application

Applicants

JIO PLATFORMS LIMITED
OFFICE-101, SAFFRON, NR. CENTRE POINT, PANCHWATI 5 RASTA, AMBAWADI, AHMEDABAD - 380006, GUJARAT, INDIA

Inventors

1. Aayush Bhatnagar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
2. Ankit Murarka
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
3. Jugal Kishore Kolariya
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
4. Gaurav Kumar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
5. Kishan Sahu
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
6. Rahul Verma
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
7. Sunil Meena
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
8. Gourav Gurbani
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
9. Sanjana Chaudhary
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
10. Chandra Kumar Ganveer
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
11. Supriya De
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
12. Kumar Debashish
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
13. Tilala Mehul
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
14. Gaurav Saxena
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
15. Mohit Bhanwria
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
16. Durgesh Kumar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
17. Yogesh Kumar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
18. Kunal Telgote
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
19. Manasvi Rajani
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
20. Kalikivayi Srinath
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India
21. Vitap Pandey
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad, Gujarat - 380006, India

Specification

DESC:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003

COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. TITLE OF THE INVENTION
METHOD AND SYSTEM FOR OFFSET PROVISIONING OF RECORDS
2. APPLICANT(S)
NAME NATIONALITY ADDRESS
JIO PLATFORMS LIMITED INDIAN OFFICE-101, SAFFRON, NR. CENTRE POINT, PANCHWATI 5 RASTA, AMBAWADI, AHMEDABAD 380006, GUJARAT, INDIA
3. PREAMBLE TO THE DESCRIPTION

THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE NATURE OF THIS INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.

FIELD OF THE INVENTION
[0001] The present invention relates to the field of wireless communication systems, and more particularly to a method and system for offset provisioning of records.
BACKGROUND OF THE INVENTION
[0002] In modern applications, large volumes of data are processed in real time and utilized for generating reports and dashboards. Generating accurate and timely reports from such data can be challenging, especially during peak times or seasons. The influx of data during these periods can cause delayed data generation or late data transfer, leading to potential disruptions in live report generation and inaccurate insights into unexpected events.
[0003] Conventionally, applications have attempted to address these issues either by increasing the allocated resources for computation or by relying on fixed report, dashboard, execution or similar data computation templates. The former approach involves using additional reserve resources as the need arises, which can be costly and inefficient. The latter approach utilizes predefined report templates, which may not account for delayed data generation or late data transfer, resulting in miscalculated performance metrics and potentially significant data loss.
[0004] Therefore, there is a need for an improved solution that ensures uninterrupted live report generation, optimizes resource utilization, and mitigates data loss in the face of unexpected events.
SUMMARY OF THE INVENTION
[0005] One or more embodiments of the present disclosure provide a method and system for offset provisioning of records.
[0006] In one aspect of the present invention, a method for offset provisioning of records in a network is disclosed. The method includes the step of fetching relevant data pertaining to the records from a database based on one or more report templates and a first offset number provided by a user during a live report execution session. The method further includes the step of transmitting the one or more report templates to a trained model for analysis after the completion of the live report execution session. The method further includes the step of determining, utilizing the trained model, an updated second optimal offset number for each of the one or more report templates based on the analysis. The method further includes the step of replacing the first offset number provided by the user with the updated second optimal offset number.
[0007] In one embodiment, the records include data of at least one of, reports, dashboards, live report execution sessions and report templates.
[0008] In another embodiment, by utilizing the offset numbers, including the first offset number and the updated second optimal offset number, the method ensures that a specified number of previous time intervals is excluded from report generation during the live report execution session.
[0009] In yet another embodiment, the trained model is an Artificial Intelligence/Machine Learning (AI/ML) model.
[0010] In yet another embodiment, the data pertaining to one or more report templates and updated second optimal offset number are stored in the database.
[0011] In yet another embodiment, the step of determining, utilizing the trained model, an updated second optimal offset number for each of the one or more report templates based on the analysis includes the steps of: performing, utilizing the trained model, a trend analysis and a pattern analysis pertaining to the one or more report templates; identifying a delayed time for each report of the one or more report templates based on the trend analysis and the pattern analysis; and in response to identifying the delayed time for each report of the one or more report templates, determining, utilizing the trained model, the relevant updated second optimal offset number for each report of the one or more report templates based on the trend analysis and the pattern analysis.
[0012] In yet another embodiment, the updated second optimal offset number is utilized in one of a live report execution and for an upcoming report execution.
[0013] In yet another embodiment, the fetched relevant data is utilized by the one or more processors (202) for generating a live report.
[0014] In another aspect of the present invention, a system for offset provisioning of records in a network is disclosed. The system includes a computation engine configured to fetch relevant data pertaining to the records from a database based on one or more report templates and a first offset number provided by a user during a live report execution session. The system further includes a transceiver configured to transmit the one or more report templates to a trained model for analysis after the completion of the live report execution session. The system further includes a determination unit configured to determine utilizing the trained model, an updated second optimal offset number for each of the one or more report templates based on the analysis. The system further includes a replacing unit configured to replace the first offset number provided by the user with the updated second optimal offset number.
[0015] Other features and aspects of this invention will be apparent from the following description and the accompanying drawings. The features and advantages described in this summary and in the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the relevant art, in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the inventive subject matter, resort to the claims being necessary to determine such inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
[0017] FIG. 1 is an exemplary block diagram of an environment for offset provisioning of records in a network, according to one or more embodiments of the present invention;
[0018] FIG. 2 is an exemplary block diagram of the system for offset provisioning of records in the network, according to one or more embodiments of the present invention;
[0019] FIG. 3 is an exemplary flow diagram of the system of FIG. 2, according to one or more embodiments of the present invention;
[0020] FIG. 4 is a flow diagram of a method for offset provisioning of records in the network, according to one or more embodiments of the present invention; and
[0021] FIG. 5 is an exemplary signal flow diagram for offset provisioning of records in the network, according to one or more embodiments of the present invention.
[0022] The foregoing shall be more apparent from the following detailed description of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[0023] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
[0024] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure including the definitions listed here below are not intended to be limited to the embodiments illustrated but is to be accorded the widest scope consistent with the principles and features described herein.
[0025] A person of ordinary skill in the art will readily ascertain that the illustrated steps detailed in the figures and here below are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0026] The present invention provides offset provisioning for live reports and dashboards in real-time applications, addressing the challenges of delayed data generation and late data transfer during peak times or seasons. By allowing users to specify an offset number in one or more report templates, the invention ensures uninterrupted live report generation, optimizes resource utilization, and mitigates data loss. Trained models are incorporated to predict unexpected events and determine the extent of these events, enabling flexible handling of such occurrences. The invention offers advantages such as smooth operation during peak events, optimized resource allocation, and accurate calculation of performance metrics.
[0027] Referring to FIG. 1, FIG. 1 illustrates an exemplary block diagram of an environment 100 for offset provisioning of records in a network 106, according to one or more embodiments of the present invention. The environment 100 includes a User Equipment (UE) 102, a server 104, a network 106 and a system 108. The UE 102 aids a user to interact with the system 108 to transmit a first offset number during a live report execution session.
[0028] For the purpose of description and explanation, the description will be explained with respect to one or more user equipments (UEs) 102, or to be more specific, with respect to a first UE 102a, a second UE 102b, and a third UE 102c, and should nowhere be construed as limiting the scope of the present disclosure. Each of the at least one UE 102, namely the first UE 102a, the second UE 102b, and the third UE 102c, is configured to connect to the server 104 via the network 106.
[0029] In an embodiment, each of the first UE 102a, the second UE 102b, and the third UE 102c is one of, but not limited to, any electrical, electronic, or electro-mechanical equipment, or a combination of one or more of such devices, such as virtual reality (VR) devices, augmented reality (AR) devices, a laptop, a general-purpose computer, a desktop, a personal digital assistant, a tablet computer, a mainframe computer, or any other computing device.
[0030] The network 106 includes, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, or some combination thereof. The network 106 may include, but is not limited to, a Third Generation (3G), a Fourth Generation (4G), a Fifth Generation (5G), a Sixth Generation (6G), a New Radio (NR), a Narrow Band Internet of Things (NB-IoT), an Open Radio Access Network (O-RAN), and the like.
[0031] The network 106 may also include, by way of example but not limitation, at least a portion of one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, process, or a combination thereof, etc. one or more messages, packets, signals, waves, voltage or current levels, some combination thereof, or so forth. The network 106 may also include, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, a VOIP or some combination thereof.
[0032] The environment 100 includes the server 104 accessible via the network 106. The server 104 may include, by way of example but not limitation, one or more of a standalone server, a server blade, a server rack, a bank of servers, a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, a processor executing code to function as a server, one or more machines performing server-side functionality as described herein, at least a portion of any of the above, or some combination thereof. In an embodiment, the server 104 may be operated by an entity that may include, but is not limited to, a vendor, a network operator, a company, an organization, a university, a lab facility, a business enterprise, a defence facility, or any other facility that provides a service.
[0033] The environment 100 further includes a system 108 communicably coupled to the server 104 and each of the first UE 102a, the second UE 102b, and the third UE 102c via the network 106. The system 108 is adapted to be embedded within the server 104 or to be deployed as an individual entity. However, for the purpose of description, the system 108 is described as an integral part of the server 104, without deviating from the scope of the present disclosure.
[0034] Operational and construction features of the system 108 will be explained in detail with respect to the following figures.
[0035] FIG. 2 is an exemplary block diagram of the system 108 for offset provisioning of records in a network 106, according to one or more embodiments of the present invention.
[0036] As per the illustrated and preferred embodiment, the system 108 for offset provisioning of records in the network 106 includes one or more processors 202, a memory 204, a database 216, an Artificial Intelligence (AI) / Machine Learning (ML) model 218 and a load balancer 220. The one or more processors 202 include a computation engine 206, a transceiver 208, a determination unit 210, and a replacing unit 212. The one or more processors 202, hereinafter referred to as the processor 202, may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions. However, it is to be noted that the system 108 may include multiple processors as per the requirement and without deviating from the scope of the present disclosure. Among other capabilities, the processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 204.
[0037] As per the illustrated embodiment, the processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 204 as the memory 204 is communicably connected to the processor 202. The memory 204 is configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory 204 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0038] In an embodiment, the load balancer 220 of the system 108 is configured to receive incoming data from one or more users. For example, the load balancer 220 may be a combination of hardware and software which distributes incoming network traffic pertaining to incoming data from one or more users across a group of one or more processors. The incoming data from the one or more users includes at least one of, but not limited to, offset numbers. The incoming data pertaining to the offset numbers is stored in a database 216. The load balancer 220 is further configured to distribute the incoming data across the computation engine 206 of the processor 202 to handle the load efficiently.
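As an illustrative sketch only of the distribution role described above: the specification does not fix a balancing policy, so the round-robin strategy, class name, and engine labels below are assumptions, not part of the disclosure.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Minimal stand-in for the load balancer 220: distribute incoming
    offset-number submissions across a pool of computation-engine workers
    in turn, so no single engine absorbs all the load."""

    def __init__(self, engines):
        self._pool = cycle(engines)       # endless round-robin iterator
        self.assignments = []             # audit trail of (engine, submission)

    def dispatch(self, submission):
        engine = next(self._pool)
        self.assignments.append((engine, submission))
        return engine

# Three user submissions distributed over two hypothetical engines.
balancer = RoundRobinBalancer(["engine-a", "engine-b"])
targets = [balancer.dispatch({"user": u, "offset": 2}) for u in ("u1", "u2", "u3")]
```

Round-robin is chosen here purely for brevity; a production balancer would typically weigh current engine load rather than rotate blindly.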
[0039] In an embodiment, the computation engine 206 of the processor 202 is configured to fetch relevant data pertaining to the records from the database 216 based on one or more report templates and a first offset number provided by a user during a live report execution session. In an embodiment, the offset number is used to exclude a specific number of previous time intervals from the computation, ensuring that delayed data generation or late data transfer does not affect the accuracy of the live reports. In one embodiment, each of the one or more report templates is a document that outlines how the user structures the data pertaining to the live reports. The one or more report templates include specific sections where the user can add unique content and customize the live reports based on their needs. For example, whenever the user monitors the live reports on a regular basis, the user may add the first offset number in the one or more report templates. The one or more report templates are stored in the database 216. In one embodiment, the computation engine 206 utilizes the fetched relevant data for generating a live report.
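As a concrete, non-limiting illustration of how an offset number excludes recent time intervals during fetching (all function names, field names, and sample values below are assumptions for illustration):

```python
from datetime import datetime, timedelta

def window_for_offset(now, interval, offset):
    """Upper bound of data considered by the report: everything newer than
    `now - offset * interval` is excluded as potentially late or incomplete."""
    return now - offset * interval

# Hypothetical 15-minute server-utilization records in the database.
records = [
    {"ts": datetime(2024, 5, 1, 9, 0),  "cpu": 40},
    {"ts": datetime(2024, 5, 1, 9, 15), "cpu": 55},
    {"ts": datetime(2024, 5, 1, 9, 30), "cpu": 70},
    {"ts": datetime(2024, 5, 1, 9, 45), "cpu": 65},
]

now = datetime(2024, 5, 1, 10, 0)
# First offset number = 2: the two most recent 15-minute intervals are excluded.
cutoff = window_for_offset(now, timedelta(minutes=15), offset=2)
visible = [r for r in records if r["ts"] <= cutoff]
```

With an offset of 2, the 09:45 record (which may still be arriving late) is left out of the live report, while everything up to 09:30 is fetched.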
[0040] In one embodiment, the live report is any report related to the system 108 from which the user can obtain data related to the system 108. The live report may include data such as, but not limited to, Key Performance Indicators (KPIs), counters, and attributes. For example, the live report is a server utilization report, which is updated continuously based on the server utilization by the system 108.
[0041] In an embodiment, the data pertaining to the one or more report templates and the offset number are stored in the database 216. The database 216 is one of, but not limited to, a centralized database, a cloud-based database, a commercial database, an open-source database, a distributed database, an end-user database, a graphical database, a Not only Structured Query Language (NoSQL) database, an object-oriented database, a personal database, an in-memory database, a document-based database, a time series database, a wide column database, a key value database, a search database, a cache database, and so forth. The foregoing examples of database 216 types are non-limiting and are not mutually exclusive (e.g., the database can be both commercial and cloud-based, or both relational and open-source).
[0042] In an embodiment, after the completion of the live report execution session, the transceiver 208 of the processor 202 is configured to transmit the one or more report templates to an Artificial Intelligence (AI) / Machine Learning (ML) model 218 for analysis. The AI/ML model 218 includes at least one of an Artificial Intelligence (AI) model and a Machine Learning (ML) model, which utilizes multiple logics to train on the delayed time for each report in every interval. This analysis facilitates determining an updated optimal offset number for each report, ensuring that unnecessarily high offset numbers are not utilized. In one embodiment, the multiple logics include at least one of, but are not limited to, k-means clustering, hierarchical clustering, Principal Component Analysis (PCA), Independent Component Analysis (ICA), deep learning logics such as Artificial Neural Networks (ANNs), Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Long Short-Term Memory networks (LSTMs), Generative Adversarial Networks (GANs), Q-Learning, Deep Q-Networks (DQN), reinforcement learning logics, etc. The multiple logics facilitate the AI/ML model 218 in understanding the trends and patterns related to the delayed time for each report.
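The specification leaves the choice of logic open, so, purely as a minimal stand-in for the trend analysis described above, the sketch below smooths per-interval arrival delays with a moving average; the function name and the sample delay figures are illustrative assumptions, not the claimed AI/ML analysis.

```python
def moving_average(delays, window=3):
    """Smooth a series of per-interval data-arrival delays (in seconds)
    to expose the underlying trend, damping one-off spikes."""
    out = []
    for i in range(len(delays) - window + 1):
        out.append(sum(delays[i:i + window]) / window)
    return out

# Hypothetical delays observed during past report execution sessions:
# early intervals arrive quickly, later ones arrive increasingly late.
delays = [100, 120, 110, 300, 900, 950, 880]
trend = moving_average(delays)
rising = trend[-1] > trend[0]   # the smoothed series reveals a rising delay trend
```

A rising trend like this is the kind of signal the trained model could use to conclude that the current offset no longer covers the actual lateness of the data.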
[0043] In one embodiment, the AI/ML model 218 is a trained model. In particular, the trained model is a model that identifies patterns, relationships, or trends within a given dataset in order to generate meaningful insights. The trained model, such as the AI/ML model 218, is used to run the input data through the multiple logics to correlate the output generated by the AI/ML model 218 against the sample output. While training, the AI/ML model 218 performs trend analysis and pattern analysis in order to learn at least one of, but not limited to, trends, patterns and behavior related to the delayed time for each report of the one or more report templates. In one embodiment, the system 108 selects an appropriate AI/ML model 218 from a set of available AI/ML models. Thereafter, the selected AI/ML model 218 is trained using the historical data pertaining to the delayed time for each report of the one or more report templates.
[0044] In one embodiment, the AI/ML model 218 is configured to train itself utilizing the fetched relevant data pertaining to the records from the database 216. In particular, the records include the relevant data related to at least one of reports, dashboards, live report execution sessions and report templates. For example, the reports are one or more live reports of the system 108 which are monitored by the user on a regular basis, such as daily. In one embodiment, the AI/ML model 218 is trained utilizing the data present on the dashboard of the user during the live report execution sessions. For example, the dashboard is a data visualization and analysis tool that displays on one screen the status of KPIs and other important metrics. In one embodiment, the AI/ML model 218 is trained utilizing the data related to the live report execution sessions, which includes the delayed time for each report of the one or more report templates and the first offset number.
[0045] In an alternate embodiment, after the completion of the live report execution session, the output generated by the AI/ML model 218 is fed back to the AI/ML model 218 by the processor 202, and based on the output generated, the AI/ML model 218 is trained again. In particular, after the completion of the live report execution session, the AI/ML model 218 keeps training and updating itself in order to achieve better output.
[0046] In an embodiment, the determination unit 210 of the processor 202 is configured to determine an updated second optimal offset number for each of the one or more report templates based on the analysis utilizing the trained model, such as the AI/ML model 218, by performing a trend analysis and a pattern analysis pertaining to the one or more report templates and identifying the delayed time for each report of the one or more report templates based on the trend analysis and the pattern analysis. Subsequent to identifying the delayed time for each report of the one or more report templates, the determination unit 210 determines the relevant updated second optimal offset number for each report of the one or more report templates. The data pertaining to the one or more report templates and the updated second optimal offset number are stored in the database 216. This updated second optimal offset number is utilized in the live report execution and for an upcoming report execution.
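As a hedged sketch only: if the determination step described above is reduced to choosing the smallest offset that covers nearly all observed per-report delays, it might look like the following. The quantile heuristic, function name, and sample delays are assumptions for illustration and do not represent the claimed AI/ML determination.

```python
import math

def optimal_offset(observed_delays, interval_seconds, quantile=0.95):
    """Smallest offset (in whole intervals) that covers `quantile` of the
    observed data-arrival delays, so reports neither miss late data nor
    exclude more history than the delays actually require."""
    ranked = sorted(observed_delays)
    idx = min(len(ranked) - 1, math.ceil(quantile * len(ranked)) - 1)
    covering_delay = ranked[idx]                      # delay to be covered, seconds
    return max(1, math.ceil(covering_delay / interval_seconds))

# Hypothetical delays (seconds) for one report template across past sessions,
# with 15-minute (900 s) reporting intervals.
delays = [120, 150, 200, 220, 260, 310, 380, 420, 500, 1700]
new_offset = optimal_offset(delays, interval_seconds=900)
```

Here a single 1700-second outlier pushes the covering delay past one interval, so the sketch settles on an offset of two intervals rather than the user's possibly larger guess.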
[0047] In an embodiment, the replacing unit 212 of the processor 202 is configured to replace the first offset number provided by the user with the updated second optimal offset number. In subsequent iterations of live report execution, the updated second offset number is retrieved from the database 216 and utilized in place of the first offset number. The iterative process of offset provisioning and analysis performed by the AI/ML model 218 ensures continuous optimization and accurate live report generation.
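The replacement step described above amounts to persisting the model's value over the user's value in the template store, so that subsequent executions retrieve the updated offset. The in-memory dictionary below is an illustrative stand-in for the database 216; all names are assumptions.

```python
# Hypothetical template store standing in for the database 216.
TEMPLATE_STORE = {
    "server_utilization": {"offset": 5, "offset_source": "user"},
}

def replace_offset(template_name, optimal):
    """Overwrite the user-provided first offset number with the trained
    model's updated second optimal offset number; later live report
    executions read this stored value instead of the user's."""
    template = TEMPLATE_STORE[template_name]
    template["offset"] = optimal
    template["offset_source"] = "trained_model"
    return template

template = replace_offset("server_utilization", 2)
```

After this call, the next live report execution session fetches data with an offset of 2 rather than the originally supplied 5, closing the iterative loop the paragraph describes.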
[0048] The computation engine 206, the transceiver 208, the determination unit 210, and the replacing unit 212, in an exemplary embodiment, are implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor 202. In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor 202 may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processor may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the memory 204 may store instructions that, when executed by the processing resource, implement the processor 202. In such examples, the system 108 may comprise the memory 204 storing the instructions and the processing resource to execute the instructions, or the memory 204 may be separate but accessible to the system 108 and the processing resource. In other examples, the processor 202 may be implemented by electronic circuitry.
[0049] FIG. 3 illustrates an exemplary block diagram of an architecture for the system 108 of FIG. 2, according to one or more embodiments of the present invention. More specifically, FIG. 3 illustrates the system 108 configured for offset provisioning of records in a network 106. It is to be noted that the embodiment with respect to FIG. 3 will be explained with respect to the UE 102 for the purpose of description and illustration and should nowhere be construed as limiting the scope of the present disclosure. FIG. 3 shows communication between the UE 102 and the system 108.
[0050] For the purpose of description of the exemplary embodiment as illustrated in FIG. 3, the User Equipment (UE) 102 uses a network protocol connection to communicate with the system 108. In an embodiment, the network protocol connection is the establishment and management of communication between the UE 102 and the system 108 over the network 106 using a specific protocol or set of protocols. The network protocol connection includes, but is not limited to, Session Initiation Protocol (SIP), System Information Block (SIB) protocol, Transmission Control Protocol (TCP), User Datagram Protocol (UDP), File Transfer Protocol (FTP), Hypertext Transfer Protocol (HTTP), Simple Network Management Protocol (SNMP), Internet Control Message Protocol (ICMP), Hypertext Transfer Protocol Secure (HTTPS) and Terminal Network (TELNET).
[0051] In an embodiment, the UE 102 includes a primary processor 302, a memory 304, and a user interface 306. In alternate embodiments, the UE 102 may include more than one primary processor 302 as per the requirement of the network 106. The primary processor 302, may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions.
[0052] In an embodiment, the primary processor 302 is configured to fetch and execute computer-readable instructions stored in the memory 304. The memory 304 may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory 304 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0053] In an embodiment, the user interface 306 of the UE 102 includes a variety of interfaces, for example, a graphical user interface, a web user interface, a Command Line Interface (CLI), and the like. The user interface 306 is configured to allow a user to manage the offset number provisioning in records which include data pertaining to the reports, dashboards, live report execution sessions and one or more report templates. The UE 102 transmits a first offset number in one or more report templates during the live report execution session via the user interface 306.
[0054] In an embodiment, for example, initially the user interacts with the user interface 306 to provide the first offset number in one or more report templates. The user is required to specify the desired offset number, indicating the number of previous time intervals which need to be excluded during a live report execution. During a live report execution, the load balancer 220 distributes the incoming data pertaining to the first offset number across at least one computation engine 206 of the processor 202 to handle the processing load of the incoming data efficiently. In an alternate embodiment, an Integrated Performance Management (IPM) component is implemented linking the load balancer 220 and the computation engine 206. The IPM component removes the boundaries associated with the one or more report templates in view of previous performance management frameworks and provides a holistic approach that addresses the entire spectrum of both operational and management processes in an integrated manner. The IPM component is connected to the database 216 over the TCP protocol.
[0055] Further, during the live report execution, the computation engine 206 fetches the relevant data from the database 216 based on the one or more report templates and the offset number provided by the user. The offset number is used to exclude a specific number of previous time intervals from the computation, ensuring that delayed data generation or late data transfer does not affect the accuracy of the live reports. For example, if the offset number is set to 4 and the report is executed at a 15-minute interval, data from the 4 most recent 15-minute buckets will be excluded during the computation of the live reports.
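The bucket-exclusion arithmetic described above can be sketched as follows. This is an illustrative sketch only; the function name and timestamps are hypothetical and do not form part of the disclosed system.

```python
from datetime import datetime, timedelta

def excluded_buckets(execution_time, offset, interval_minutes):
    """Start times of the `offset` most recent time buckets that are
    skipped during live report computation."""
    interval = timedelta(minutes=interval_minutes)
    return [execution_time - i * interval for i in range(offset)]

# Offset 4 at a 15-minute interval: the 4 most recent buckets are excluded.
buckets = excluded_buckets(datetime(2023, 7, 19, 11, 30),
                           offset=4, interval_minutes=15)
# Buckets at 11:30, 11:15, 11:00 and 10:45 are left out of the computation.
```

In this sketch the computation would then proceed only on buckets older than the returned start times.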
[0056] After the execution of the live reports, the one or more report templates are sent to the AI/ML model 218 for analysis. The AI/ML model 218 performs the trend analysis and the pattern analysis pertaining to the one or more report templates and identifies the delayed time for each report of the one or more report templates. The AI/ML model 218 is trained on the delayed time for each report in every interval. The trend/pattern analysis facilitates determination of an updated optimal offset number for each report. By continuously optimizing the offset numbers, unnecessarily high offsets are avoided, ensuring accurate live report generation and in turn reducing the processing burden on the processor 202. In other words, due to the continuous optimization of the offset numbers, miscalculations of the data included in the live reports are avoided, which reduces the time consumed in generating accurate live reports during peak events.
[0057] In one embodiment, the delayed time pertains to a time interval due to which the generation or the execution of the live reports is delayed. In particular, the accuracy of live reports may be affected due to the delayed time. In other words, for each live report, when the data required to generate or execute the live report is not received within a predefined time, the time interval after the predefined time is referred to as the delayed time.
[0058] In one embodiment, the AI/ML model 218 is trained on the delayed time for each report, based on which the AI/ML model 218 performs the trend analysis and the pattern analysis pertaining to the one or more report templates and identifies the delayed time for each report of the one or more report templates. For example, during the live report execution session the user may predefine in the one or more report templates that, while executing the live report, the user needs 1 hour of offset time in order to maintain the accuracy of the live reports. While training, the AI/ML model 218 has learnt trends/patterns related to the one or more report templates, such as that the maximum delayed time for each report is half an hour, based on which the system 108 updates the offset time to half an hour so that the accuracy of the live reports is maintained. This way the system 108 avoids an unnecessarily high offset time/number.
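The optimization described above (shrinking a user-provisioned offset to cover only the worst delay actually observed) can be illustrated with a minimal sketch. The function name and the delay figures are illustrative assumptions, not the trained model itself.

```python
import math

def optimal_offset(max_observed_delay_min, interval_min):
    """Smallest number of excluded buckets that still covers the
    worst data delay observed for a report template."""
    return math.ceil(max_observed_delay_min / interval_min)

# User provisioned a 1-hour offset (4 buckets of 15 minutes), but the
# maximum delay ever observed is half an hour, so 2 buckets suffice.
updated = optimal_offset(max_observed_delay_min=30, interval_min=15)
```

Here `updated` is 2 rather than the 4 the user provisioned, mirroring the half-hour example in the paragraph above.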
[0059] The updated optimal offset numbers which are determined by the system 108 utilizing the AI/ML model 218 based on the trend/pattern analysis are saved in the database 216. The updated offset numbers are utilized in one of the live report executions and in subsequent iterations of live report execution, in which the previous offset number, such as the first offset number provided by the user, is replaced with the updated optimal offset number.
[0060] FIG. 4 is a flow diagram of a method 400 for offset provisioning of records in a network 106, according to one or more embodiments of the present invention. For the purpose of description, the method is described with the embodiments as illustrated in FIG. 2 and should nowhere be construed as limiting the scope of the present disclosure.
[0061] At step 402, the method 400 includes the step of fetching relevant data pertaining to the records from the database 216 based on the one or more report templates and the first offset number provided by the user during the live report execution session. In one embodiment, the computation engine 206 of the processor 202 is configured to fetch relevant data pertaining to the records from the database 216 based on the one or more report templates and the first offset number provided by the user via the UE 102 during the live report execution session.
[0062] The offset number ensures that a specific number of previous time intervals are not considered during the computation of the live reports. For example, if the first offset number provided by the user is 4 and the report is executed at a 15-minute time interval, data from the 4 most recent 15-minute buckets will be excluded. In other words, for example, a network operator monitors live counter data or a KPI pertaining to a particular Network Function (NF) in order to generate a live report. The generation of the live report might be delayed due to an unexpected event, such as an error in the network 106, which leads to a delay in receiving some part of the live data. Further, in certain situations, consider that the current network operator is monitoring the live counter data at 11:30 am for a 15-minute time interval. While monitoring the live counter data, consider that the first offset number 4 is displayed on the dashboard of the current network operator. The first offset number 4 may be displayed for the current network operator since it may have been provided by a previous network operator. Based on the first offset number 4, the processor 202 will exclude the 4 buckets of data, which include the live data received within each 15-minute bucket, such as the buckets at 11:30 am, 11:15 am, 11:00 am and 10:45 am.
[0063] At step 404, the method 400 includes the step of transmitting the one or more report templates to the trained model for analysis after the completion of the live report execution session. In one embodiment, the transceiver 208 of the processor 202 is configured to transmit the one or more report templates to the trained model for analysis.
[0064] At step 406, the method 400 includes the step of determining, utilizing the trained model, an updated second optimal offset number for each of the one or more report templates based on the analysis. The determination unit 210 of the processor 202 determines the updated second optimal offset number for each of the one or more report templates based on the analysis. The determination unit 210 performs the trend analysis and the pattern analysis pertaining to the one or more report templates and identifies the delayed time for each report of the one or more report templates based on the trend/pattern analysis. Based on the trend/pattern analysis, the determination unit 210 determines the relevant updated second optimal offset number for each report of the one or more report templates.
[0065] At step 408, the method 400 includes the step of replacing the first offset number provided by the user with the updated second optimal offset number. The replacing unit 212 of the processor 202 is configured to replace the first offset number provided by the user with the updated second optimal offset number. The updated second optimal offset number is stored in the database 216 and will be utilized in subsequent iterations of the live report execution.
[0066] The method described above enables efficient and accurate live report execution by incorporating offset provisioning. By allowing users to provide offset numbers in one or more report templates, the invention ensures that delayed data generation or late data transfer does not affect the accuracy of live reports. Furthermore, the iterative analysis of the offset numbers using the AI/ML model 218 enhances the system's 108 ability to adapt and optimize its performance over time.
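Steps 402 to 408 above can be summarized as one iteration of a sketch like the following, where the dictionary-backed store and the `fetch`/`analyze` callables are hypothetical stand-ins for the database 216 and the trained model 218, not the disclosed implementation.

```python
def run_cycle(offsets, fetch, analyze):
    """One live report execution session: fetch with each template's
    current offset (step 402), then replace that offset with the
    model-determined optimal value (steps 404-408)."""
    reports = {tid: fetch(tid, off) for tid, off in offsets.items()}
    for tid in offsets:
        offsets[tid] = analyze(tid)  # updated second optimal offset
    return reports

offsets = {"nf_kpi": 4}  # first offset number provided by the user
reports = run_cycle(
    offsets,
    fetch=lambda tid, off: f"{tid}: skipped {off} most recent buckets",
    analyze=lambda tid: 2,  # trained model's optimal offset (assumed)
)
```

After the call, the store holds the updated offset, so a subsequent session automatically executes with the optimized value rather than the user-provided one.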
[0067] FIG. 5 is an exemplary signal flow diagram for offset provisioning of records in the network, according to one or more embodiments of the present invention.
[0068] At step 502, the user transmits the first offset number included in the one or more report templates to the system 108 via the UI 306.
[0069] At step 504, the load balancer 220 of the system 108 forwards the first offset number included in the one or more report templates to the AI/ML model 218 subsequent to receiving the first offset number included in the one or more report templates from the user via the UI 306.
[0070] At step 506, the AI/ML model 218 stores the one or more report templates along with the first offset numbers for each of the one or more report templates in the database 216. In an alternate embodiment, the processor 202 stores the one or more report templates along with the first offset numbers in the database 216 utilizing the AI/ML model 218.
[0071] At step 508, the database 216 transmits a response pertaining to an acknowledgement regarding the storage of the one or more report templates along with the first offset numbers for each of the one or more report templates in the database 216 to the AI/ML model 218. In an alternate embodiment, along with the AI/ML model 218, the database 216 transmits the response to the processor 202.
[0072] At step 510, the AI/ML model 218 transmits the response pertaining to the acknowledgement regarding the storage of the one or more report templates along with the first offset numbers in the database 216 to the user via the UI 306.
[0073] At step 512, the user transmits the live report execution request via the UI 306 to the system 108. In particular, the live report execution request pertains to monitoring of the live report related to the system 108.
[0074] At step 514, the load balancer 220 of the system 108 forwards the live report execution request to the AI/ML model 218 subsequent to receiving the live report execution request from the user.
[0075] At step 516, the AI/ML model 218 forwards the live report execution request to the processor 202 subsequent to receiving the live report execution request from the load balancer 220.
[0076] At step 518, based on the live report execution request, the processor 202 fetches the one or more report templates from the database 216. In particular, the processor 202 fetches the relevant data pertaining to the records of the one or more report templates. The relevant data may include at least one of, but not limited to, the first offset number and KPIs.
[0077] At step 520, the processor 202 transmits the fetched one or more report templates to the AI/ML model 218. Further, the AI/ML model 218 performs at least one of the trend analysis and the pattern analysis pertaining to the fetched one or more report templates and identifies the delayed time for each report of the fetched one or more report templates. Furthermore, based on the identified delayed time for each report of the fetched one or more report templates, the AI/ML model 218 determines an updated second optimal offset number for each of the fetched one or more report templates based on the trend analysis and the pattern analysis.
[0078] At step 522, the AI/ML model 218 transmits the updated second optimal offset number to the processor 202. Further, the processor 202 replaces the first offset number provided by the user with the updated second optimal offset number.
[0079] At step 524, the processor 202 stores the updated second optimal offset number, which replaces the first offset number, in the database 216.
[0080] At step 526, the database 216 transmits to the processor 202 a response pertaining to an acknowledgement regarding the storage of the updated second optimal offset number, which replaces the first offset number, in the database 216.
[0081] At step 528, the processor 202 transmits the live report related to the system 108 with the updated second optimal offset number of the one or more report templates to the user. In particular, the processor 202 shows the live report related to the system 108 with the updated second optimal offset number to the user via the UI 306.
[0082] The present invention further discloses a non-transitory computer-readable medium having stored thereon computer-readable instructions. The computer-readable instructions are executed by the processor 202. The processor 202 is configured to fetch relevant data pertaining to the records from a database 216 based on one or more report templates and a first offset number provided by a user during a live report execution session. The processor 202 is further configured to transmit the one or more report templates to a trained model 218 for analysis after the completion of the live report execution session. The processor 202 is further configured to determine utilizing the trained model 218 an updated second optimal offset number for each of the one or more report templates based on the analysis. The processor 202 is further configured to replace the first offset number provided by the user with the updated second optimal offset number.
[0083] A person of ordinary skill in the art will readily ascertain that the illustrated embodiments and steps in description and drawings (FIG.1-5) are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0084] The present disclosure provides technical advancement for the offset provisioning which ensures that live report generation is not interrupted or delayed at peak events. The system 108 enables efficient management of the live report generation at the peak events. Each report is assigned a specific offset number, allowing live reports to be computed using the same set of resources. By employing suitable offset numbers, data loss is mitigated, preventing the population of miscalculated performance metrics.
[0085] The present invention offers multiple advantages over the prior art, and the above are a few examples emphasizing some of the advantageous features. The listed advantages are to be read in a non-limiting manner.

REFERENCE NUMERALS

[0086] Environment – 100;
[0087] User Equipment (UE) – 102;
[0088] Server – 104;
[0089] Network – 106;
[0090] System – 108;
[0091] Processor – 202;
[0092] Memory – 204;
[0093] Computation engine – 206;
[0094] Transceiver – 208;
[0095] Determination unit – 210;
[0096] Replacing unit – 212;
[0097] Database – 216;
[0098] AI/ML model – 218;
[0099] Load balancer – 220;
[00100] Primary processor – 302;
[00101] Memory – 304;
[00102] User Interface – 306.


CLAIMS
We Claim:
1. A method (400) for offset provisioning of records, the method (400) comprising the steps of:
fetching, by one or more processors (202), relevant data pertaining to the records from a database (216) based on one or more report templates and a first offset number provided by a user during a live report execution session;
transmitting, by the one or more processors (202), the one or more report templates to a trained model (218) for analysis after the completion of the live report execution session;
determining, by the one or more processors (202), utilizing the trained model (218), an updated second optimal offset number for each of the one or more report templates based on the analysis; and
replacing, by the one or more processors (202), the first offset number provided by the user with the updated second optimal offset number.

2. The method (400) as claimed in claim 1, wherein the records include data of at least one of, reports, dashboards, live report execution sessions and report templates.

3. The method (400) as claimed in claim 1, wherein the one or more processors (202), by utilizing the offset numbers including the first offset number and the updated second optimal offset number, ensure that the number of previous time intervals are excluded from report generation during the live report execution session.

4. The method (400) as claimed in claim 1, wherein the trained model (218) is at least one of, an Artificial Intelligence/Machine Learning (AI/ML) model.

5. The method (400) as claimed in claim 1, wherein the data pertaining to one or more report templates and updated second optimal offset number are stored in the database (216).

6. The method (400) as claimed in claim 1, wherein the step of, determining, by the one or more processors (202), utilizing the trained model (218), an updated second optimal offset number for each of the one or more report templates based on the analysis, includes the steps of:
performing, by the one or more processors (202), utilizing the trained model (218), a trend analysis and a pattern analysis pertaining to the one or more report templates;
identifying, by the one or more processors (202), a delayed time for each report of the one or more report templates based on the trend analysis and the pattern analysis; and
in response to identifying the delayed time for each report of the one or more report templates, determining, by the one or more processors (202), utilizing the trained model (218), the relevant updated second optimal offset number for each report of the one or more report templates based on the trend analysis and the pattern analysis.

7. The method (400) as claimed in claim 1, wherein the updated second optimal offset number is utilized in one of a live report execution and an upcoming report execution.

8. The method (400) as claimed in claim 1, wherein the fetched relevant data is utilized by the one or more processors (202) for generating a live report.

9. A system (108) for offset provisioning of records, the system (108) comprising:
a computation engine (206), configured to, fetch, relevant data pertaining to the records from a database based on one or more report templates and a first offset number provided by a user during a live report execution session;
a transceiver (208), configured to, transmit, the one or more report templates to a trained model (218) for analysis after the completion of the live report execution session;
a determination unit (210), configured to, determine, utilizing the trained model, an updated second optimal offset number for each of the one or more report templates based on the analysis; and
a replacing unit (212), configured to, replace, the first offset number provided by the user with the updated second optimal offset number.

10. The system (108) as claimed in claim 9, wherein the records include data of at least one of, reports, dashboards, live report execution sessions and report templates.

11. The system (108) as claimed in claim 9, wherein by utilizing the offset numbers including the first offset number and the updated second optimal offset number the system (108) ensures that the number of previous time intervals are excluded from report generation during the live report execution session.

12. The system (108) as claimed in claim 9, wherein the trained model (218) is at least one of, an Artificial Intelligence/Machine Learning (AI/ML) model.

13. The system (108) as claimed in claim 9, wherein the data pertaining to one or more report templates and updated second optimal offset number are stored in the database (216).

14. The system (108) as claimed in claim 9, wherein the determination unit (210), determines, utilizing the trained model (218), an updated second optimal offset number for each of the one or more report templates based on the analysis, by:
performing, utilizing the trained model (218), a trend analysis and a pattern analysis pertaining to the one or more report templates;
identifying, a delayed time for each report of the one or more report templates based on the trend/pattern analysis; and
in response to identifying the delayed time for each report of the one or more report templates, determining, utilizing the trained model (218), the relevant updated second optimal offset number for each report of the one or more report templates based on trend/pattern analysis.

15. The system (108) as claimed in claim 9, wherein the updated second optimal offset number is utilized in live report execution and for an upcoming report execution.

16. The system (108) as claimed in claim 9, wherein the fetched relevant data is utilized by the computation engine (206) for generating a live report.

17. A User Equipment (UE) (102), comprising:
one or more primary processors (302) communicatively coupled to one or more processors (202), the one or more primary processors (302) coupled with a memory (304), wherein said memory (304) stores instructions which when executed by the one or more primary processors (302) causes the UE (102) to:
transmit, a first offset number in the one or more report templates during a live report execution session; and
wherein the one or more processors (202) is configured to perform the steps as claimed in claim 1.

Documents

Application Documents

# Name Date
1 202321048722-STATEMENT OF UNDERTAKING (FORM 3) [19-07-2023(online)].pdf 2023-07-19
2 202321048722-PROVISIONAL SPECIFICATION [19-07-2023(online)].pdf 2023-07-19
3 202321048722-FORM 1 [19-07-2023(online)].pdf 2023-07-19
4 202321048722-FIGURE OF ABSTRACT [19-07-2023(online)].pdf 2023-07-19
5 202321048722-DRAWINGS [19-07-2023(online)].pdf 2023-07-19
6 202321048722-DECLARATION OF INVENTORSHIP (FORM 5) [19-07-2023(online)].pdf 2023-07-19
7 202321048722-FORM-26 [03-10-2023(online)].pdf 2023-10-03
8 202321048722-Proof of Right [08-01-2024(online)].pdf 2024-01-08
9 202321048722-DRAWING [18-07-2024(online)].pdf 2024-07-18
10 202321048722-COMPLETE SPECIFICATION [18-07-2024(online)].pdf 2024-07-18
11 Abstract-1.jpg 2024-09-28
12 202321048722-Power of Attorney [05-11-2024(online)].pdf 2024-11-05
13 202321048722-Form 1 (Submitted on date of filing) [05-11-2024(online)].pdf 2024-11-05
14 202321048722-Covering Letter [05-11-2024(online)].pdf 2024-11-05
15 202321048722-CERTIFIED COPIES TRANSMISSION TO IB [05-11-2024(online)].pdf 2024-11-05
16 202321048722-FORM 3 [03-12-2024(online)].pdf 2024-12-03
17 202321048722-FORM 18 [20-03-2025(online)].pdf 2025-03-20