
Method And System For Generating Reports

Abstract: The present disclosure relates to a method for generating reports by one or more processors (202). The method includes monitoring historical behaviour of one or more users regarding report generation. Further, the method includes determining, using an Artificial Intelligence/Machine Learning (AI/ML) module (232), if the one or more users require a report based on monitoring the historical behaviour of the one or more users. Further, the method includes pulling data from a database (214) to a reporting engine (224) based on detecting that the one or more users require the report. Further, the method includes generating the report at the reporting engine (224) based on the data pulled from the database (214). Further, the method includes storing the generated report along with a first identifier at a caching layer (402). Ref. FIG. 5


Patent Information

Application #
Filing Date
19 July 2023
Publication Number
04/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

JIO PLATFORMS LIMITED
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India

Inventors

1. Aayush Bhatnagar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
2. Ankit Murarka
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
3. Jugal Kishore Kolariya
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
4. Gaurav Kumar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
5. Kishan Sahu
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
6. Rahul Verma
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
7. Sunil Meena
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
8. Gourav Gurbani
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
9. Sanjana Chaudhary
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
10. Chandra Kumar Ganveer
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
11. Supriya De
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
12. Kumar Debashish
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
13. Tilala Mehul
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
14. Yogesh Kumar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
15. Kunal Telgote
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
16. Niharika Patnam
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
17. Avinash Kushwaha
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
18. Dharmendra Kumar Vishwakarma
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
19. Kalikivayi Srinath
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India
20. Vitap Pandey
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad - 380006, Gujarat, India

Specification

DESC:
FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003

COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. TITLE OF THE INVENTION
METHOD AND SYSTEM FOR GENERATING REPORTS

2. APPLICANT(S)
NAME NATIONALITY ADDRESS
JIO PLATFORMS LIMITED INDIAN OFFICE-101, SAFFRON, NR. CENTRE POINT, PANCHWATI 5 RASTA, AMBAWADI, AHMEDABAD 380006, GUJARAT, INDIA
3. PREAMBLE TO THE DESCRIPTION

THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE NATURE OF THIS INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.

FIELD OF THE INVENTION
[0001] The present invention relates generally to database systems and management, and in particular, to a method and a system for generating reports from a database.
BACKGROUND OF THE INVENTION
[0002] A database such as a data lake is a centralized repository designed to store, process, and secure large amounts of structured, semi-structured, and unstructured data. The database can store data in its native format and process any variety of it, irrespective of size. Data lakes can encompass hundreds of terabytes or even petabytes, storing replicated data from operational sources, including databases and the like.
[0003] A database report is a formatted presentation of data from the database that provides structured information for decision-making. The database reports are specifically meant to convey information in a way that is easy to understand by human beings.
[0004] Traditionally, a user logs into a system, provides inputs, and a report is then generated as per the request. Examples of report inputs may be user interface elements such as drop-down lists, check boxes, or radio buttons that allow users to select values that filter report data. There have been efforts to automate report generation so as to automatically generate and share specific information with selected people at a pre-decided time interval. The automated reports can cover several operational areas but are often related to key performance indicators, financials, and other time-dependent information.
[0005] However, the specific criteria and the time frequency have to be fed into the system and are frozen for the entire life cycle of automation. The user may change the parameters as and when required but once set, the process of automation follows the set parameters with respect to the criteria for updating and frequency as fed by the user.
[0006] Further, such automated reports are generated at predetermined fixed intervals of time, such as weekly, or based upon when the updated data is expected, and the like. Irrespective of the availability of the data, the automated report is generated at the predetermined scheduled time.
[0007] Further, in the case of scheduled report generation at the predetermined time, or when a request is fired for a report by a user, the systems in the prior art are designed to retrieve data from the database and generate reports at those specific instances when the request is made or when scheduled to do so. This overburdens the system resources and does not provide for uniform or optimum utilization of the resources.
[0008] Further, for huge reports involving big-data number crunching, users may have to wait for several minutes before the report is executed, generated and fetched / delivered. This is undesirable and avoidable. On top of that, another user may have generated a similar report an hour earlier. The systems in the prior art are not aware of the duplication of the effort and resources being deployed, and the report is freshly generated every time a request comes in. This leads to avoidable wastage of system resources.
[0009] It is desired that report generation from the database is made more efficient and faster, and optimises the use of network / system resources. There is therefore a need for a solution that overcomes the above challenges and provides a system and method for generating reports from the database which is efficient, time-saving, faster and optimises the use of system resources.
SUMMARY OF THE INVENTION
[0010] One or more embodiments of the present disclosure provide a system and a method for automatic report generation from a database.
[0011] In one aspect of the present invention, a method for generating reports is disclosed. The method includes monitoring, by one or more processors, historical behaviour of one or more users regarding report generation. Further, the method includes determining, by the one or more processors, using an Artificial Intelligence/Machine Learning (AI/ML) module, if the one or more users require a report based on monitoring the historical behaviour of the one or more users. Further, the method includes pulling, by the one or more processors, data from a database to a reporting engine based on detecting that the one or more users require the report. Further, the method includes generating, by the one or more processors, the report at the reporting engine based on the data pulled from the database. Further, the method includes storing, by the one or more processors, the generated report along with a first identifier at a caching layer.
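As a non-limiting illustration of the claimed sequence of monitoring, prediction, data pulling, generation and caching, the flow might be sketched as follows. All names here (`ReportPreGenerator`, `predict_need`, the request-count threshold) are hypothetical stand-ins for the claimed units, and the AI/ML module is approximated by a simple frequency heuristic:

```python
import hashlib
import json
from collections import defaultdict

class ReportPreGenerator:
    """Illustrative sketch only: monitor user behaviour, predict report
    need, pre-generate the report, and cache it with a first identifier."""

    def __init__(self, database, threshold=3):
        self.database = database          # stands in for the database (214)
        self.threshold = threshold        # request count triggering pre-generation
        self.request_counts = defaultdict(int)
        self.cache = {}                   # first identifier -> generated report

    def monitor(self, user, report_type):
        # Record the user's historical report-generation behaviour.
        self.request_counts[(user, report_type)] += 1

    def predict_need(self, user, report_type):
        # Stand-in for the AI/ML module: a simple frequency heuristic.
        return self.request_counts[(user, report_type)] >= self.threshold

    def identifier_for(self, attributes):
        # Derive a deterministic first identifier from report attributes.
        payload = json.dumps(attributes, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

    def pre_generate(self, user, report_type, attributes):
        """Pull data, generate the report, and store it at the caching layer."""
        if not self.predict_need(user, report_type):
            return None
        data = self.database.get(report_type, [])      # pull from database
        report = {"type": report_type, "rows": data}   # reporting engine output
        first_id = self.identifier_for(attributes)
        self.cache[first_id] = report                  # store at caching layer
        return first_id
```

When the threshold is reached, the report is already cached before any further request arrives, which is the claimed time saving.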
[0012] In an embodiment, the step of monitoring further includes continuously adapting, by the one or more processors, the data pulled and the reports generated based on ongoing user interactions.
[0013] In an embodiment, the step of monitoring further includes identifying, by the one or more processors, trends in report requests and adjusting the frequency of report generation in the caching layer accordingly.
[0014] In an embodiment, the step of generating the reports further includes adjusting, by the one or more processors, the frequency of report generation by prioritizing reports based on their historical frequency of use and execution time.
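The prioritisation in this embodiment can be sketched as a simple score in which both historical request frequency and execution time raise a report's priority for pre-generation. The field names and the multiplicative score below are illustrative assumptions, not part of the claims:

```python
def priority(frequency, execution_time_s):
    # Reports that are requested often and are expensive to execute
    # benefit most from pre-generation, so both factors raise the score.
    return frequency * execution_time_s

reports = [
    {"name": "daily_kpi", "frequency": 40, "execution_time_s": 5.0},
    {"name": "quarterly_financials", "frequency": 2, "execution_time_s": 600.0},
    {"name": "ad_hoc_lookup", "frequency": 1, "execution_time_s": 0.5},
]

# Highest-priority reports would be pre-generated into the caching layer first.
ordered = sorted(
    reports,
    key=lambda r: priority(r["frequency"], r["execution_time_s"]),
    reverse=True,
)
```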
[0015] In an embodiment, the method further includes evicting, by the one or more processors, reports from the caching layer based on a decrease in user request frequency over a predetermined time.
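Eviction on declining request frequency, as in this embodiment, might be approximated by dropping any cached report that has not been requested within a predetermined idle window. The function and parameter names are hypothetical:

```python
import time

def evict_stale(cache, last_requested, max_idle_s, now=None):
    """Remove cached reports whose request frequency has fallen off,
    approximated here as 'not requested within max_idle_s seconds'."""
    now = time.time() if now is None else now
    for report_id in list(cache):
        if now - last_requested.get(report_id, 0.0) > max_idle_s:
            del cache[report_id]              # evict from caching layer
            last_requested.pop(report_id, None)
```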
[0016] In an embodiment, the method includes transmitting, by the one or more processors, the generated report stored at the caching layer to a user based on a user’s request.
[0017] In an embodiment, transmitting the generated report stored in the caching layer to a user based on the user's request includes the steps of: identifying, by the one or more processors, one or more attributes of a requested report based on the user's request; generating, by the one or more processors, a second identifier based on the identified one or more attributes; and transmitting, by the one or more processors, the generated report to the user, if at least one of: the second identifier matches with the first identifier; or the one or more identified attributes match with the attributes of the generated report stored at the caching layer.
[0018] In an embodiment, transmitting the generated report stored in the caching layer to a user based on the user's request further includes the step of tagging, by the one or more processors, the requested report to the respective user, if at least one of: the second identifier does not match with the first identifier; or the one or more identified attributes do not match with the attributes of the generated report stored at the caching layer.
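The matching of the second identifier against the stored first identifier, and the tagging of unmatched requests, can be sketched as follows. Deriving the identifier as a hash of the canonicalised report attributes is an assumption made for illustration; the specification does not prescribe how the identifiers are computed:

```python
import hashlib
import json

def identifier_for(attributes):
    # A deterministic identifier: identical attributes always yield the
    # same value, so a second identifier can match a stored first one.
    payload = json.dumps(attributes, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def serve_or_tag(cache, pending_tags, user, attributes):
    """Serve from the caching layer on an identifier match; otherwise
    tag the requested report to the user for fresh generation."""
    second_id = identifier_for(attributes)
    if second_id in cache:                               # identifiers match
        return cache[second_id]
    pending_tags.setdefault(second_id, []).append(user)  # no match: tag request
    return None
```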
[0019] In an embodiment, monitoring the historical behaviour includes monitoring the frequency of report requests, the execution time, and the usage of the reports.
[0020] In another aspect of the present invention, a system for generating reports is disclosed. The system includes a monitoring unit, a determination unit, an AI/ML module, a pulling unit, a reporting engine and a storing unit. The monitoring unit is configured to monitor historical behaviour of one or more users regarding report generation. The determination unit is configured to determine, using the Artificial Intelligence/Machine Learning (AI/ML) module, if the one or more users require a report based on monitoring the historical behaviour of the one or more users. The pulling unit is configured to pull data from a database to the reporting engine based on detecting that the one or more users require the report. The reporting engine is configured to generate the report based on the data pulled from the database. The storing unit is configured to store the generated report along with a first identifier at a caching layer.
[0021] In another aspect of the present invention, a non-transitory computer-readable medium is disclosed, having stored thereon computer-readable instructions that, when executed by a processor, cause the processor to: monitor historical behaviour of one or more users regarding report generation; predict, using an AI/ML module, if the one or more users require a report based on monitoring the historical behaviour of the one or more users; pull data from a database to a reporting engine based on detecting that the one or more users require the report; generate the report at the reporting engine based on the data pulled from the database; and store the generated report along with a first identifier at a caching layer.
[0022] In another aspect of the present invention, a User Equipment (UE) includes one or more primary processors communicatively coupled to one or more processors of a system. The one or more primary processors are coupled with a memory, wherein the memory stores instructions which, when executed by the one or more primary processors, cause the UE to transmit one or more requests from a user to the one or more processors for receiving one or more reports. The one or more processors are configured to perform steps such as: monitor historical behaviour of one or more users regarding report generation; predict, using an AI/ML module, if the one or more users require a report based on monitoring the historical behaviour of the one or more users; pull data from a database to a reporting engine based on detecting that the one or more users require the report; generate the report at the reporting engine based on the data pulled from the database; and store the generated report along with a first identifier at a caching layer.
[0023] Other features and aspects of this invention will be apparent from the following description and the accompanying drawings. The features and advantages described in this summary and in the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the relevant art, in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the inventive subject matter, resort to the claims being necessary to determine such inventive subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
[0025] FIG. 1 is an exemplary block diagram of an environment for automatic report generation from a database, according to various embodiments of the present disclosure.
[0026] FIG. 2 is a block diagram of a system of FIG. 1, according to various embodiments of the present disclosure.
[0027] FIG. 3 is an example schematic representation of the system of FIG. 1, in which the operations of various entities are explained, according to various embodiments of the present disclosure.
[0028] FIG. 4 is an example block diagram illustrating a system for report generation from the database, according to various embodiments of the present disclosure.
[0029] FIG. 5 shows a sequence flow diagram illustrating a method for automatic report generation from a database, according to various embodiments of the present disclosure.
[0030] FIG. 6 is an example flow diagram illustrating the method for automatic report generation from the database, according to various embodiments of the present disclosure.
[0031] Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not necessarily have been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention, so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
[0032] The foregoing shall be more apparent from the following detailed description of the invention.

DETAILED DESCRIPTION OF THE INVENTION
[0033] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
[0034] Various modifications to the embodiment will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure, including the definitions listed here below, is not intended to be limited to the embodiments illustrated but is to be accorded the widest scope consistent with the principles and features described herein.
[0035] A person of ordinary skill in the art will readily ascertain that the illustrated steps detailed in the figures and here below are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0036] Before discussing example embodiments in more detail, it is to be noted that the drawings are to be regarded as being schematic representations, and elements are not necessarily shown to scale. Rather, the various elements are represented such that their function and general purpose become apparent to a person skilled in the art. Any connection or coupling between functional blocks, devices, components, or other physical or functional units shown in the drawings or described herein may also be implemented by an indirect connection or coupling. A coupling between components may also be established over a wireless connection. Functional blocks may be implemented in hardware, firmware, software or a combination thereof.
[0037] Further, the flowcharts provided herein describe the operations as sequential processes. Many of the operations may be performed in parallel, concurrently or simultaneously. In addition, the order of operations may be re-arranged. The processes may be terminated when their operations are completed, but may also have additional steps not included in the figures. It should be noted that, in some alternative implementations, the functions/acts/steps noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed substantially concurrently, or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0038] Further, the terms first, second, etc. may be used herein to describe various elements, components, regions, layers and/or sections; it should be understood that these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are used only to distinguish one element, component, region, layer or section from another region, layer, or section. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the scope of the example embodiments.
[0039] Spatial and functional relationships between elements (for example, between modules) are described using various terms, including “connected,” “engaged,” “interfaced,” and “coupled.” Unless explicitly described as being “direct,” when a relationship between first and second elements is described in the description below, that relationship encompasses a direct relationship where no other intervening elements are present between the first and second elements, and also an indirect relationship where one or more intervening elements are present (either spatially or functionally) between the first and second elements. In contrast, when an element is referred to as being "directly” connected, engaged, interfaced, or coupled to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., "between," versus "directly between," "adjacent," versus "directly adjacent," etc.).
[0040] The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0041] As used herein, the singular forms “a,” “an,” and “the,” are intended to include the plural forms as well, unless the context clearly indicates otherwise. As used herein, the terms “and/or” and “at least one of” include any and all combinations of one or more of the associated listed items. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0042] Unless specifically stated otherwise, or as is apparent from the description, terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device/hardware, that manipulates and transforms data represented as physical, electronic quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0043] Various embodiments of the invention provide a method for generating reports. The method includes monitoring, by one or more processors, historical behaviour of one or more users regarding report generation. Further, the method includes determining, by the one or more processors, using an AI/ML module, if the one or more users require a report based on monitoring the historical behaviour of the one or more users. Further, the method includes pulling, by the one or more processors, data from a database to a reporting engine based on detecting that the one or more users require the report. Further, the method includes generating, by the one or more processors, the report at the reporting engine based on the data pulled from the database. Further, the method includes storing, by the one or more processors, the generated report along with a first identifier at a caching layer.
[0044] The present disclosure is about determining and forecasting the requirement of specific reports based upon historical user activity, interaction and behaviour, and pre-computing, executing, generating and storing the report before-hand (that is, even before a report request is actually received from a user) in a pre-executed-bucket list in a caching layer for ready delivery. The pre-executed list is a dynamic list which is managed by the system; the system does not wait for user requests or inputs to create the pre-executed list.
[0045] The proposed method and system provide for monitoring the behaviour of all users in the system with regard to reports being requested and generated. Based upon the historical data of such report-generation behaviour of the users in the system, the proposed method and system provide pre-executed reports in an efficient, time-saving, and faster manner. The system optimises the use of system resources.
[0046] In an embodiment, the method and system can be used for pre-executing and generating specific reports based upon historical user activity regarding the frequency of report requests and the size of the report, and storing the report before-hand (that is, even before a report request is actually received from a user) in a pre-executed-bucket list in a caching layer. An AI/ML module (being continuously trained) monitors user behaviour, the frequency of report requests and the execution time required / size of the report, and automatically pulls the data from the database to the caching layer. The reporting engine generates the report and stores it in the caching layer for ready delivery, irrespective of user requests. When the user requests the report, the report is already present in the caching layer, so delivery is faster. The proposed method and system thus provide optimisation of network and computational resources, faster report generation, and savings in time and bandwidth.
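One simple way to realise a continuously trained monitor of request frequency, purely as an illustrative stand-in for the AI/ML module described above, is an exponentially weighted moving average that is updated with every observation rather than retrained in batches. The class name, smoothing factor and threshold below are assumptions for illustration:

```python
class RequestRateEstimator:
    """Hypothetical stand-in for the continuously trained AI/ML module:
    an exponentially weighted moving average of request frequency."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # weight given to the newest observation
        self.rate = {}       # report key -> smoothed requests per interval

    def observe(self, key, requests_this_interval):
        # Incremental update: no batch retraining step is needed.
        prev = self.rate.get(key, 0.0)
        self.rate[key] = self.alpha * requests_this_interval + (1 - self.alpha) * prev

    def should_pre_execute(self, key, threshold=1.0):
        # Pre-execute a report into the caching layer once its smoothed
        # request rate crosses the threshold.
        return self.rate.get(key, 0.0) >= threshold
```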
[0047] FIG. 1 illustrates an exemplary block diagram of an environment (100) for automatic report generation from a database (214), according to various embodiments of the present disclosure. The environment (100) comprises a plurality of user equipments (UEs) 102-1, 102-2, ……, 102-n. At least one UE (102-n) from the plurality of UEs (102-1, 102-2, ……102-n) is configured to connect to a system (108) via a communication network (106). Hereafter, the plurality of UEs or the one or more UEs are labelled 102.
[0048] In accordance with yet another aspect of the exemplary embodiment, the plurality of UEs (102) may be wireless devices or communication devices that may be a part of the system (108). The wireless device or the UE (102) may include, but is not limited to, a handheld wireless communication device (e.g., a mobile phone, a smart phone, a phablet device, and so on), a wearable computer device (e.g., a head-mounted display computer device, a head-mounted camera device, a wristwatch computer device, and so on), a laptop computer, a tablet computer, or another type of portable computer, a media playing device, a portable gaming system, and/or any other type of computer device with wireless communication or VoIP capabilities. In an embodiment, the UEs may include, but are not limited to, any electrical, electronic, electro-mechanical equipment, or a combination of one or more of the above devices, such as virtual reality (VR) devices, augmented reality (AR) devices, a laptop, a general-purpose computer, a desktop, a personal digital assistant, a tablet computer, a mainframe computer, or any other computing device, wherein the computing device may include one or more in-built or externally coupled accessories including, but not limited to, a visual aid device such as a camera, an audio aid, a microphone, a keyboard, and input devices for receiving input from a user such as a touch pad, a touch-enabled screen, an electronic pen and the like. It may be appreciated that the UEs may not be restricted to the mentioned devices, and various other devices may be used. A person skilled in the art will appreciate that the plurality of UEs (102) may include a fixed landline, or a landline with an assigned extension, within the communication network (106).
[0049] The plurality of UEs (102) may comprise a memory such as a volatile memory (e.g., RAM), a non-volatile memory (e.g., disk memory, FLASH memory, EPROMs, etc.), an unalterable memory, and/or other types of memory. In one implementation, the memory might be configured or designed to store data. The data may pertain to attributes and access rights specifically defined for the plurality of UEs (102). The UE (102) may be accessed by the user, to receive the requests related to an order determined by the system (108). The communication network (106), may use one or more communication interfaces/protocols such as, for example, Voice Over Internet Protocol (VoIP), 802.11 (Wi-Fi), 802.15 (including Bluetooth™), 802.16 (Wi-Max), 802.22, Cellular standards such as Code Division Multiple Access (CDMA), CDMA2000, Wideband CDMA (WCDMA), Radio Frequency Identification (e.g., RFID), Infrared, laser, Near Field Magnetics, etc.
[0050] The system (108) is communicatively coupled to a server (104) via the communication network (106). The server (104) can be, for example, but not limited to, a standalone server, a server blade, a server rack, an application server, a bank of servers, a business telephony application server (BTAS), a server farm, a cloud server, an edge server, a home server, a virtualized server, one or more processors executing code to function as a server, or the like. In an implementation, the server (104) may operate at various entities or a single entity (including, but not limited to, a vendor side, a service provider side, a network operator side, a company side, an organization side, a university side, a lab facility side, a business enterprise side, a defence facility side, or any other facility) that provides service.
[0051] The communication network (106) includes, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, or some combination thereof. The communication network (106) may include, but is not limited to, a Third Generation (3G), a Fourth Generation (4G), a Fifth Generation (5G), a Sixth Generation (6G), a New Radio (NR), a Narrow Band Internet of Things (NB-IoT), an Open Radio Access Network (O-RAN), and the like.
[0052] The communication network (106) may also include, by way of example but not limitation, at least a portion of one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, process, or a combination thereof, etc. one or more messages, packets, signals, waves, voltage or current levels, some combination thereof, or so forth. The communication network (106) may also include, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, a VOIP or some combination thereof.
[0053] One or more network elements can be, for example, but not limited to, a base station that is located in the fixed or stationary part of the communication network (106). The base station may correspond to a remote radio head, a transmission point, an access point or access node, a macro cell, a small cell, a micro cell, a femto cell, or a metro cell. The base station enables transmission of radio signals to the UE or mobile transceiver. Such a radio signal may comply with radio signals as, for example, standardized by the 3GPP or, generally, in line with one or more of the above-listed systems. Thus, a base station may correspond to a NodeB, an eNodeB, a Base Transceiver Station (BTS), an access point, a remote radio head, or a transmission point, which may be further divided into a remote unit and a central unit.
[0054] 3GPP: The term “3GPP” is a 3rd Generation Partnership Project and is a collaborative project between a group of telecommunications associations with the initial goal of developing globally applicable specifications for Third Generation (3G) mobile systems. The 3GPP specifications cover cellular telecommunications technologies, including radio access, core network, and service capabilities, which provide a complete system description for mobile telecommunications. The 3GPP specifications also provide hooks for non-radio access to the core network, and for networking with non-3GPP networks.
[0055] The system (108) may include one or more processors (202) coupled with a memory (204), wherein the memory (204) may store instructions which, when executed by the one or more processors (202), may cause the system (108) to execute requests in the communication network (106) or the server (104). An exemplary representation of the system (108) for such purpose, in accordance with embodiments of the present disclosure, is shown in FIG. 2 as system (108). In an embodiment, the system (108) may include one or more processor(s) (202). The one or more processor(s) (202) may be implemented as one or more microprocessors, microcomputers, microcontrollers, edge or fog microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that process data based on operational instructions. Among other capabilities, the one or more processor(s) (202) may be configured to fetch and execute computer-readable instructions stored in the memory (204) of the system (108). The memory (204) may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service.
[0056] The environment (100) further includes the system (108) communicably coupled to the remote server (104) and each UE of the plurality of UEs (102) via the communication network (106). The remote server (104) is configured to execute the requests in the communication network (106).
[0057] The system (108) is adapted to be embedded within the remote server (104) or deployed as an individual entity. The system (108) is designed to provide a centralized and unified view of data and facilitate efficient business operations. The system (108) is authorized to update/create/delete one or more parameters of the relationship between the requests for the workflow, which gets reflected in real time independent of the complexity of the network.
[0058] In another embodiment, the system (108) may include an enterprise provisioning server (for example), which may connect with the remote server (104). The enterprise provisioning server provides flexibility for enterprise, e-commerce, and finance entities to update/create/delete information related to the requests in real time as per their business needs. A user with administrator rights can access and retrieve the requests for the workflow and perform real-time analysis in the system (108).
[0059] The system (108) may include, by way of example but not limitation, one or more of a standalone server, a server blade, a server rack, a bank of servers, a business telephony application server (BTAS), a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, one or more processors executing code to function as a server, one or more machines performing server-side functionality as described herein, at least a portion of any of the above, or some combination thereof. In an implementation, the system (108) may operate at various entities or a single entity (for example, including, but not limited to, a vendor side, a service provider side, a network operator side, a company side, an organization side, a university side, a lab facility side, a business enterprise side, an e-commerce side, a finance side, a defence facility side, or any other facility) that provides service.
[0060] However, for the purpose of description, the system (108) is described as an integral part of the remote server (104), without deviating from the scope of the present disclosure. Operational and construction features of the system (108) will be explained in detail with respect to the following figures.
[0061] FIG. 2 illustrates a block diagram of the system (108) provided for automatic report generation from the database (214) (e.g., a centralized database, a data lake), according to one or more embodiments of the present invention. As per the illustrated embodiment, the system (108) includes the one or more processors (202), the memory (204), an input/output interface unit (206), a display (208), an input device (210), and the database (214). The one or more processors (202), hereinafter referred to as the processor (202), may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions. As per the illustrated embodiment, the system (108) includes one processor. However, it is to be noted that the system (108) may include multiple processors as per the requirement and without deviating from the scope of the present disclosure.
[0062] The information related to the request may be provided or stored in the memory (204) of the system (108). Among other capabilities, the processor (202) is configured to fetch and execute computer-readable instructions stored in the memory (204). The memory (204) may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory (204) may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0063] The memory (204) may comprise any non-transitory storage device including, for example, volatile memory such as Random-Access Memory (RAM), or non-volatile memory such as Erasable Programmable Read-Only Memory (EPROM), flash memory, and the like. In an embodiment, the system (108) may include an interface(s). The interface(s) may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as input/output (I/O) devices, storage devices, and the like. The interface(s) may facilitate communication for the system. The interface(s) may also provide a communication pathway for one or more components of the system. Examples of such components include, but are not limited to, processing unit/engine(s) and a database. The processing unit/engine(s) may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s).
[0064] The information related to the requests may further be configured to render on the user interface (206). The user interface (206) may include functionality similar to at least a portion of the functionality implemented by one or more computer system interfaces such as those described herein and/or generally known to one having ordinary skill in the art. The user interface (206) may be rendered on the display (208), implemented using Liquid Crystal Display (LCD) display technology, Organic Light-Emitting Diode (OLED) display technology, and/or other types of conventional display technology. The display (208) may be integrated within the system (108) or connected externally. Further, the input device(s) (210) may include, but are not limited to, a keyboard, buttons, scroll wheels, cursors, touchscreen sensors, audio command interfaces, a magnetic strip reader, an optical scanner, etc.
[0065] The database (214) may be communicably connected to the processor (202) and the memory (204). The database (214) may be configured to store and retrieve the request pertaining to features, services, or workflow of the system (108), access rights, attributes, an approved list, and authentication data provided by an administrator. Further, the remote server (104) may allow the system (108) to update/create/delete one or more parameters of the information related to the request, which provides flexibility to roll out multiple variants of the request as per business needs. In another embodiment, the database (214) may be outside the system (108) and connected through a wired or wireless medium.
[0066] Further, the processor (202), in an embodiment, may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor (202). In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor (202) may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processor (202) may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the memory (204) may store instructions that, when executed by the processing resource, implement the processor (202). In such examples, the system (108) may comprise the memory (204) storing the instructions and the processing resource to execute the instructions, or the memory (204) may be separate but accessible to the system (108) and the processing resource. In other examples, the processor (202) may be implemented by an electronic circuitry.
[0067] In order for the system (108) to automatically generate reports from the database (214), the processor (202) includes a monitoring unit (216), a reporting engine (224), a determination unit (218), an AI/ML module (232), a pulling unit (220), a storing unit (226), an evicting unit (228), and a transmitting unit (230). The monitoring unit (216), the reporting engine (224), the determination unit (218), the AI/ML module (232), the pulling unit (220), the storing unit (226), the evicting unit (228), and the transmitting unit (230) may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor (202). In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor (202) may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processor (202) may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the memory (204) may store instructions that, when executed by the processing resource, implement the processor (202). In such examples, the system (108) may comprise the memory (204) storing the instructions and the processing resource to execute the instructions, or the memory (204) may be separate but accessible to the system (108) and the processing resource. In other examples, the processor (202) may be implemented by electronic circuitry.
[0068] In order for the system (108) to automatically generate reports from the database (214), the monitoring unit (216), the reporting engine (224), the determination unit (218), the AI/ML module (232), the pulling unit (220), the storing unit (226), the evicting unit (228), and the transmitting unit (230) are communicably coupled to each other. In an example embodiment, the monitoring unit (216) monitors historical behaviour of one or more users regarding report generation. The historical behaviour includes the frequency of report requests, execution time, and usage of the reports. In an embodiment, the monitoring unit (216) monitors the historical behaviour of the one or more users regarding report generation by identifying trends in report requests and adjusting the frequency of report generation in a caching layer (402) (as shown in FIG. 4) accordingly. In an embodiment, the monitoring unit (216) continuously adapts the data pulled and reports generated based on ongoing user interactions. The data can be, for example, but not limited to, KPIs and performance counters. The data can also be, for example, but not limited to, a product name, a sales amount, a customer name, a date of purchase, a product category, a customer segment, a time period, a business conversion rate, a location, a purchase history, or the like. In an example, based on the user interaction, the AI/ML module (232) improves its understanding of user preferences and requirements. The user interaction can be, for example, but not limited to, the user’s behaviour towards specific reports and the user’s usage pertaining to the report. The user’s usage pertains to the frequency of usage of the report, the time interval to view the report, the attributes selected in the report, a report template, the user’s schedule for monitoring the report, or the like. Based on the ongoing interactions, the AI/ML module (232) can dynamically adjust the content and format of the reports it generates.
This ensures that the reports become more personalized and aligned with the user’s needs over time. The ongoing interactions contribute to iterative improvement of the AI/ML module (232) used for report generation. This may involve retraining the AI/ML models based on the data gathered from the user interactions to enhance prediction accuracy and relevance. The user interactions also help the AI/ML module (232) adapt to changing business requirements or user preferences, ensuring that the reports remain useful and valuable.
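The monitoring described above can be sketched as follows. This is a minimal, hypothetical illustration of a monitoring unit tracking the frequency of report requests, execution time, and usage of the reports; the class and method names are illustrative and not taken from the disclosure.

```python
from collections import defaultdict

# Hypothetical sketch of the monitoring unit (216): record each report
# request and expose the frequency, execution time, and view counts that
# the historical behaviour is said to comprise.
class UsageMonitor:
    def __init__(self):
        self.requests = defaultdict(list)   # report_id -> list of request timestamps
        self.exec_times = {}                # report_id -> most recent execution time (s)
        self.views = defaultdict(int)       # report_id -> times the report was viewed

    def record_request(self, report_id, timestamp, exec_time_s):
        self.requests[report_id].append(timestamp)
        self.exec_times[report_id] = exec_time_s

    def record_view(self, report_id):
        self.views[report_id] += 1

    def request_frequency(self, report_id, window_s, now):
        """Number of requests within the trailing window, used to spot trends."""
        return len([t for t in self.requests[report_id] if now - t <= window_s])
```

A caching layer could then poll `request_frequency` to raise or lower how often a given report is pre-generated.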
[0069] The determination unit (218) determines, using the AI/ML module (232), if the one or more users require a report based on monitoring the historical behaviour of the one or more users. In an embodiment, the determination unit (218) has AI/ML modeling and prediction capabilities. The report is specifically meant to convey information in a way that is easy for human beings to understand. The report can be, for example, but not limited to, a business report, a forecast report, a key performance indicator (KPI) report, or the like. In other words, the report is comparable to those used in many domains and is based on changing data from a number of sources. In many cases, the data for the report are in a hierarchical structure. This structure may be organizational, geographic, chronological, or similar. Training the AI/ML module (232) includes data collection, data preprocessing, feature engineering, model selection, and a training process. The data collection refers to the process of gathering and assembling relevant information or data from various sources. The data collection involves several steps, including identifying relevant sources, extracting data from the sources, cleaning and preprocessing the data to ensure quality and consistency, and organizing it in a format suitable for analysis and modeling. The data preprocessing includes cleaning the data, handling missing values, normalizing numerical features, and encoding categorical variables. The feature engineering includes extracting relevant features such as trends over time, seasonal patterns, customer segmentation based on purchasing behaviour, and external factors. The model selection includes choosing an appropriate AI/ML model for forecasting, such as a time series forecasting model (e.g., ARIMA) or machine learning algorithms (e.g., Random Forest, Gradient Boosting). The training process includes training the selected model using the prepared historical data.
The AI/ML model learns patterns and relationships within the data to predict future output accurately. In an example, the AI/ML module (232) can be trained with historical data to predict future outcomes (e.g., sales requirements, business requirements, or the like) and output actionable reports that assist decision-making processes in the user requirement or a business requirement. Further, the pulling unit (220) pulls the data from the database (214) to the reporting engine (224) based on detecting that the one or more users require the report.
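As a highly simplified stand-in for the prediction step of the AI/ML module (232), the sketch below predicts that a user requires a report once the time elapsed since the last request exceeds the average inter-request interval learned from that user's history. A production system would use a trained model such as ARIMA or gradient boosting, as described above; the function name and the heuristic itself are assumptions made for illustration.

```python
# Hedged sketch: predict whether a user requires a report, based only on the
# learned average interval between that user's past requests for it.
def predict_report_needed(request_times, now):
    """request_times: sorted timestamps of past requests for one report."""
    if len(request_times) < 2:
        return False  # not enough history to learn a request pattern
    intervals = [b - a for a, b in zip(request_times, request_times[1:])]
    avg_interval = sum(intervals) / len(intervals)
    # Due for regeneration once the usual gap has elapsed since the last request.
    return (now - request_times[-1]) >= avg_interval
```

When this returns true, the pulling unit would fetch fresh data so the reporting engine can pre-compute the report before the request actually arrives.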
[0070] The reporting engine (224) generates the report based on the data pulled from the database (214). In an embodiment, the reporting engine (224) generates the reports by adjusting the frequency of report generation, prioritizing reports based on their historical frequency of use and execution time.
[0071] The storing unit (226) stores the generated report along with a first identifier at the caching layer (402). Further, the evicting unit (228) evicts the reports from the caching layer (402) based on a decrease in user request frequency over a predetermined time period.
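A minimal sketch of the caching layer (402) keyed by the first identifier might look as follows. The class and method names are illustrative assumptions, not interfaces from the disclosure.

```python
# Hypothetical caching layer (402): generated reports are stored under their
# first identifier together with their attributes, fetched on a matching
# request, and evicted when demand drops.
class CachingLayer:
    def __init__(self):
        self._store = {}   # identifier -> (report, attributes)

    def store(self, identifier, report, attributes):
        self._store[identifier] = (report, attributes)

    def fetch(self, identifier):
        entry = self._store.get(identifier)
        return entry[0] if entry else None

    def evict(self, identifier):
        self._store.pop(identifier, None)
```

Storing the attributes alongside the report allows the attribute-level match described in paragraph [0073] even when identifiers differ.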
[0072] Further, the transmitting unit (230) transmits the generated report stored at the caching layer (402) to the user based on the user’s request.
[0073] In an embodiment, the transmitting unit (230) identifies the one or more attributes of a requested report based on the user’s request. The attributes typically refer to characteristics or properties of the data that are relevant to the report being generated. In an exemplary embodiment, the attribute is one of, but not limited to, data fields, dimensions, filters and groupings, metrics, or the like. The attributes can represent different dimensions of the data. In another embodiment, the attributes are at least one of, but not limited to, a signal strength, a network latency, call quality, and connectivity status. The attributes play a crucial role in data analysis and report generation by providing the basis for segmentation, comparison, and statistical calculations. The attributes are often used to group and summarize data for insights and decision-making while generating the report. Further, the transmitting unit (230) generates a second identifier based on the identified one or more attributes. Further, the transmitting unit (230) transmits the generated report to the user, if at least one of: the second identifier matches with the first identifier, or the one or more identified attributes match with the attributes of the generated report stored at the caching layer (402).
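One plausible way to realize the first and second identifiers of paragraphs [0071] and [0073] is to derive a deterministic identifier from the report's attributes, so that a repeat request with the same data fields, filters, and groupings hashes to the identifier stored with the cached report. This is an assumption for illustration; the disclosure does not specify how identifiers are computed.

```python
import hashlib

# Illustrative identifier derivation: canonicalize the attribute dictionary
# (sorted by name so insertion order does not matter) and hash it, giving a
# stable identifier for identical attribute sets.
def report_identifier(attributes):
    """attributes: dict mapping attribute name -> value."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()
```

With this scheme, comparing the second identifier against the first reduces to a single dictionary lookup in the caching layer.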
[0074] Further, the transmitting unit (230) tags the requested report to the respective user, if at least one of: the second identifier does not match with the first identifier, or the one or more identified attributes do not match with the attributes of the generated report stored at the caching layer (402).
[0075] The example for report generation from the database (214) is explained in FIG. 4.
[0076] FIG. 3 is an example schematic representation of the system (300) of FIG. 1 in which the operations of various entities are explained, according to various embodiments of the present system. It is to be noted that the embodiment with respect to FIG. 3 will be explained with respect to the first UE (102-1) and the system (108) for the purpose of description and illustration and should not be construed as limiting the scope of the present disclosure.
[0077] As mentioned earlier, the first UE (102-1) includes one or more primary processors (305) communicably coupled to the one or more processors (202) of the system (108). The one or more primary processors (305) are coupled with a memory (310) storing instructions which are executed by the one or more primary processors (305). Execution of the stored instructions by the one or more primary processors (305) enables the UE (102-1) to execute the requests in the communication network (106).
[0078] As mentioned earlier, the one or more processors (202) are configured to transmit a response content related to the request to the UE (102-1). More specifically, the one or more processors (202) of the system (108) are configured to transmit the response content from a kernel (315) to the UE (102-1). The kernel (315) is a core component serving as the primary interface between hardware components of the UE (102-1) and the system (108). The kernel (315) is configured to provide the plurality of response contents hosted on the system (108) to access resources available in the communication network (106). The resources include a Central Processing Unit (CPU) and memory components such as Random Access Memory (RAM) and Read Only Memory (ROM).
[0079] As per the illustrated embodiment, the system (108) includes the one or more processors (202), the memory (204), the input/output interface unit (206), the display (208), and the input device (210), whose operations and functions are already explained with reference to FIG. 2. Further, the processor (202) includes the monitoring unit (216), the reporting engine (224), the determination unit (218), the AI/ML module (232), the pulling unit (220), the storing unit (226), the evicting unit (228), and the transmitting unit (230), whose operations and functions are likewise already explained with reference to FIG. 2. For the sake of brevity, the same operations (or repeated information) are not explained again in the present disclosure.
[0080] FIG. 4 is an example block diagram illustrating an example system (400) for report generation from the database (214), according to various embodiments of the present disclosure.
[0081] In an embodiment, the Integrated Performance Management (IPM) module (404) is configured to maintain the KPIs and counters collected from the sources at the database (214). At step 1, for the first time, the user sends a first time request to the user interface (206).
[0082] Once the user interface (206) receives the first time request from the user, at step 2, the user interface (206) forwards or shares the first time request to the IPM (404). Thereafter, at step 3, the IPM (404) sends the first time request to the reporting engine (224). At step 4, the reporting engine (224) is configured to pull the data from the database (214). Once the data is pulled from the database (214), at step 5, the reporting engine (224) generates the report and sends the report to the IPM (404). At step 6, the IPM (404) forwards the report to the user interface (206). At step 7, the IPM (404) also stores the report at the caching layer (402). At step 8, the caching layer (402) sends the report to the AI/ML module (232).
[0083] For a repeat request sent by the user (or the second time a similar request is received at the user interface (206)), the user’s activity and the report usage pattern are monitored by the AI/ML module (232) at step 9. Upon monitoring, the AI/ML module (232) informs the reporting engine (224) to generate the report at step 10. At step 11, the reporting engine (224) pulls data from the database (214), and then the reporting engine (224) forwards the generated pre-computed report to the caching layer (402). At step 12, when the repeat request is received at the user interface (206), the identifier included in the repeat request is matched with the pre-computed report at the caching layer (402). If the identifier included in the repeat request matches the pre-computed report, then, at step 13, the report is retrieved from the caching layer (402) and presented to the user through the user interface (206). At step 14, the report/usage of the report is provided to the AI/ML module (232). If the identifier included in the repeat request does not match the pre-computed report, then the system (400) follows the first time request process as explained above.
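The first-time and repeat-request paths above can be condensed into a single dispatch routine. This is an illustrative sketch only: the function, the plain-dictionary cache, and the callback stand in for the IPM (404), caching layer (402), and reporting engine (224), whose actual interfaces are not given in the disclosure.

```python
# Hedged sketch of steps 1-14: a repeat request whose identifier matches a
# pre-computed report is served from the cache; otherwise the first-time
# path (generate, then store for later requests) is followed.
def handle_request(identifier, cache, generate_report):
    report = cache.get(identifier)
    if report is not None:
        return report, "cache"       # steps 12-13: identifier matched, retrieved
    report = generate_report()       # steps 3-5: first-time generation from the DB
    cache[identifier] = report       # step 7: stored at the caching layer
    return report, "generated"
```

The second element of the returned pair indicates which path was taken, which is also the signal fed back to the AI/ML module for usage monitoring.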
[0084] In an example, the database (214) stores all the data related to the applications (e.g., users, reports, or the like) in the network (106). The reporting engine (224) generates the report when initiated to do so, by using the AI/ML module (232) and the caching layer (402). The system (400) also includes an Integrated Performance Management (IPM) module (404). The IPM (404) is a network management application that enables monitoring the performance of multi-protocol networks. The IPM (404) measures the response time and availability of IP networks. The IPM (404) receives performance data from all network elements, analyses it, and creates and provides the report for reviewing system key performance indicators (KPIs). The IPM (404) also highlights if there is any breach of the KPIs as per the policy. The IPM (404) maintains performance counters and KPIs of network elements (e.g., a base station, a switch, a router, an eNB, a gNB, or the like) of the communication network (106). The IPM (404) is responsible for providing an interface to operations agents to monitor the performance of every node using performance counters, for getting details from various subsystems which help in analysing a root cause whenever a service outage occurs, and for providing node-wise detailed information in near real time whenever any discrepancy occurs in the underlying network. The IPM (404) provides an interface for designing node-wise KPIs as per the operations requirement and for executing on-demand KPIs whenever triggers are received. The node and performance monitor (PM) data Create, Read, Update, and Delete (CRUD) operations for monitoring the performance of every node are supported using the PM and KPI across different versions, while a KPI CRUD supports logical and arithmetical KPIs and KPIs of KPIs. Policies are provisioned to highlight when breaches of KPI/counter thresholds occur, by triggering notifications via emails/Short Message Service (SMS).
The IPM (404) also performs an evaluation of logical KPIs for true/false scenarios and values to be filled up, along with report template CRUD involving the counter, KPI, and mix type with aggregation, followed by the report execution. There is also a scheduled report execution with download, view, and roll-up/drill-down. The dashboards have a roll-up/drill-down feature with colour coding, while the graphical dashboards have on-demand colour coding and hierarchical PM/KPI visualization along with a static network area (SNA).
[0085] The AI/ML module (232) automatically pulls the data required for report generation from the database (214) to the caching layer (402). The caching layer (402) provides the data to the reporting engine (224). The reporting engine (224) generates the report when initiated by the system (400). The generated report is stored in the caching layer (402). Hence, when the user fires a request for report generation, by virtue of the system (400) and method of the invention, the report is already present in the caching layer (402) and report fetching (and generation) is faster.
[0086] Further, a number of users in the system (400) may want to fetch the same or substantially the same report at different instances. It is identified whether such reports being requested have already been generated or if it is a new report. The proposed method and system (400) provides for storing such frequently generated and accessed reports in the caching layer (402) for instant retrieval and fetching, thereby obviating the need for the whole process of data retrieval from the database (214) into the caching layer (402), calculations, etc., thus saving on time and resources.
[0087] At the instance of a new request for the report by the user, the system (400) compares the inputs accompanying the request and determines whether the request is for a new report or whether an already generated report may serve the request, saving time and resources by obviating the duplication of effort in generating an already available report.
[0088] In an embodiment, at the instance of the new request for the report by the user, based upon the inputs provided by the user accompanying the new request, the system (400) determines if the request is for a new report or an already generated report. If the request is for a new report which has not already been automatically generated and stored in the caching layer (402), the new report request is triggered and this new report is tagged with the name of this user. The AI/ML module (232) continuously learns and trains based upon such activities, and the automated processes in the system are continuously updated as per the real-time user behaviour/interaction with the system.
[0089] In an embodiment, the system (400) provides for continuously adapting based upon the historical frequency of execution/request of a specific report by users, the time required for generation of the report (execution time), and the number of times the report is actually used/retrieved from the system (400). For example, the higher the frequency of a specific report being generated and used, the higher the priority (with respect to both frequency and resources required) that is assigned to that specific report by the invention. The method and system (400) give higher priority in terms of frequency of updating, generating, and assigning system/network resources for the generation and adding of that specific report into the pre-executed bucket. It may be appreciated that the system (400) provides for automated anticipation of report requirements based upon actual human user behaviour deciphered from the historical data available.
[0090] For example, if a report generation takes three minutes, which is rather high compared to an average report generation time of, say, ten seconds, that specific report will be given priority for execution and added into the pre-executed bucket list, since for those three minutes there will be a substantial load on the entire bandwidth of the network. The system (400) will process such reports while optimizing system resources even before an actual request by a user comes in, and store the same in the caching layer (402) so that when the request actually comes in, the delivery is almost instant and in real time.
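The prioritization described in the two paragraphs above can be sketched as a simple score: priority grows both with how often a report is requested and with how long it takes to generate, so a three-minute report outranks a ten-second one at equal demand. The specific weighting below is an assumption for illustration; the disclosure does not define a formula.

```python
# Hedged sketch of pre-execution priority: request frequency scaled by how
# expensive the report is relative to the average generation time.
def pre_execution_priority(request_count, exec_time_s, avg_exec_time_s):
    return request_count * (exec_time_s / avg_exec_time_s)
```

Reports would then be pre-executed and added to the pre-executed bucket in descending priority order, e.g. `sorted(reports, key=score, reverse=True)`.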
[0091] Conversely, the system (400) further enables optimum usage of computing resources by reducing the priority, in terms of resource allocation and generation frequency, of a specific report for which the recent usage trend is low, i.e., which has not been frequently retrieved/requested/generated by the actual users in recent times. The frequency of updating, generating, and assigning system/network resources for the generation and adding of such low-trend specific reports into the pre-executed bucket is reduced automatically. The system (400) continuously monitors such usage and automatically reduces the priority for such reports. The frequency at which such priorities are shifted/changed may be configurable.
[0092] Based upon usage frequency, if a report that was previously used frequently has not been triggered by an actual user for a predetermined duration (say a few weeks or a month), the AI/ML module (232) will instruct the caching layer (402) to start evicting those specific reports whose frequency of retrieval/triggering by actual users is declining. It may be noted that actual user report generation refers to the case where a user requests a report that is not found in the caching layer (402), either because it is a new kind of report or because it is an old report that was evicted from the caching layer (402) for trending low. In an embodiment, such actual report generation is monitored by the system (400), and based upon the frequency and the size/time required to generate this new report, the system (400) provides for pre-executing and storing it.
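The eviction behaviour of paragraph [0092] can be illustrated as below. This is a sketch under assumptions: the store layout and the names `record_request` and `evict_stale` are hypothetical, and the predetermined duration is shown as a configurable constant.

```python
import time

# Hypothetical caching-layer store: report id -> (payload, last-request time).
cache: dict[str, tuple[str, float]] = {}

EVICTION_AGE_SECONDS = 30 * 24 * 3600  # predetermined duration, e.g. about a month

def record_request(report_id: str, payload: str, now: float) -> None:
    """Store or refresh a report and note when it was last requested."""
    cache[report_id] = (payload, now)

def evict_stale(now: float) -> list[str]:
    """Evict reports that have not been requested within the
    predetermined duration, freeing the caching layer."""
    stale = [rid for rid, (_, last) in cache.items()
             if now - last > EVICTION_AGE_SECONDS]
    for rid in stale:
        del cache[rid]
    return stale

t0 = time.time()
record_request("daily_sales", "report body", t0)
record_request("old_audit", "report body", t0 - 60 * 24 * 3600)  # ~2 months idle
evicted = evict_stale(t0)
```

An evicted report is not lost permanently: per paragraph [0092], a fresh user request simply triggers normal generation, after which the report may re-enter the pre-executed bucket if its usage trends up again.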
[0093] In an embodiment, the time required to execute and generate a specific report may determine whether the report needs to be evicted from the system or pre-executed and added to the pre-executed bucket list. For example, a small report that requires less than a second to a few seconds for execution and generation may not need to be added to the pre-executed bucket and may in turn not form part of the method and system of the invention. The system (400) identifies such small reports and excludes them from the pre-execution method and system (400) to achieve the balance of benefits of the invention.
[0094] FIG. 5 is a flow chart (500) illustrating a method for automatic report generation from the database (214), according to various embodiments of the present disclosure.
[0095] At step 502, the method includes monitoring the historical behaviour of one or more users regarding report generation. At step 504, the method includes determining, using the AI/ML module (232), if the one or more users require a report based on monitoring the historical behaviour of the one or more users. At step 506, the method includes pulling data from the database (214) to the reporting engine (224) based on detecting that the one or more users require the report.
[0096] At step 508, the method includes generating the report at the reporting engine (224) based on the data pulled from the database (214). At step 510, the method includes storing the generated report along with the first identifier at the caching layer (402).
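The five steps 502 to 510 can be sketched end to end as follows. This is a toy illustration under assumptions: the threshold-based `predict_required` stands in for the AI/ML module (232), and all names and sample data are hypothetical.

```python
import hashlib

# Hypothetical stand-ins for the monitored history, database (214),
# and caching layer (402).
usage_history = {"daily_sales": 12, "quarterly_audit": 1}   # request counts
database = {"daily_sales": [("2023-07-19", 100), ("2023-07-20", 140)]}
caching_layer: dict[str, str] = {}

def predict_required(history: dict, threshold: int = 5) -> list:
    """Steps 502-504: stand-in for the AI/ML module, flagging reports
    whose historical request frequency suggests future demand."""
    return [name for name, count in history.items() if count >= threshold]

def generate_and_cache(name: str) -> str:
    rows = database[name]                              # step 506: pull data
    report = "\n".join(f"{d}: {v}" for d, v in rows)   # step 508: generate
    first_id = hashlib.sha256(name.encode()).hexdigest()[:12]
    caching_layer[first_id] = report                   # step 510: store with id
    return first_id

for name in predict_required(usage_history):
    generate_and_cache(name)
```

Here only `daily_sales` crosses the demand threshold, so it alone is pre-executed and stored against its identifier, ready for instant delivery.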
[0097] FIG. 6 is an example flow diagram illustrating the method for automatic report generation from the database (214), according to various embodiments of the present disclosure.
[0098] At step 602, the user provides inputs and parameters accompanying the report generation request. At step 604, the system (400) creates the internal ID based upon those inputs. At step 606, upon a new request for a report by the user, the system (400), based upon the inputs accompanying the new request, compares them with all the reports already available in the caching layer (402) and determines whether the request is for a new report or an already generated report.
[0099] At step 608, in case of a match, the already present report serves the request. This saves time and resources and provides instant/faster delivery to the user with no wait time, even for very large reports.
[00100] At step 610, on the other hand, if no match is found, a new report is triggered for execution (data fetching, generation, etc.) as per the standard known processes. In accordance with an embodiment, this new report is tagged with the name of this user. At step 612, depending upon at least one of the user's behaviour/interaction with the system (such as this new report request), the frequency at which a specific kind of report is requested, and the size of the report, the AI/ML model continuously learns, trains, pre-executes, and stores reports based upon such activities in the caching layer (402) for quick delivery, thereby obviating the wait and duplication of effort.
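The match/no-match flow of steps 604 to 610 can be sketched as below. This is a minimal sketch under assumptions: deriving the internal ID by hashing the canonicalised request parameters is one plausible implementation, and the names `internal_id` and `handle_request` are hypothetical.

```python
import hashlib
import json

caching_layer: dict[str, str] = {}   # internal id -> stored report
user_tags: dict[str, str] = {}       # internal id -> user who triggered it

def internal_id(params: dict) -> str:
    """Step 604: derive a deterministic identifier from the request inputs;
    identical inputs (in any key order) always map to the same id."""
    canonical = json.dumps(params, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

def handle_request(user: str, params: dict) -> str:
    rid = internal_id(params)
    if rid in caching_layer:            # steps 606-608: match -> instant delivery
        return caching_layer[rid]
    report = f"report for {params}"     # step 610: trigger fresh execution
    caching_layer[rid] = report
    user_tags[rid] = user               # tag the new report with this user
    return report
```

A second request with the same parameters, even from a different user or with the parameters supplied in a different order, resolves to the same identifier and is served from the caching layer without re-execution.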
[00101] The technical advancements of the present invention are set out below:
[00102] As such, the above techniques of the present disclosure provide multiple advantages, including optimisation of network and computational resources, faster report generation, and savings in both time and bandwidth. The system (400) provides faster, intelligent, user-driven reports that are pre-computed based upon user interaction/requirement/activity.
[00103] A method and system are provided for pre-executing and generating specific reports based upon historical user activity regarding the frequency of report requests and the size of the report, and for storing the report beforehand (that is, even before a report request is actually received from a user) in a pre-executed bucket list in the caching layer (402). The AI/ML module (232) (being continuously trained) monitors user behaviour, the frequency of report requests, and the execution time required/size of the report, and automatically pulls the data from the database (214) to the caching layer (402). The reporting engine (224) generates the report and stores it in the caching layer (402) for ready delivery irrespective of any user request. When the user requests the report, it is already present in the caching layer (402), so delivery is faster. The system (400) thus provides optimisation of network and computational resources, faster report generation, and savings in time and bandwidth.
[00104] A person of ordinary skill in the art will readily ascertain that the illustrated embodiments and steps in description and drawings (FIGS. 1-6) are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[00106] The present invention offers multiple advantages over the prior art and the above listed are a few examples to emphasize on some of the advantageous features. The listed advantages are to be read in a non-limiting manner.

REFERENCE NUMERALS
[00107] Environment – 100
[00108] UEs – 102, 102-1 to 102-n
[00109] Server – 104
[00110] Communication network – 106
[00111] System – 108
[00112] Processor – 202
[00113] Memory – 204
[00114] User Interface – 206
[00115] Display – 208
[00116] Input device – 210
[00117] Database – 214
[00118] Monitoring unit – 216
[00119] Determination unit – 218
[00120] Pulling unit – 220
[00121] Reporting engine – 224
[00122] Storing unit – 226
[00123] Evicting unit – 228
[00124] Transmitting unit – 230
[00125] AI/ML module – 232
[00126] System – 300
[00127] Primary processors – 305
[00128] Memory – 310
[00129] Kernel – 315
[00130] Example system – 400
[00131] Caching layer – 402
[00132] IPM – 404
CLAIMS:
We Claim:
1. A method for generating reports, the method comprising the steps of:
monitoring, by one or more processors (202), historical behaviour of one or more users regarding report generation;
determining, by the one or more processors (202), using an Artificial Intelligence/Machine Learning (AI/ML) module (232), if the one or more users require a report based on monitoring the historical behaviour of the one or more users;
pulling, by the one or more processors (202), data from a database (214) to a reporting engine (224) based on detecting that the one or more users require the report;
generating, by the one or more processors (202), the report at the reporting engine (224) based on the data pulled from the database (214); and
storing, by the one or more processors (202), the generated report along with a first identifier at a caching layer (402).

2. The method as claimed in claim 1, wherein the step of monitoring further comprises continuously adapting, by the one or more processors (202), the data pulled and reports generated based on ongoing user interactions.

3. The method as claimed in claim 1, wherein the step of monitoring further comprises identifying, by the one or more processors (202), trends in report requests and adjusting the frequency of report generation in the caching layer (402) accordingly.

4. The method as claimed in claim 3, wherein the step of generating the reports further comprises adjusting, by the one or more processors (202), the frequency of report generation by prioritizing reports based on their historical frequency of use and execution time.

5. The method as claimed in claim 1, wherein the method further comprises evicting, by the one or more processors (202), reports from the caching layer (402) based on a decrease in user request frequency over a predetermined time.

6. The method as claimed in claim 1, wherein the method further comprises the step of:
transmitting, by the one or more processors (202), the generated report stored at the caching layer (402) to a user based on a user’s request.

7. The method as claimed in claim 6, wherein the step of transmitting the generated report stored in the caching layer (402) to the user based on the user’s request includes the steps of:
identifying, by the one or more processors (202), one or more attributes of a requested report based on the user’s request;
generating, by the one or more processors (202), a second identifier based on the identified one or more attributes; and
transmitting, by the one or more processors (202), the generated report to the user, if at least one of:
the second identifier matches with the first identifier; or
the one or more identified attributes match with the attributes of the generated report stored at the caching layer (402).

8. The method as claimed in claim 7, wherein the step of transmitting the generated report stored in the caching layer (402) to a user based on the user’s request further includes the step of:
tagging, by the one or more processors (202), the requested report to the respective user, if at least one of: the second identifier does not match with the first identifier, or the one or more identified attributes do not match with the attributes of the generated report stored at the caching layer (402).

9. The method as claimed in claim 1, wherein monitoring the historical behaviour includes the frequency of report requests, execution time, and usage of the reports.

10. A system (108) for generating reports, the system (108) comprising:
a monitoring unit (216) configured to monitor historical behaviour of one or more users regarding report generation;
a determination unit (218) configured to determine, using an Artificial Intelligence/Machine Learning (AI/ML) module (232), if the one or more users require a report based on monitoring the historical behaviour of the one or more users;
a pulling unit (220) configured to pull data from a database (214) to a reporting engine (224) based on detecting that the one or more users require the report;
the reporting engine (224) configured to generate the report based on the data pulled from the database (214); and
a storing unit (226) configured to store the generated report along with a first identifier at a caching layer (402).

11. The system (108) as claimed in claim 10, wherein the monitoring unit (216) continuously adapts the data pulled and reports generated based on ongoing user interactions.

12. The system (108) as claimed in claim 10, wherein the monitoring unit (216) monitors by identifying trends in report requests and adjusting the frequency of report generation in the caching layer (402) accordingly.

13. The system (108) as claimed in claim 10, wherein the reporting engine (224) generates the reports by adjusting the frequency of report generation, prioritizing reports based on their historical frequency of use and execution time.

14. The system (108) as claimed in claim 10, wherein an evicting unit (228) evicts reports from the caching layer (402) based on a decrease in user request frequency over a predetermined time.

15. The system (108) as claimed in claim 10, wherein a transmitting unit (230) is configured to:
transmit the generated report stored at the caching layer (402) to a user based on a user’s request.

16. The system (108) as claimed in claim 15, wherein the transmitting unit (230) transmits the generated report stored in the caching layer (402) to a user based on the user’s request, by:
identifying one or more attributes of a requested report based on the user’s request;
generating a second identifier based on the identified one or more attributes; and
transmitting the generated report to the user, if at least one of:
the second identifier matches with the first identifier; or
the one or more identified attributes match with the attributes of the generated report stored at the caching layer (402).

17. The system (108) as claimed in claim 16, wherein the transmitting unit (230) is further configured to:
tag the requested report to the respective user, if at least one of: the second identifier does not match with the first identifier, or the one or more identified attributes do not match with the attributes of the generated report stored at the caching layer (402).

18. The system (108) as claimed in claim 10, wherein monitoring the historical behaviour includes the frequency of report requests, execution time, and usage of the reports.

19. A User Equipment (UE) (102-1), comprising:
one or more primary processors (305) communicatively coupled to one or more processors (202) of a system (108), the one or more primary processors (305) coupled with a memory (310), wherein said memory (310) stores instructions which when executed by the one or more primary processors (305) causes the UE (102-1) to:
transmit one or more requests from a user to the one or more processors (202) for receiving one or more reports;
wherein the one or more processors (202) are configured to perform the steps as claimed in claim 1.

Documents

Application Documents

# Name Date
1 202321048725-STATEMENT OF UNDERTAKING (FORM 3) [19-07-2023(online)].pdf 2023-07-19
2 202321048725-PROVISIONAL SPECIFICATION [19-07-2023(online)].pdf 2023-07-19
3 202321048725-FORM 1 [19-07-2023(online)].pdf 2023-07-19
4 202321048725-FIGURE OF ABSTRACT [19-07-2023(online)].pdf 2023-07-19
5 202321048725-DRAWINGS [19-07-2023(online)].pdf 2023-07-19
6 202321048725-DECLARATION OF INVENTORSHIP (FORM 5) [19-07-2023(online)].pdf 2023-07-19
7 202321048725-FORM-26 [03-10-2023(online)].pdf 2023-10-03
8 202321048725-Proof of Right [08-01-2024(online)].pdf 2024-01-08
9 202321048725-DRAWING [18-07-2024(online)].pdf 2024-07-18
10 202321048725-COMPLETE SPECIFICATION [18-07-2024(online)].pdf 2024-07-18
11 Abstract-1.jpg 2024-09-30
12 202321048725-Power of Attorney [05-11-2024(online)].pdf 2024-11-05
13 202321048725-Form 1 (Submitted on date of filing) [05-11-2024(online)].pdf 2024-11-05
14 202321048725-Covering Letter [05-11-2024(online)].pdf 2024-11-05
15 202321048725-CERTIFIED COPIES TRANSMISSION TO IB [05-11-2024(online)].pdf 2024-11-05
16 202321048725-FORM 3 [03-12-2024(online)].pdf 2024-12-03
17 202321048725-FORM 18 [20-03-2025(online)].pdf 2025-03-20