
System And Method For Forecasting Events In A Network

Abstract: The present invention relates to a system (108) and a method (600) for forecasting events in a network. The method (600) includes the step of retrieving data pertaining to the forecasting event from an inventory unit (206). The method (600) further includes the steps of training a model with trends/patterns of the historic data; forecasting one or more events utilizing the trained model (211) on new data; retrieving one or more actual values subsequent to occurrence of the one or more events; and rendering at least one of the forecasted one or more events and the one or more actual values to a user. The invention enables consumers to identify and understand discrepancies between predicted values and actual values, assisting in diagnosing and improving forecasting models. Ref. Fig. 2


Patent Information

Application #
Filing Date
06 October 2023
Publication Number
15/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

JIO PLATFORMS LIMITED
OFFICE-101, SAFFRON, NR. CENTRE POINT, PANCHWATI 5 RASTA, AMBAWADI, AHMEDABAD 380006, GUJARAT, INDIA

Inventors

1. Aayush Bhatnagar
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
2. Ankit Murarka
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
3. Jugal Kishore
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
4. Chandra Ganveer
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
5. Sanjana Chaudhary
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
6. Gourav Gurbani
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
7. Yogesh Kumar
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
8. Avinash Kushwaha
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
9. Dharmendra Kumar Vishwakarma
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
10. Sajal Soni
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
11. Niharika Patnam
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
12. Shubham Ingle
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
13. Harsh Poddar
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
14. Sanket Kumthekar
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
15. Mohit Bhanwria
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
16. Shashank Bhushan
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
17. Vinay Gayki
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
18. Aniket Khade
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
19. Durgesh Kumar
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
20. Zenith Kumar
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
21. Gaurav Kumar
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
22. Manasvi Rajani
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
23. Kishan Sahu
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
24. Sunil Meena
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
25. Supriya Kaushik De
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
26. Kumar Debashish
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
27. Mehul Tilala
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
28. Satish Narayan
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
29. Rahul Kumar
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
30. Harshita Garg
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
31. Kunal Telgote
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
32. Ralph Lobo
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India
33. Girish Dange
Reliance Corporate Park, Thane - Belapur Road, Ghansoli, Navi Mumbai, Maharashtra 400701, India

Specification

DESC:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003

COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. TITLE OF THE INVENTION
SYSTEM AND METHOD FOR FORECASTING EVENTS IN A NETWORK
2. APPLICANT(S)
NAME NATIONALITY ADDRESS
JIO PLATFORMS LIMITED INDIAN OFFICE-101, SAFFRON, NR. CENTRE POINT, PANCHWATI 5 RASTA, AMBAWADI, AHMEDABAD 380006, GUJARAT, INDIA
3. PREAMBLE TO THE DESCRIPTION

THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE NATURE OF THIS INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.

FIELD OF THE INVENTION
[0001] The present invention relates generally to a wireless communication system. More particularly, the invention relates to a method and a system for forecasting events in a communication network.
BACKGROUND OF THE INVENTION
[0002] In traditional telecommunications networks, the complex machine learning models used for forecasting may be difficult to interpret. This may lead to a lack of consumer trust in, and understanding of, the model's predictions.
[0003] The consumer may often need to compare the actual values with both the model's test predictions and forecasted predictions to assess model accuracy and reliability. Further, the consumers may struggle to gauge the accuracy of forecasting models without a clear visual representation of how well the predictions align with actual values.
[0004] There is, therefore, a dire need for an efficient system and method for forecasting events in communication network that ensures enhanced interpretability. Additionally, the users require a more interactive interface to explore and analyze forecasting results, allowing them to focus on specific data points or time periods of interest, thus improving user engagement and decision-making.
SUMMARY OF THE INVENTION
[0005] One or more embodiments of the present disclosure provide a method and a system for forecasting the events in a communication network.
[0006] In one aspect of the present invention, the method for forecasting the events in the communication network is disclosed. The method includes the step of retrieving, by one or more processors, data pertaining to the forecasting of the events. The method further includes the step of training a model with trends/patterns of the historic data. The training unit of the processor is configured to train the model with trends/patterns of the historic data utilizing the retrieved data pertaining to the operation of forecasting the events in a communication network. In an alternate embodiment, the training unit of the processor is configured to train the model on the normalized data associated with the operation of the network.
[0007] Further, the method includes the step of forecasting, by one or more processors, utilizing the trained model, one or more events on new data. In one embodiment, the forecasting engine utilizes the trained model to generate the historical and current trends associated with the operation of the network.
[0008] In an embodiment, the step of forecasting, utilizing the trained model, one or more events on new data, includes the steps of receiving, by the one or more processors, new data or information of future time periods.
[0009] In an embodiment, the step of forecasting includes forecasting, by the one or more processors, utilizing the trained model, the one or more events based on the trends/patterns of the historic data in response to receiving the new data or the information of the future time periods. In an embodiment, the one or more actual values are the values when the one or more events occurred.
[0010] Further, the method includes the step of retrieving one or more actual values subsequent to the occurrence of one or more events. The processor may fetch an actual value reserved for the test purpose. This is done to assess the trained model's performance using a portion of the data reserved for testing. This step ensures that the model generalizes well to unseen data.
[0011] In an embodiment, retrieving one or more actual values subsequent to the occurrence of the one or more events includes the steps of retrieving, by the one or more processors, information of the time period when the one or more events are forecasted and filtering, by the one or more processors, the one or more actual values that have a similar time period when the one or more events are forecasted.
[0012] In an embodiment, the method includes the step of rendering at least one of the forecasted one or more events and one or more actual values to a user. Said rendering unit of the processor is configured to notify the user, on the user interface of the user equipment, in real time of the forecasted events. In particular, the rendering unit presents forecasts of future network scenarios. The rendering unit has an interaction enablement unit that allows the user to interact with the graphical interface. The user may zoom in on specific time frames and data points for detailed information, hover over data points, and toggle the visibility of different components (actual, predicted, and forecasted).
[0013] In yet another aspect of the present invention, a non-transitory computer-readable medium has stored thereon computer-readable instructions that, when executed by a processor, cause the processor to retrieve the data pertaining to the operation of the network from an inventory unit. The processor is further configured to train a model with trends/patterns of the historic data. The processor is further configured to forecast, utilizing the trained model, one or more events and retrieve one or more actual values subsequent to the occurrence of the one or more events. Further, the processor is configured to render at least one of the forecasted one or more events and one or more actual values to a user.
[0014] Other features and aspects of this invention will be apparent from the following description and the accompanying drawings. The invention offers several key advantages, including enhanced interpretability, which provides a visual representation of forecasting results, making it easier for users to understand and trust the model's predictions. It also enables comparative analysis, allowing users to compare actual values with both the model's test predictions and forecasted predictions, facilitating a comprehensive assessment of model accuracy. Furthermore, it offers contextual understanding by providing a clear visual representation of how well predictions align with actual values, aiding in interpreting forecasting accuracy. The system helps users with error identification and understanding, enabling them to detect and comprehend discrepancies between predicted and actual values, which is crucial for diagnosing and improving forecasting models. Ultimately, this leads to improved decision-making, empowering users to make more informed choices based on the visual representation of forecasting results, enhancing business strategies and saving time and resources compared to manual analysis. The features and advantages described in this summary and in the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the relevant art, in view of the drawings, specifications, and claims hereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
[0016] FIG. 1 is an exemplary block diagram of an environment for forecasting the events in a communication network, according to one or more embodiments of the present invention;
[0017] FIG. 2 is an exemplary block diagram of a system for forecasting the events in a communication network, according to one or more embodiments of the present invention;
[0018] FIG. 3 is an exemplary architecture of the system of FIG. 2, according to one or more embodiments of the present invention;
[0019] FIG. 4 is an exemplary architecture illustrating the flow for forecasting the events in a communication network, according to one or more embodiments of the present disclosure;
[0020] FIG. 5 is an exemplary signal flow diagram illustrating the flow for forecasting the events in a communication network, according to one or more embodiments of the present disclosure; and
[0021] FIG. 6 is a flow diagram of a method for forecasting the events in a communication network, according to one or more embodiments of the present invention.
[0022] The foregoing shall be more apparent from the following detailed description of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[0023] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
[0024] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure including the definitions listed here below are not intended to be limited to the embodiments illustrated but is to be accorded the widest scope consistent with the principles and features described herein.
[0025] A person of ordinary skill in the art will readily ascertain that the illustrated steps detailed in the figures and here below are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0026] The present invention discloses a system and a method for forecasting events in a communication network, leveraging historical data patterns and trends. The method begins by retrieving data relevant to event forecasting, followed by training a model using the historical data patterns. The processor's training unit is configured to utilize the retrieved data for this purpose. In an alternate embodiment, the training unit processes normalized network data. Once trained, the system forecasts future events by applying the trained model on new data. The forecasting engine generates predictions based on both historical and current trends within the network's operations. The system also incorporates real-time data, allowing the model to refine its predictions using recent or future time period information. Additionally, the system retrieves actual values post-event, using a portion of the data for testing to assess the model's performance, ensuring robust generalization to unseen data. The trained model of the system is configured to capture current trends, perform analysis, and predict or forecast requirements and/or potential issues in the inventory unit.
[0027] To this end, the present subject matter provides techniques for a more interactive interface to explore and analyze forecasting results, enabling consumers to focus on specific data points. The inventive step combines advanced machine learning forecasting models with a user-friendly graphical interface. This integration empowers consumers to visually interpret results, enhancing their understanding and decision-making. This dynamic approach bridges complex ML techniques with actionable, user-friendly insights, revolutionizing forecasting in various applications.
[0028] Referring to FIG. 1, FIG. 1 illustrates an exemplary block diagram of an environment 100 for forecasting the events in a communication network, according to one or more embodiments of the present invention. The environment 100 includes a User Equipment (UE) 102, a server 104, a network 106, and a system 108. A user interacts with the system 108 utilizing the UE 102.
[0029] For the purpose of description and explanation, the description will be explained with respect to one or more User Equipments (UEs) 102, or, to be more specific, with respect to a first UE 102a, a second UE 102b, and a third UE 102c, and should nowhere be construed as limiting the scope of the present disclosure. Each of the at least one UE 102, namely the first UE 102a, the second UE 102b, and the third UE 102c, is configured to connect to the server 104 via the network 106.
[0030] In an embodiment, each of the first UE 102a, the second UE 102b, and the third UE 102c is one of, but not limited to, any electrical, electronic, or electro-mechanical equipment, or a combination of one or more of the above devices, such as smartphones, Virtual Reality (VR) devices, Augmented Reality (AR) devices, laptops, a general-purpose computer, desktop, personal digital assistant, tablet computer, mainframe computer, or any other computing device.
[0031] The network 106 includes, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, or some combination thereof. The network 106 may include, but is not limited to, a Third Generation (3G), a Fourth Generation (4G), a Fifth Generation (5G), a Sixth Generation (6G), a New Radio (NR), a Narrow Band Internet of Things (NB-IoT), an Open Radio Access Network (O-RAN), and the like.
[0032] The network 106 may also include, by way of example but not limitation, at least a portion of one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, process, or a combination thereof, etc. one or more messages, packets, signals, waves, voltage or current levels, some combination thereof, or so forth.
[0033] The environment 100 includes the server 104 accessible via the network 106. The server 104 may include, by way of example but not limitation, one or more of a standalone server, a server blade, a server rack, a bank of servers, a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, a processor executing code to function as a server, one or more machines performing server-side functionality as described herein, or at least a portion of any of the above, or some combination thereof. In an embodiment, the entity may include, but is not limited to, a vendor, a network operator, a company, an organization, a university, a lab facility, a business enterprise side, a defence facility side, or any other facility that provides service.
[0034] The environment 100 further includes the system 108 communicably coupled to the server 104, and the UE 102 via the network 106. The system 108 is adapted to be embedded within the server 104 or is embedded as the individual entity.
[0035] Operational and construction features of the system 108 will be explained in detail with respect to the following figures.
[0036] FIG. 2 is an exemplary block diagram of the system 108 for managing the operations in the network 106, according to one or more embodiments of the present invention.
[0037] As per the illustrated and preferred embodiment, the system 108 for forecasting the events in a communication network 106, includes one or more processors 202, a memory 204, and an inventory unit 206. The one or more processors 202, hereinafter referred to as the processor 202, may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions. However, it is to be noted that the system 108 may include multiple processors as per the requirement and without deviating from the scope of the present disclosure. Among other capabilities, the processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 204.
[0038] As per the illustrated embodiment, the processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 204 as the memory 204 is communicably connected to the processor 202. The memory 204 is configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to manage operations in the network 106. The memory 204 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0039] As per the illustrated embodiment, the inventory unit 206 is configured to store data associated with forecasting the events in a communication network 106. The inventory unit 206 is one of, but not limited to, the Unified Inventory Management (UIM) unit, a centralized database, a cloud-based database, a commercial database, an open-source database, a distributed database, an end-user database, a graphical database, a No-Structured Query Language (NoSQL) database, an object-oriented database, a personal database, an in-memory database, a document-based database, a time series database, a wide column database, a key value database, a search database, a cache database, and so forth. The foregoing examples of inventory unit 206 types are non-limiting and may not be mutually exclusive, e.g., the database can be both commercial and cloud-based, or both relational and open-source, etc.

[0040] In one embodiment, the inventory unit 206, such as, a Unified Inventory Management (UIM) unit is a standard based telecommunications inventory management application that enables users to model and manage customers, services, and resources. The UIM unit serves as the backbone of the network 106. The inventory unit 206 stores the logical and physical inventory data of every asset, device, node, and application. In particular, the inventory unit 206 serves as a central repository for storing customer related information. The customer related information includes at least one of, but not limited to, a name, an address, a location, a mobile number, subscription plans and a number of customers in the network 106.
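By way of a non-limiting illustration only, the following sketch shows one way the customer related information described above might be represented and counted in software. The field names, the dataclass, and the in-memory list are illustrative assumptions and do not form part of the claimed inventory unit 206.

from dataclasses import dataclass

@dataclass
class CustomerRecord:
    # Illustrative fields mirroring the customer related information described above.
    name: str
    address: str
    location: str
    mobile_number: str
    subscription_plan: str

# Hypothetical in-memory stand-in for the inventory unit 206.
inventory = [
    CustomerRecord("A. Kumar", "Navi Mumbai", "Maharashtra", "+91-9000000001", "prepaid-1.5GB"),
    CustomerRecord("B. Singh", "Ahmedabad", "Gujarat", "+91-9000000002", "postpaid-unlimited"),
]

# Number of customers in the network at a given instant.
print(len(inventory))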
[0041] As per the illustrated embodiment, the system 108 includes the processor 202 to manage operations in the network 106. The processor 202 includes a retrieving unit 208, a training unit 210, a forecasting engine 212, a rendering unit 216, an interaction enabling unit 218, and a display device 220.
[0042] The processor 202 is communicably coupled to the one or more components of the system 108, such as the inventory unit 206 and the memory 204. In an embodiment, the operations and functionalities of a retrieving unit 208, a training unit 210, a trained model 211, a forecasting engine 212, a rendering unit 216, an interaction enabling unit 218, a display device 220, and the one or more components of the system 108 can be used in combination or interchangeably.
[0043] In an embodiment, the retrieving unit 208 of the processor 202 is configured to retrieve the data pertaining to forecasting the events in a communication network 106 from the inventory unit 206. In one embodiment, the data is at least one of, but not limited to, a number of customers available in the network 106 at a given instant. In particular, the operation corresponds to at least one of, but not limited to, a customer provisioning order and a customer migration order. In one embodiment, the customer provisioning order is the process of adding customers to the network 106 and assigning telephony services to the customers. The customer provisioning order is also referred to as subscriber provisioning. In one embodiment, the customer migration order is the process of transferring customers from an existing subscriber management system to another system.
[0044] In an embodiment, the retrieving unit 208 of the processor 202 is configured to retrieve historic data from one or more data sources 302 for the forecasting task. The historic data may include time-series information, such as past sales, temperatures, or any other relevant metrics.
[0045] Said one or more data sources include at least one of the data sources within a telecommunication network 302a and the data sources outside the telecommunication network 302b. The data sources within the telecommunication network 302a include at least one of network performance data, subscriber data, and device data. The network performance data provides historical data that includes metrics such as latency, bandwidth usage, packet loss, and error rates to indicate the operational status of the telecommunications network. The subscriber data includes data about users of the telecommunications services, such as account details, usage patterns, billing information, and service preferences. The device data refers to data collected from various devices connected to the network, such as mobile phones, routers, and Internet of Things (IoT) devices. The device data includes information on device performance, connectivity status, and usage statistics, which are essential for managing network resources effectively. These internal data are vital for the telecommunications provider to maintain service quality, optimize network performance, and enhance user experience.
[0046] The data sources outside the telecommunication network 302b include at least one of competitor data, social media data, customer feedback, and surveys. The competitor data includes information regarding competitors' performance, pricing strategies, and market positioning. This data is crucial for benchmarking and developing competitive strategies. The social media data includes data gathered from social media platforms that can reveal customer sentiment, trends, and public perception of the telecommunications services. The customer feedback and surveys include direct feedback from customers, collected through surveys, reviews, and other channels.
[0047] In one embodiment, the retrieving unit 208 may utilize one of techniques such as, but not limited to, Database Extraction, ETL (Extract, Transform, Load) Tools, Application Programming Interface (API) Integration, Web Scraping, Real-Time Data Streaming, and Query Languages to retrieve the data from the one or more data sources.
[0048] In the database extraction technique, the retrieving unit 208 connects to a database using a database client or programming language to execute queries and retrieve data. For instance, the retrieving unit 208 pulls historic data, such as time-series information including past sales, temperatures, or any other relevant metrics, from the one or more data sources. This data is crucial for analyzing customer behaviour and identifying usage patterns. The ETL tools are utilized to extract data from multiple data sources, handling various data formats and making them ideal for analyzing the consolidated data. For example, the ETL tool connects to network management systems via APIs to pull real-time performance metrics, extract billing data from SQL databases, and scrape social media platforms for customer feedback.
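As a hedged illustration of the database extraction technique described above, the following sketch pulls historic time-series records into a dataframe with an SQL query. The SQLite file name, table name, and column names are assumptions made purely for illustration and are not part of the specification.

import sqlite3
import pandas as pd

# Hypothetical database file and table holding historic time-series metrics.
connection = sqlite3.connect("inventory.db")

query = """
    SELECT observed_at, metric_name, metric_value
    FROM historic_metrics
    WHERE observed_at >= '2023-01-01'
    ORDER BY observed_at
"""

# Execute the query and load the result set for downstream training.
historic_data = pd.read_sql_query(query, connection, parse_dates=["observed_at"])
connection.close()

print(historic_data.head())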

[0049] The API integration allows the retrieving unit 208 to access data from data services by making HTTP requests to API endpoints, enabling real-time data retrieval. For instance, the retrieving unit 208 pulls logs from cloud services or third-party monitoring tools via their APIs. The Web scraping involves writing scripts to extract data from web pages. For example, extracting data such as pricing of data plan, data limits, contract terms, and promotional offers from web pages. The real-time data streaming techniques facilitate continuous data retrieval from sources that provide live feeds, such as IoT devices or social media platforms. Technologies like Apache Kafka or AWS Kinesis may be employed to manage these streams, allowing applications to process and analyze data as it arrives.
[0050] In an alternate embodiment, subsequent to retrieving the stored data pertaining to the operation of the network 106 from the inventory unit 206, the processor 202 is configured to normalize the data retrieved by the retrieving unit 208. In particular, the processor 202 may include a normalizer to preprocess the retrieved data. The normalizer performs at least one of, but not limited to, data normalization. The data normalization is the process of at least one of, but not limited to, reorganizing the retrieved data, removing the redundant data within the retrieved data, formatting the retrieved data, and removing null values from the retrieved data. The main goal of the normalizer is to achieve a standardized data format across the entire system 108. The normalizer ensures that the normalized data is stored appropriately in the inventory unit 206 for subsequent retrieval and analysis. In one embodiment, the data retrieved by the retrieving unit 208 is normalized by the normalizer of the processor 202.
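A minimal sketch, continuing the earlier illustration, of the kind of data normalization the normalizer is described as performing (reorganizing, removing redundant rows, formatting, and removing null values). The use of pandas and the "observed_at" column name are assumptions for illustration only.

import pandas as pd

def normalize(retrieved: pd.DataFrame) -> pd.DataFrame:
    """Illustrative preprocessing mirroring the normalizer described above."""
    data = retrieved.copy()
    # Remove redundant (duplicate) rows within the retrieved data.
    data = data.drop_duplicates()
    # Remove rows that contain null values.
    data = data.dropna()
    # Format the data: consistent column names and a time-ordered layout.
    data.columns = [c.strip().lower().replace(" ", "_") for c in data.columns]
    data["observed_at"] = pd.to_datetime(data["observed_at"])
    return data.sort_values("observed_at").reset_index(drop=True)

# historic_data as retrieved in the earlier sketch.
normalized = normalize(historic_data)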
[0051] In an embodiment, the training unit 210 of the processor 202 is configured to train a model with trends/patterns of the historic data utilizing the retrieved data pertaining to the operation of forecasting the events in a communication network 106. In an embodiment, in order to train the model, the training unit 210 first selects the model from a plurality of models depending on a type of the historic data selected to train the model. For example, if the type of the historic data selected to train the model is related to alarm data, an appropriate model is selected from the plurality of models which is suitable for training with the historic data of the type related to the alarm data. In an alternate embodiment, a user may select the model for training from the plurality of available models based on the type of historic data selected to train the model. In an alternate embodiment, the training unit 210 of the processor 202 is configured to train the model on the normalized data associated with the operation of the network 106.
[0052] For example, the training unit 210 trains the model with trends and patterns of the historic data. The process of training the model is completed by identifying trends/patterns from the historic data and enabling the model to learn the identified trends/patterns of the historic data.
[0053] Further, the training unit 210 is configured to assess the performance of the trained model by feeding the trained model with test data. For example, to assess the performance of the trained model, a variety of evaluation measures are available. Said evaluation measures may include F1-score, ROC curves, accuracy, precision, and recall for classification. The user might use R-squared, mean squared error (MSE), or root mean squared error (RMSE) for regression. Further, to ascertain whether the model performs to the necessary level and is prepared for deployment in practical applications on real data, the user may analyze these metrics.
[0054] While training, the model tracks and monitors the retrieved data pertaining to the operation of forecasting the events in a communication network. Further, the model learns at least one of, but not limited to, trends and patterns associated with the operation of the network 106. For example, the training unit 210 of the system 108 selects an appropriate model, such as at least one of, but not limited to, time series models, regression models, ensemble models, or decision tree models, from a set of available options depending on the type of historic data selected to train the model. Thereafter, the selected model is trained using the normalized data. In one embodiment, the selected model is trained on historical data associated with the operation of the network 106.
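As a hedged sketch of the training and assessment described above (learning trends from historic data, holding out a portion for testing, and evaluating with MSE/RMSE), the following assumes a simple regression model over lagged values of a "metric_value" column in the normalized data. The specification does not limit the model to this choice; it is one illustrative option among the time series, regression, ensemble, and decision tree families mentioned above.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

# Assume a univariate, time-ordered series of normalized metric values.
series = normalized["metric_value"].to_numpy()

# Build simple lag features so a regression model can learn the trend/pattern.
lags = 3
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]

# Reserve the most recent portion of the data for testing.
split = int(len(y) * 0.8)
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

model = LinearRegression().fit(X_train, y_train)

# Assess performance on the held-out test data.
test_predictions = model.predict(X_test)
rmse = mean_squared_error(y_test, test_predictions) ** 0.5
print(f"RMSE on test data: {rmse:.3f}")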
[0055] In an embodiment, the trained model 211 is at least one of, but not limited to, a generative Artificial Intelligence (AI) model. In particular, the generative AI model utilizes at least one of, but not limited to, deep learning, neural networks, and machine learning to forecast one or more events on new data.
[0056] In an embodiment, upon training, the trained model 211 is utilized by the forecasting engine 212 of the processor 202, which is configured to forecast one or more events. For example, the forecasting engine 212 of the processor 202, utilizing the trained model 211, forecasts the one or more events on new data by receiving new data or information of future time periods; and the forecasting engine 212, utilizing the trained model 211, forecasts the one or more events based on the trends/patterns of the historic data in response to receiving the new data (recent data) or the information of the future time periods.
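Continuing the illustrative sketch above, one way a forecasting engine 212 might use a trained model 211 on new data is to forecast future time periods iteratively, feeding each prediction back in as the newest lag. The function below reuses the hypothetical model and lags defined in the previous sketch and is only an assumption, not the claimed forecasting engine.

def forecast(model, recent_values, horizon):
    """Iteratively forecast `horizon` future periods from the latest observations."""
    window = list(recent_values)
    forecasts = []
    for _ in range(horizon):
        next_value = float(model.predict([window[-lags:]])[0])
        forecasts.append(next_value)
        window.append(next_value)  # the forecast becomes part of the new data
    return forecasts

# Forecast, for example, the next 10 periods from the most recent observations.
forecasted_values = forecast(model, series[-lags:], horizon=10)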
[0057] The forecasting engine 212 is a system or tool that utilizes historical data, and statistical models to predict future events or trends. Said engines 212 can forecast various scenarios, such as sales figures, market demand, weather conditions, or equipment failures, by identifying patterns within past data and extrapolating them into the future.
[0058] In an embodiment, the forecasting engine 212, by use of the trained model 211, may be used in industries such as supply chain management for predicting inventory levels or demand fluctuations, and finance for predicting stock prices or market trends, where accurate predictions are crucial.
[0059] In one embodiment, the future trends are general changes in one variable compared to another variable over a period of time. For example, the trends pertain to changes in the number of the plurality of customers within the network 106. In particular, based on training, the trained model 211 enables the forecasting engine 212 to generate at least one of the historical trends and current trends associated with the operation of the network 106 by applying one or more logics. Herein, the historical trends and the current trends correspond to at least one of the number of customers available in at least one of the circles of the network 106 and the overall network 106.
[0060] In one embodiment, the one or more logics may include at least one of, but not limited to, a k-means clustering, a hierarchical clustering, a Principal Component Analysis (PCA), an Independent Component Analysis (ICA), a deep learning logics such as Artificial Neural Networks (ANNs), a Convolutional Neural Networks (CNNs), a Recurrent Neural Networks (RNNs), a Long Short-Term Memory Networks (LSTMs), a Generative Adversarial Networks (GANs), a Q-Learning, a Deep Q-Networks (DQN), a Reinforcement Learning Logics, etc.
[0061] Further, the retrieving unit 208 of the processor 202 is configured to retrieve one or more actual values subsequent to the occurrence of one or more events by retrieving information about the time period when the one or more events are forecasted and filtering the one or more actual values that have a similar time period when the one or more events are forecasted. Said one or more actual values are values when the one or more events occurred. For example, if the event is forecasting a spike in temperature at 3 PM, the retrieving unit 208 identifies the time period between 2:30 PM and 3:30 PM. It queries actual temperature readings within this window. Actual values retrieved might show temperatures recorded at 2:45 PM, 3:00 PM, and 3:15 PM, which can then be compared to the forecasted spike.
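A minimal sketch of the filtering step in the temperature example above: given the time period of a forecasted event, actual readings falling within a similar time window are retrieved. The column names, the 30-minute window, and the example timestamp are assumptions for illustration.

import pandas as pd

def actual_values_around(readings: pd.DataFrame, event_time: pd.Timestamp,
                         window: pd.Timedelta = pd.Timedelta(minutes=30)) -> pd.DataFrame:
    """Filter actual readings that fall within the forecasted event's time period."""
    start, end = event_time - window, event_time + window
    mask = (readings["observed_at"] >= start) & (readings["observed_at"] <= end)
    return readings.loc[mask]

# For an event forecasted at 3 PM, retrieve actuals between 2:30 PM and 3:30 PM.
actuals = actual_values_around(normalized, pd.Timestamp("2023-10-06 15:00"))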
[0062] In one embodiment, the rendering unit 216 collects the actual values for the same time periods covered by the forecasts, which are the true values against which the model's predictions will be compared. The rendering unit 216 integrates the collected values with a graphical interface. The graphical interface accommodates the presentation of actual values, test predictions, and forecasted predictions. This interface serves as the visual platform for users to analyze the results. The rendering unit 216 further includes an interaction enabling unit 218, wherein said interaction enabling unit 218 is configured to be communicably connected with a display device 220. The display device 220 populates the interface with the actual values, model predictions, and forecasted values with the help of the interaction enabling unit 218. Each point on the graph represents a specific time period with the corresponding value. The users can interact with the graphical interface. They may zoom in on specific time frames and data points for detailed information, hover over data points, and toggle the visibility of different components (actual, predicted, and forecasted).
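As a hedged illustration of the graphical interface described above, the sketch below plots the actual, test-prediction, and forecasted traces with plotly, whose built-in hover, zoom, and legend-toggle behaviour approximates the described interactions. It reuses the hypothetical series, split, lags, test_predictions, and forecasted_values from the earlier sketches and assumes daily observations; none of this represents the claimed rendering unit 216.

import pandas as pd
import plotly.graph_objects as go

# Illustrative time axes for the actual, test-prediction, and forecast traces.
actual_index = normalized["observed_at"]
test_index = actual_index.iloc[split + lags:]
future_index = pd.date_range(actual_index.iloc[-1], periods=10 + 1, freq="D")[1:]

fig = go.Figure()
fig.add_trace(go.Scatter(x=actual_index, y=series, mode="lines", name="actual"))
fig.add_trace(go.Scatter(x=test_index, y=test_predictions, mode="lines", name="predicted (test)"))
fig.add_trace(go.Scatter(x=future_index, y=forecasted_values, mode="lines", name="forecasted"))

# Hovering shows each data point; dragging zooms in on a time frame; clicking a
# legend entry toggles the visibility of the actual, predicted, or forecasted component.
fig.update_layout(hovermode="x unified", title="Actual vs. predicted vs. forecasted")
fig.show()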
[0063] For example, the forecasting engine 212 predicts the trend pertaining to the number of customers that will be added to the network 106 in the future, based on time-series information such as past sales, temperatures, or any other relevant metrics, and network operators can take one or more actions accordingly.
[0064] The retrieving unit 208, the training unit 210, the trained model 211, the forecasting engine 212, the rendering unit 216, and the display 220 are implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor 202. In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor 202 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor may comprise a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the memory 204 may store instructions that, when executed by the processing resource, implement the processor 202. In such examples, the system 108 may comprise the memory 204 storing the instructions and the processing resource to execute the instructions, or the memory 204 may be separate but accessible to the system 108 and the processing resource. In other examples, the processor 202 may be implemented by electronic circuitry.
[0065] FIG. 3 illustrates an exemplary architecture for the system 108, according to one or more embodiments of the present invention. More specifically, FIG. 3 illustrates the system 108 for forecasting the events in the communication network. It is to be noted that the embodiment with respect to FIG. 3 will be explained with respect to the UE 102 for the purpose of description and illustration and should nowhere be construed as limiting the scope of the present disclosure.
[0066] FIG. 3 shows communication between the UE 102 and the system 108. For the purpose of description of the exemplary embodiment as illustrated in FIG. 3, the UE 102 uses a network protocol connection to communicate with the system 108. In an embodiment, the network protocol connection is the establishment and management of communication between the UE 102 and the system 108 over the network 106 (as shown in FIG. 1) using a specific protocol or set of protocols. The network protocol connection includes, but is not limited to, Session Initiation Protocol (SIP), System Information Block (SIB) protocol, Transmission Control Protocol (TCP), User Datagram Protocol (UDP), File Transfer Protocol (FTP), Hypertext Transfer Protocol (HTTP), Simple Network Management Protocol (SNMP), Internet Control Message Protocol (ICMP), Hypertext Transfer Protocol Secure (HTTPS), and Terminal Network (TELNET).
[0067] In an embodiment, the UE 102 includes a primary processor 303, a memory 304, and a Graphical User Interface (GUI) 306. In alternate embodiments, the UE 102 may include more than one primary processor 303 as per the requirement of the network 106. In one embodiment, the user may interact with the Graphical User Interface (GUI) 306 of the UE to at least one of zoom in to the specific time period of a data point, hover over data points, and toggle visibility of the one or more parameters including actual, forecasted, and predicted. The primary processor 303 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions.
[0068] In an embodiment, the primary processor 303 is configured to fetch and execute computer-readable instructions stored in the memory 304. The memory 304 may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to manage operations for forecasting the events in the communication network. The memory 304 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0069] In an embodiment, the User Interface (UI) 306 includes a variety of interfaces, for example, a graphical user interface, a web user interface, a Command Line Interface (CLI), and the like. The User Interface (UI) 306 allows the user to transmit the request to the system 108 for performing the operation. In one embodiment, the user may include at least one of, but not limited to, a network operator. In accordance with the exemplary embodiment, let us assume the inventory unit 206 is hosted on the server 104. The inventory unit 206 is configured to respond to all the requests received from the UE 102. Based on the requests, the inventory unit 206 performs the operation, such as storing the customer details. Further, the inventory unit 206 stores the data pertaining to the operation it performs.
[0070] Further, FIG. 3 shows that the processor 202 establishes the one or more connections with the one or more data sources 302. The processor 202 is a critical component in the system 108, designed to interact with various data sources, referred to as one or more data sources 302, analyze data, and detect anomalies by utilizing the at least one trained model 211 with respect to training statistics data and performance data for each of the plurality of trained models. Further, the processor 202 enables the trained model 211 to learn the historic performance of the one or more anomaly detections with respect to similar data. The results of this learnt historic performance are stored in the inventory unit 206.
[0071] In an embodiment, the connection of the processor 202 with the one or more data sources 302 is facilitated through the use of Application Programming Interfaces (APIs). The APIs used can vary in type, including RESTful APIs, SOAP APIs, or other custom APIs designed for specific data sources. Each type has its own set of rules and protocols for communication, which the processor 202 must adhere to when establishing connections. By using APIs, the processor 202 is configured to connect with a wide range of data sources, regardless of their underlying technology or architecture. This interoperability is essential for modern systems that rely on diverse data inputs. The ability to establish multiple connections through the APIs allows the system to scale efficiently. As the demand for data increases, the processor 202 can connect to additional data sources without significant changes to the underlying architecture. The APIs enable real-time access to data, allowing the processor 202 to retrieve the most current information from the data sources 302. This is particularly important for applications that require up-to-date data for decision-making or analysis. The APIs often include authentication and authorization mechanisms, ensuring that only authorized users or systems can access the data sources 302. This adds a layer of security to the data exchange process.
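By way of a hedged illustration of API-based retrieval with authentication as outlined above, the sketch below issues an authenticated HTTP request with the requests library. The endpoint URL, bearer-token scheme, query parameters, and response structure are all assumptions for illustration and do not describe any particular data source 302.

import requests

# Hypothetical RESTful endpoint exposing network performance metrics.
BASE_URL = "https://example.invalid/api/v1/metrics"
headers = {"Authorization": "Bearer <access-token>"}  # assumed bearer-token scheme

response = requests.get(
    BASE_URL,
    headers=headers,
    params={"metric": "latency_ms", "from": "2023-10-01", "to": "2023-10-06"},
    timeout=30,
)
response.raise_for_status()

# The payload structure below is assumed purely for illustration.
metrics = response.json().get("results", [])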
[0072] The one or more data sources 302 include, by way of example but not limitation, data sources within the telecommunication network 302a and data sources outside the telecommunication network 302b. The data sources within the telecommunication network 302a include at least one of network performance data, subscriber data, and device data. The data sources outside the telecommunication network 302b include at least one of competitor data, social media data, customer feedback, and surveys.
[0073] As mentioned earlier in FIG. 2, the system 108 includes the processor 202, the memory 204, and the inventory unit 206 for forecasting the events of the communication network, which are already explained in FIG. 2. For the sake of brevity, a similar description related to the working and operation of the system 108 as illustrated in FIG. 2 has been omitted to avoid repetition.
[0074] The limited description provided for the system 108 in FIG. 3 should be read with the description provided for the system 108 in FIG. 2 above and should not be construed as limiting the scope of the present disclosure.
[0075] FIG. 4 is an architecture illustrating the flow for forecasting the events of the communication network, according to one or more embodiments of the present disclosure.
[0076] In one embodiment, the architecture 400 includes a user equipment (UE) 102, a workflow manager (WFM) 404, a database 406, the trained model 211, a forecast output generation 410, and a visualization unit 412. Initially, the UE 102 transmits the request to the WFM 404 in order to perform a particular operation. Herein, the WFM unit 404 is the inventory unit 206 configured to generate the forecast output. Said workflow manager unit 404 is a crucial component in systems that automate and streamline processes, ensuring that tasks are executed efficiently and in the correct order. Said database 406 is configured to store data and provides fast access to and updating of data. As per the illustrated embodiment, the database 406 is specifically configured to store data associated with the operation performed in the architecture 400. The database 406 is one of, but not limited to, the Unified Inventory Management (UIM) unit, a centralized database, a cloud-based database, a commercial database, an open-source database, a distributed database, an end-user database, a graphical database, a No-Structured Query Language (NoSQL) database, an object-oriented database, a personal database, an in-memory database, a document-based database, a time series database, a wide column database, a key value database, a search database, a cache database, and so forth. The foregoing examples of database 406 types are non-limiting and may not be mutually exclusive, e.g., the database can be both commercial and cloud-based, or both relational and open-source, etc.
[0077] In an exemplary embodiment, the database 406 serves as a central hub for all the data associated with system 108, providing unified and accessible data for analysis. The database 406 is configured to accommodate various data types, including structured data such as databases, semi-structured data such as JSON or XML files, and unstructured data such as text documents, images, and videos. This capability is crucial for telecommunications systems, which generate a wide array of data from different sources.

[0078] Further, the data is fed to the model 211 for training. Thereafter, the trained model 211 generates the historical trends and the current trends pertaining to the operation and detects the pattern between the generated historical trends and the current trends. Based on the detected pattern, the trained model 211 predicts the future trends pertaining to the operation associated with the WFM unit 404. For example, the trained model 211 predicts future trends pertaining to the overall growth of the customers in the network 106 or circle-wise customer growth in the network 106 based on the operation performed by the WFM unit 404.
[0079] The forecast output generation 410 utilizes the trained model 211 with new data to generate forecasts. This involves inputting recent or future time periods to obtain predictions. The forecast output generation 410 is a critical component of the forecasting system, responsible for producing actionable insights based on predictive models. The primary goal of this unit is to take the results from predictive models and generate forecasts that can be used for decision-making in various applications, such as supply chain management, finance, and resource planning.
[0080] The visualization unit 412 integrates with the forecast output generation 410 to get forecasted data and, by combining multiple visualization APIs, plots the output on the UE 102. The UE 102 is configured to have a graphical interface that accommodates the presentation of actual values, test predictions, and forecasted predictions. Said interface may serve as the visual platform for users to analyze the results.
[0081] FIG. 5 is a signal flow diagram illustrating the flow for forecasting the events in a communication network, according to one or more embodiments of the present disclosure.
[0082] At step 502, the UE 102 transmits the request to the inventory unit 206 in order to perform forecasting of the events in a communication network.
[0083] At step 504, the inventory unit 206 stores the data related to the operations for forecasting the events in a communication network. In one embodiment, the inventory unit 206 stores the data related to the operations in any appropriate predefined format. Said data may be historic data, where said historic data may include time-series information, such as past sales, temperatures, or any other relevant metrics.
[0084] At step 506, the inventory unit 206 transmits the data related to the operations to the training unit 210, where the training unit 210 trains the model 211 of the processor 202. In an alternate embodiment, the processor 202 retrieves the data related to the operations from the inventory unit 206.
[0085] At step 508, the forecasting engine 212 of the processor 202, utilizing the trained model 211, extracts trends/patterns from the historic data and enables the model to learn the identified trends/patterns of the historic data. The forecasting engine 212 is configured to predict future events or trends, and also forecast various scenarios, such as sales figures, market demand, weather conditions, or equipment failures, by identifying patterns within past data and extrapolating them into the future.
[0086] At step 510, the forecast output generation unit 410 utilizes the trained model 211 on new data to generate forecasts for future time periods. The primary goal of this unit is to take the results from predictive models and generate forecasts that can be used for decision-making in various applications, such as supply chain management, finance, and resource planning.
[0087] At step 512, the actual values are collected by the processor 202 for the same time periods covered by the forecasts, which are the true values against which the model's predictions will be compared.
[0088] At step 514, the processor 202 transmits the predicted future trends to the visualization unit 412. At step 516, the visualization unit 412 populates the interface with the actual values, model predictions, and forecasted values, and transmits them to the UE 102. The UE 102 displays each point on the graph, where each point represents a specific time period with the corresponding value. The graphical interface of the UE 102 allows the users to zoom in on specific time frames and data points for detailed information, and toggle the visibility of different components (actual, predicted, and forecasted). Based on the predicted future trends, the user may plan for one or more actions in order to avoid degradation of the network 106.
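Tying the signal flow of FIG. 5 together, the self-contained sketch below mirrors the sequencing of steps 502 to 516 under the same illustrative assumptions as the earlier sketches (a simple lag-based regression over a "metric_value" column). The function boundaries and column names are assumptions, not the claimed architecture.

import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

def run_forecasting_flow(historic: pd.DataFrame, horizon: int = 10, lags: int = 3):
    """Illustrative pipeline loosely mirroring steps 502-516 of FIG. 5."""
    # Steps 502-506: data related to the operation, as retrieved from the inventory unit.
    series = historic.sort_values("observed_at")["metric_value"].to_numpy()

    # Step 508: learn trends/patterns from the historic data.
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    model = LinearRegression().fit(X, y)

    # Step 510: generate forecasts for future time periods from the newest data.
    window = list(series[-lags:])
    forecasts = []
    for _ in range(horizon):
        nxt = float(model.predict([window[-lags:]])[0])
        forecasts.append(nxt)
        window.append(nxt)

    # Steps 512-516: the actual values collected later, together with these
    # forecasts, would then be handed to the visualization unit for display.
    return forecasts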
[0089] FIG. 6 is a flow diagram of a method 600 for forecasting the events in the communication network, according to one or more embodiments of the present invention. For the purpose of description, the method 600 is described with the embodiments as illustrated in FIG. 2 and should nowhere be construed as limiting the scope of the present disclosure.
[0090] At step 602, the method 600 includes the step of retrieving, by the one or more processors, historic data from one or more data sources. In one embodiment, the retrieving unit 208 of the processor 202 is configured to receive data from one or more data sources including at least one of the data sources within a telecommunication network and the data sources outside the telecommunication network.
[0091] For instance, the retrieving unit 208 pulls historic data, such as time-series information including past sales, temperatures, or any other relevant metrics, from the one or more data sources. This data is crucial for analyzing customer behavior and identifying usage patterns. The ETL tools are utilized to extract data from multiple data sources, handling various data formats and making them ideal for analyzing the consolidated data. For example, the ETL tool connects to network management systems via APIs to pull real-time performance metrics, extract billing data from SQL databases, and scrape social media platforms for customer feedback.
[0092] At step 604, the method 600 includes the step of training, by the one or more processors, a model with trends/patterns of the historic data. In one embodiment, the training unit 210 of the processor 202 is configured to train the model with trends/patterns of the historic data utilizing the retrieved data pertaining to the operation of forecasting the events in a communication network 106. In an alternate embodiment, the training unit 210 of the processor 202 is configured to train the model on the normalized data associated with the operation of the network 106.
[0093] In an embodiment, training a model with trends/patterns of the historic data includes the steps of identifying trends/patterns from the historic data and enabling the model to learn the identified trends/patterns of the historic data through the utilization of the one or more processors. Further, training of the model includes the step of assessing the performance of the trained model by feeding the trained model with test data. The performance assessment of the training is completed by using the one or more processors.
[0094] At step 606, the method 600 includes the step of forecasting, by the one or more processors, utilizing the trained model 211, one or more events. In one embodiment, the forecasting engine 212 utilizes the trained model 211 to generate the historical and current trends associated with the operation of the network 106.
[0095] In an embodiment, the step of forecasting, utilizing the trained model, one or more events, includes the step of receiving, by the one or more processors, new data or information of future time periods.
[0096] In an embodiment, the one or more events are forecasted, by the one or more processors utilizing the trained model, based on the trends/patterns of the historic data in response to receiving the new data or the information of the future time periods. In an embodiment, the one or more actual values are the values when the one or more events occurred.
[0097] For example, the forecasting engine 212 generates trends pertaining to the number of customers added in the network 106 in the last three months and the number of customers added in the network 106 in a current month.
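As a hypothetical illustration of this example, the sketch below fits a trend to the customer additions of the last three months and extrapolates it to the current month; the numbers are placeholders, and the real forecasting engine 212 would instead apply the trained model 211.

```python
# Hypothetical customer-addition figures for the last three months.
import numpy as np

monthly_additions = np.array([12000.0, 12500.0, 13100.0])   # last three months
t = np.arange(len(monthly_additions))
slope, intercept = np.polyfit(t, monthly_additions, deg=1)   # trend over past months

current_month_index = len(monthly_additions)                  # the new time period
forecast_current_month = slope * current_month_index + intercept
print(round(forecast_current_month))  # forecasted customer additions this month
```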
[0098] At step 608, the method 600 includes the step of retrieving, by the one or more processors, one or more actual values subsequent to occurrence of the one or more events. The processor may fetch an actual value reserved for testing purposes. This is done to assess the trained model's performance using a portion of the data reserved for testing. This step ensures that the model generalizes well to unseen data.
[0099] In an embodiment, retrieving, one or more actual values subsequent to occurrence of the one or more events, includes the steps of retrieving, by the one or more processors, information of time period when the one or more events are forecasted; and filtering, by the one or more processors, the one or more actual values which have a similar time period when the one or more events are forecasted.
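A minimal sketch of this filtering step is shown below, assuming the actual values are keyed by a time-period label; the period labels and values are illustrative only.

```python
# Non-limiting sketch: keep only the actual values whose time period matches
# a forecasted time period (placeholder data).
forecasted_periods = ["2024-07", "2024-08", "2024-09"]
actuals_store = {"2024-06": 980, "2024-07": 1010, "2024-08": 995, "2024-09": 1032}

actuals_for_comparison = {
    period: value
    for period, value in actuals_store.items()
    if period in forecasted_periods
}
```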
[00100] At step 610, the method 600 includes the step of rendering, by the one or more processors, at least one of, the forecasted one or more events and the one or more actual values to a user. In one embodiment, the rendering unit 216 of the processor 202 is configured to notify the user, on the user interface 306 of the UE 102 and in real time, of the forecasted events. In particular, the rendering unit 216 renders forecasts of future network scenarios. For example, forecast outcomes for the next few days, such as 10 days, are notified to the user. Further, the rendering unit 216 has an interaction enabling unit 218 which allows the user to interact with the graphical interface. The user may zoom in on specific time frames, hover over data points for detailed information, and toggle the visibility of different components (actual, predicted, forecasted).
[00101] In an embodiment, the step of rendering the forecasted one or more events and the one or more actual values to a user further includes the step of displaying, by the one or more processors, on a display device, data points pertaining to at least one of, the forecasted one or more events, the one or more actual values, testing data predictions and forecasted values, wherein each data point represents a specific time period and a corresponding value.
[00102] In an embodiment, the one or more processors enable a user to interact with the display device to zoom in to the specific time period of a data point and toggle the visibility of one or more parameters including actual, forecasted and predicted.
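One possible way to provide such interaction, shown purely as a non-limiting sketch, is an interactive plotting toolkit such as Plotly, in which zooming and legend-click toggling of the actual/predicted/forecasted traces are available by default; the data below is placeholder data and the toolkit choice is an assumption, not part of the claimed method.

```python
# Illustrative interactive rendering with zoom and legend-toggle (placeholders).
import plotly.graph_objects as go

periods = ["2024-01", "2024-02", "2024-03", "2024-04"]
actual = [100, 104, 101, 108]
predicted = [99, 103, 103, 107]
forecast_periods = ["2024-05", "2024-06"]
forecasted = [110, 112]

fig = go.Figure()
fig.add_trace(go.Scatter(x=periods, y=actual, name="Actual", mode="lines+markers"))
fig.add_trace(go.Scatter(x=periods, y=predicted, name="Predicted", mode="lines+markers"))
fig.add_trace(go.Scatter(x=forecast_periods, y=forecasted, name="Forecasted",
                         mode="lines+markers", line=dict(dash="dash")))
fig.update_layout(xaxis_title="Time period", yaxis_title="Metric value")
fig.show()
```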
[00103] The present invention further discloses a non-transitory computer-readable medium having stored thereon computer-readable instructions. The computer-readable instructions are executed by the processor 202. The processor 202 is configured to retrieve the data pertaining to the operation of the network 106 from an inventory unit 206. The processor 202 is further configured to train a model 211 with trends/patterns of the historic data. The processor 202 is further configured to forecast, utilizing the trained model, one or more events, and to retrieve one or more actual values subsequent to occurrence of the one or more events. Further, the processor 202 is configured to render at least one of the forecasted one or more events and the one or more actual values to a user.
[00104] A person of ordinary skill in the art will readily ascertain that the illustrated embodiments and steps in description and drawings (FIG.1-6) are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[00105] The present disclosure provides technical advancements including forecasting network performance and providing an anomaly detection feature. The technique reduces the burden of manual work to detect anomalies, which saves time for the network operators/administrators. The technique utilizes a trained model to identify trends for a specific network, procedure, or clear code that do not fit the normal pattern. Furthermore, the techniques enable proactive monitoring of the network elements in the communication network. The system learns from the feed, identifies deviations from past behavior, and notifies anomalies to the network operators/administrators.
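As a non-limiting sketch of the deviation check described above, the snippet below flags a new metric value as anomalous when it departs strongly from the learned historic behavior; the z-score threshold and sample values are assumptions introduced for illustration only.

```python
# Non-limiting sketch: flag a value that deviates from past behavior.
import numpy as np

def is_anomalous(history: np.ndarray, new_value: float, z_threshold: float = 3.0) -> bool:
    """Return True when the new value deviates strongly from the historic mean."""
    mean, std = history.mean(), history.std()
    if std == 0:
        return new_value != mean
    return abs(new_value - mean) / std > z_threshold

history = np.array([200.0, 205.0, 198.0, 202.0, 207.0, 199.0])
print(is_anomalous(history, 260.0))  # True -> notify the network operator/administrator
```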
[00106] The present invention offers multiple advantages over the prior art, and the above listed are a few examples to emphasize some of the advantageous features. The listed advantages are to be read in a non-limiting manner.

REFERENCE NUMERALS
[00107] Environment - 100;
[00108] User Equipment (UE) - 102;
[00109] Server - 104;
[00110] Network- 106;
[00111] System -108;
[00112] Processor - 202;
[00113] Memory - 204;
[00114] Inventory unit – 206;
[00115] Retrieving unit – 208;
[00116] Training unit – 210;
[00117] Trained model– 211;
[00118] Forecasting Engine – 212;
[00119] Rendering unit - 216;
[00120] Interaction Enabling Unit – 218;
[00121] Display Device – 220;
[00122] One or more data sources - 302;
[00123] Telecommunication network - 302a;
[00124] Outside of the telecommunication network - 302b;
[00125] Primary Processor – 303;
[00126] Memory – 304;
[00127] User Interface (UI) – 306;
[00128] Database – 406;
[00129] Work flow manager – 404;
[00130] Forecast Output Generation – 410;
[00131] Visualization Unit – 412.

CLAIMS
We Claim:
1. A method (600) for forecasting events in a network, the method (600) comprising the steps of:
retrieving, by the one or more processors (202), historic data from one or more data sources;
training, by the one or more processors (202), a model with trends/patterns of the historic data;
forecasting, by the one or more processors (202), utilizing the trained model, one or more events on new data;
retrieving, by the one or more processors (202), one or more actual values subsequent to occurrence of the one or more events; and
rendering, by the one or more processors (202), at least one of, the forecasted one or more events and the one or more actual values to a user.

2. The method (600) as claimed in claim 1, wherein the one or more data sources (302) include at least one of, the data sources within a telecommunication network (302a) and the data sources outside the telecommunication network (302b).

3. The method (600) as claimed in claim 1, wherein the step of, training, a model with trends/patterns of the historic data, includes the step of:
identifying, by the one or more processors (202), utilizing the model, trends/patterns from the historic data; and
enabling, by the one or more processors (202), the model to learn the identified trends/patterns of the historic data.

4. The method (600) as claimed in claim 1, wherein the step of, training, a model (211) with the historic data, further includes the step of:
assessing, by the one or more processors (202), performance of the trained model (211) by feeding the trained model (211) with test data.

5. The method (600) as claimed in claim 1, wherein the step of, forecasting, utilizing the trained model (211), one or more events, on new data, includes the steps of:
receiving, by the one or more processors (202), new data or information of future time periods;
forecasting, by the one or more processors (202), utilizing the trained model (211), the one or more events on the new data based on the trends/patterns of the historic data in response to receiving the new data or the information of the future time periods.

6. The method (600) as claimed in claim 1, wherein the one or more actual values are values when one or more events occurred.

7. The method (600) as claimed in claim 1, wherein the step of, retrieving, one or more actual values subsequent to occurrence of the one or more events, includes the steps of:
retrieving, by the one or more processors (202), information of time period when the one or more events are forecasted; and
filtering, by the one or more processors (202), the one or more actual values which have a similar time period when the one or more events are forecasted.

8. The method (600) as claimed in claim 1, wherein the step of, rendering, at least one of, the forecasted one or more events and the one or more actual values to a user, further includes the steps of:
displaying, by the one or more processors (202), on a User Equipment (UE), data points pertaining to at least one of, the forecasted one or more events, the one or more actual values, testing data predictions and forecasted values, wherein each data point represents a specific time period and a corresponding value.

9. The method (600) as claimed in claim 8, wherein the one or more processors (202) enable a user to interact with the User Equipment (UE) to at least one of, zoom in to the specific time period of the data point, hover over data points and toggle visibility of one or more parameters including actual, forecasted and predicted.

10. A system (108) for forecasting events in a network (106), the system (108) comprising:
a retrieving unit (208), configured to, retrieve, historic data from one or more data sources (302);
a training unit (210), configured to, train, a model (211) with trends/patterns of the historic data;
a forecasting engine (212), configured to, forecast, utilizing the trained model (211), one or more events on new data;
the retrieving unit (208), configured to, retrieve, one or more actual values subsequent to occurrence of the one or more events; and
a rendering unit (216), configured to, render, at least one of, the forecasted one or more events and the one or more actual values to a user.

11. The system (108) as claimed in claim 10, wherein the one or more data sources (302) include at least one of, the data sources within a telecommunication network (302a) and the data sources outside the telecommunication network (302b).

12. The system (108) as claimed in claim 10, wherein the training unit (210), trains the model (211) with trends/patterns of the historic data, by:
identifying, utilizing the model (211), trends/patterns from the historic data; and
enabling, the model (211) to learn the identified trends/patterns of the historic data.

13. The system (108) as claimed in claim 10, wherein the training unit (210) is further configured to:
assess, performance of the trained model (211) by feeding the trained model (211) with test data.

14. The system (108) as claimed in claim 10, wherein the forecasting engine (212) forecasts, utilizing the trained model (211), one or more events, by:
receiving, new data or information of future time periods;
forecasting, utilizing the trained model (211), the one or more events on the new data based on the trends/patterns of the historic data in response to receiving the new data or the information of the future time periods.

15. The system (108) as claimed in claim 10, wherein the one or more actual values are values when one or more events occurred.

16. The system (108) as claimed in claim 10, wherein the retrieving unit (208), retrieves, one or more actual values subsequent to occurrence of the one or more events, by:
retrieving, information of time period when the one or more events are forecasted; and
filtering, the one or more actual values which have a similar time period when the one or more events are forecasted.

17. The system (108) as claimed in claim 10, wherein the rendering unit (216), is further configured to:
display, on a User Equipment (UE), data points pertaining to at least one of, the forecasted one or more events, the one or more actual values, testing data predictions and forecasted values, wherein each data point represents a specific time period and a corresponding value.

18. The system (108) as claimed in claim 17, wherein an interaction enabling unit (218) is configured to enable the user to interact with the User Equipment to at least one of, zoom in to the specific time period of the data point, hover over data points and toggle visibility of one or more parameters including actual, forecasted and predicted.

19. A User Equipment (UE) (102), comprising:
one or more primary processors (303), communicatively coupled to one or more processors (202) in a network (106), wherein the one or more primary processors (303) are coupled with a memory (304) that stores instructions which, when executed by the one or more primary processors (303), cause the UE (102) to:
enable, the user to interact with a Graphical User Interface (GUI) (306) of the UE, wherein the user interacts with the GUI (306) to at least one of, zoom in to the specific time period of the data point, hover over data points and toggle visibility of the one or more parameters including actual, forecasted and predicted, wherein the one or more processors (202) are configured to perform the steps of claim 1.
