Abstract: The present disclosure relates to a system (108) and a method (400) for forecasting network expansion requirements. The system (108) receives one or more input parameter values from a set of network entities (112) associated with a network (106), where each of the network entities (112) includes a capacity value associated therewith and the set of network entities being associated with a geographical area. The system (108) forecasts an expected load value for the set of network entities (112) of the network (106) based on the one or more input parameter values. The system (108) determines whether the expected load value exceeds a predetermined load threshold, where the network may be expanded when the expected load value exceeds the predetermined load threshold. The predetermined load threshold is a function of the capacity value associated with each of the network entities (112). FIG. 3
FORM 2
THE PATENTS ACT, 1970 (39 of 1970) THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
APPLICANT
JIO PLATFORMS LIMITED, of Office-101, Saffron, Nr. Ambawadi, Ahmedabad -
380006, Gujarat, India; Nationality: India
The following specification particularly describes
the invention and the manner in which
it is to be performed
RESERVATION OF RIGHTS
[0001] A portion of the disclosure of this patent document contains material, which is subject to intellectual property rights such as, but not limited to, copyright, design, trademark, Integrated Circuit (IC) layout design, and/or trade dress protection, belonging to Jio Platforms Limited (JPL) or its affiliates (hereinafter referred to as the owner). The owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all rights whatsoever. All rights to such intellectual property are fully reserved by the owner.
FIELD OF DISCLOSURE
[0002] The embodiments of the present disclosure generally relate to
communication networks. In particular, the present disclosure relates to a system and method for forecasting network expansion.
BACKGROUND OF DISCLOSURE
[0003] The following description of related art is intended to provide
background information pertaining to the field of the disclosure. This section may include certain aspects of the art that may be related to various features of the present disclosure. However, it should be appreciated that this section is to be used only to enhance the understanding of the reader with respect to the present disclosure, and not as admissions of prior art.
[0004] Network expansion involves the installation of additional base
stations, such as cellular towers, that provide enhanced and extended cellular network services. For instance, additional base stations are installed in a geographical region when network traffic in said region exceeds their capacity. Requirements for network expansion are often difficult to foresee, and expansion is usually performed only when performance noticeably degrades or end users raise complaints about ineffective service. Delayed network expansion can be costly, time consuming, and insufficiently optimized, and often causes significant inconvenience to users.
[0005] Since network expansion requirements depend on a plurality of
factors, which are often context dependent, existing automated solutions struggle to analyze such context-dependent factors and to forecast network expansion requirements with acceptable accuracy.
[0006] There is, therefore, a need in the art to provide a method and a system
that can overcome the shortcomings of the existing prior arts.
OBJECTS OF THE PRESENT DISCLOSURE
[0007] Some of the objects of the present disclosure, which at least one
embodiment herein satisfies are as listed herein below.
[0008] An object of the present disclosure is to provide a system and a
method for forecasting network expansion.
[0009] Another object of the present disclosure is to provide a system and a
method that reduces time and cost of setting up base stations.
[0010] Another object of the present disclosure is to provide a system and a
method that predicts network expansion requirements.
[0011] Another object of the present disclosure is to provide a system and a
method that dynamically selects and retrains machine learning models used for forecasting network expansion requirements.
[0012] Another object of the present disclosure is to provide a system and a
method that allows for proactive installation of base stations based on forecasted load.
SUMMARY OF THE PRESENT DISCLOSURE
[0013] The present disclosure discloses a system for forecasting network
expansion requirements. The system includes a processor and a data acquisition engine coupled to the processor. The data acquisition engine is configured to receive one
or more input parameter values from a set of networking entities. In an example, an input parameter value is indicative of call records and performance data of a cell site of a network. Further, the system includes an Artificial Intelligence (AI) engine coupled to the processor. The AI engine is configured to forecast expected load value for the set of networking entities based on the one or more input parameter values. The expected load value is indicative of a load to be experienced by the network entities in a predefined time. The system further includes a determination engine coupled to the processor. The determination engine is configured to determine whether the expected load value exceeds a predetermined load threshold. The predetermined load threshold is a function of a capacity value associated with each of the network entities. In addition, the determination engine is configured to generate a recommendation indicating expansion of the network, upon determination.
[0014] In an embodiment, the network entities at least comprise base
stations, routers, access points, network switches, a mobility management entity, an access and mobility management function (AMF) unit, a Session Management Function (SMF) unit, a Self-Organizing Networks (SON) server, and/or a Radio Network Controller (RNC).
[0015] In an embodiment, the recommendation for expansion of the
network comprises a recommendation for at least one of a vertical expansion of the network, a horizontal expansion of the network, an upgradation of operational units of the set of base stations, an upgradation of one or more network entities associated with the network, installing an update to software components, replacing existing hardware components with those of higher specifications or configurations, or installation of additional network entities in the network.
[0016] In an embodiment, to forecast the expected load value, the AI engine
is configured to dynamically select one or more pretrained machine learning (ML) models having an evaluation metric above an evaluation threshold. The evaluation threshold is indicative of any evaluation metric for evaluating performance of
output generated by the selected ML model.
[0017] In an embodiment, the determination engine is further configured to
transmit a set of signals to a monitoring unit for providing audio-visual indications that a set or subset of networking entities are forecasted to exceed the capacity values associated therewith.
[0018] In an embodiment, the determination engine is also configured to
trigger execution of one or more auto-scaling instructions for upgrading the specifications or configurations of the networking entities.
[0019] In an embodiment, prior to forecasting, the AI engine is
configured to train one or more ML models for forecasting expected load value. Further, the AI engine is configured to select one of the trained ML models based on a predefined evaluation metric.
[0020] In an embodiment, the AI engine is configured to retrain the ML
model periodically based on the one or more input parameter values.
[0021] The present disclosure also discloses a method for forecasting
network expansion requirements. The method includes receiving one or more input parameter values from a set of networking entities. The input parameter value is indicative of call records and performance data of a cell site of a network. The method further includes forecasting expected load value for the set of networking entities based on the one or more input parameter values. The expected load value is indicative of a load or traffic the networking entities are expected to experience within a predefined time range in the future. In addition, the method includes determining whether the expected load value exceeds a predetermined load threshold. The predetermined load threshold is a function of a capacity value associated with each of the network entities. In addition, the method includes generating a recommendation indicating expansion of the network, upon determination.
[0022] In an embodiment, the recommendation for expansion of the
network comprises a recommendation for at least one of a vertical expansion of the network, a horizontal expansion of the network, an upgradation of operational units of the set of base stations, an upgradation of one or more network entities associated with the network, installing an update to software components, replacing existing hardware components with those of higher specifications or configurations, or installation of additional network entities in the network. The vertical expansion refers to enabling the network to provide more services to a same user. As an example, if the network provides a voice service to the user but does not provide video or data services, then the vertically expanded network is able to provide video and data services to the user. The horizontal expansion of the network refers to enabling the network to cover more area so that it can provide the same service to a larger number of customers. For example, if the network is initially able to provide a service in an area of 100 m2 to 10,000 customers, then the horizontally expanded network may be able to provide the services in an area of 200 m2 to 20,000 customers. The upgradation of one or more network entities may include upgrading the processing capacity, such as increasing the number of processors, servers, etc., to process a larger number of requests. The upgradation of operational units of the set of base stations refers to an increase in the number of base stations.
[0023] In an embodiment, forecasting the expected load value further
comprises dynamically selecting one or more pretrained machine learning (ML) models having an evaluation metric above an evaluation threshold. The evaluation threshold is indicative of any evaluation metric for evaluating performance of output generated by the ML model.
[0024] In an embodiment, the method further comprises transmitting a set
of signals to a monitoring unit for providing audio-visual indications that a set or subset of networking entities are forecasted to exceed the capacity values associated therewith.
[0025] In an embodiment, the method further comprises triggering
execution of one or more auto-scaling instructions for upgrading the specifications or configurations of the networking entities.
[0026] In an embodiment, prior to forecasting, the method comprises
training one or more ML models for forecasting expected load value and selecting one of the trained ML models based on a predefined evaluation metric.
[0027] In an embodiment, the method further comprises retraining the ML
model periodically based on the one or more input parameter values.
[0028] The present disclosure also discloses a user equipment (UE) for forecasting network expansion requirements. The user equipment (UE) comprises a processor, a data acquisition engine coupled to the processor, an Artificial Intelligence (AI) engine coupled to the processor, and a determination engine coupled to the processor. The data acquisition engine is configured to receive one or more input parameter values from a set of networking entities. The input parameter value is indicative of call records and performance data of a cell site of a network. The AI engine is configured to forecast an expected load value for the set of networking entities based on the one or more input parameter values. The expected load value is indicative of a load or traffic the networking entities are expected to experience within a predefined time range in the future. The determination engine is configured to determine whether the expected load value exceeds a predetermined load threshold. The predetermined load threshold is a function of a capacity value associated with each of the network entities. Based upon the determination, a recommendation is generated indicating an expansion of the network.
[0029] The present disclosure discloses a computer program product
comprising a non-transitory computer-readable medium comprising instructions that, when executed by one or more processors, cause the one or more processors to receive one or more input parameter values from a set of networking entities. An input parameter value is indicative of call records and performance data of a cell site of a network. The instructions further cause the one or more processors to forecast expected load value for the set of networking entities based on the one or more input parameter values. The expected load value is indicative of a load or traffic the networking entities are expected to experience within a predefined time range in the future. The instructions further cause the one or more processors to determine whether the expected load value exceeds a predetermined load threshold. The predetermined load threshold is a function of a capacity value associated with each of the network entities. The instructions also cause the one or more processors to generate a recommendation indicating expansion of the network, upon determination.
BRIEF DESCRIPTION OF DRAWINGS
[0030] The accompanying drawings, which are incorporated herein, and
constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not
necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes the disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
[0031] FIG. 1 illustrates an exemplary architecture (100) for forecasting
network expansion requirements, in accordance with embodiments of the present disclosure.
[0032] FIG. 2 illustrates a block diagram (200) of the proposed system
(108), in accordance with embodiments of the present disclosure.
[0033] FIG. 3 illustrates an exemplary implementation (300) of the proposed system (108), in accordance with embodiments of the present disclosure.
[0034] FIG. 4 illustrates a sequence diagram (400) for forecasting network
expansion requirements, in accordance with embodiments of the present disclosure.
[0035] FIG. 5 illustrates an exemplary computer system (500) in which or
with which embodiments of the present disclosure may be implemented.
[0036] The foregoing shall be more apparent from the following more
detailed description of the disclosure.
LIST OF REFERENCE NUMERALS
100 - Network Architecture
102-1, 102-2… 102-N - Users
104-1, 104-2… 104-N - User Equipments
106 - Network
108 - System
110 - Monitoring Unit
112 - Base Station
202 - Processor(s)
204 - Memory
206 - Interface(s)
208 - Processing Engine
210 - Database
212 - Data Acquisition Engine
214 - Artificial Intelligence (AI) Engine
216 - Determination Engine
218 - Other Units
510 - External Storage Device
520 - Bus
530 - Main Memory
540 - Read-Only Memory
550 - Mass Storage Device
560 - Communication Port
570 - Processor
DETAILED DESCRIPTION OF DISCLOSURE
[0037] In the following description, for the purposes of explanation, various
specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address all of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.
[0038] The ensuing description provides exemplary embodiments only, and
is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the disclosure as set forth.
[0039] Specific details are given in the following description to provide a
thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to
obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
[0040] Also, it is noted that individual embodiments may be described as a
process which is depicted as a flowchart, a flow diagram, a data flow diagram, a
structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
[0041] The word “exemplary” and/or “demonstrative” is used herein to
mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” and/or “demonstrative” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
[0042] Reference throughout this specification to “one embodiment” or “an
embodiment” or “an instance” or “one instance” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout
this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0043] The terminology used herein is for the purpose of describing
particular embodiments only and is not intended to be limiting of the disclosure. As
used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0044] In an aspect, the present disclosure relates to a system and a method
for forecasting network expansion requirements. The system receives one or more input parameter values from a set of base stations communicatively connected via a network, where each base station in the set of base stations includes a capacity value associated therewith and the set of base stations being associated with a geographical area. The system forecasts an expected load value for the set of base stations based on the one or more input parameter values. The system determines whether the expected load value exceeds a predetermined load threshold, where the network of the set of base stations may be expanded when the expected load value exceeds the predetermined load threshold. The predetermined load threshold is a function of the capacity value associated with each of the base stations.
[0045] The various embodiments throughout the disclosure will be
explained in more detail with reference to FIGs. 1-5.
[0046] Referring to FIG. 1, the network architecture (100) may include one
or more computing devices or user equipments (104-1, 104-2…104-N) associated with one or more users (102-1, 102-2…102-N) in an environment. A person of ordinary skill in the art will understand that one or more users (102-1, 102-2…102-N) may be individually referred to as the user (102) and collectively referred to as the users (102). Similarly, a person of ordinary skill in the art will understand that one or more user equipments (104-1, 104-2…104-N) may be individually referred to as the user equipment (104) and collectively referred to as the user equipment (104). A person of ordinary skill in the art will appreciate that the terms “computing device(s)” and “user equipment” may be used interchangeably throughout the disclosure. Although three user equipments (104) are depicted in FIG. 1, any number of the user equipments (104) may be included without departing from the scope of the ongoing description.
[0047] In an embodiment, the user equipment (104) may include smart
devices operating in a smart environment, for example, an Internet of Things (IoT) system. In such an embodiment, the user equipment (104) may include, but is not limited to, smart phones, smart watches, smart sensors (e.g., mechanical, thermal,
electrical, magnetic, etc.), networked appliances, networked peripheral devices, networked lighting system, communication devices, networked vehicle accessories, networked vehicular devices, smart accessories, tablets, smart television (TV), computers, smart security system, smart home system, other devices for monitoring or interacting with or for the users (102) and/or entities, or any combination thereof. A person of ordinary skill in the art will appreciate that the user equipment (104) may include, but is not limited to, intelligent, multi-sensing, network-connected devices, that can integrate seamlessly with each other and/or with a central server or a cloud-computing system or any other device that is network-connected.
[0048] In an embodiment, the user equipment (104) may include, but is not
limited to, a handheld wireless communication device (e.g., a mobile phone, a smart phone, a phablet device, and so on), a wearable computer device (e.g., a head-mounted display computer device, a head-mounted camera device, a wristwatch computer device, and so on), a Global Positioning System (GPS) device, a laptop computer, a tablet computer, or another type of portable computer, a media playing device, a portable gaming system, and/or any other type of computer device with wireless communication capabilities, and the like. In an embodiment, the user equipment (104) may include, but is not limited to, any electrical, electronic, electro-mechanical, or an equipment, or a combination of one or more of the above devices such as virtual reality (VR) devices, augmented reality (AR) devices, laptop, a general-purpose computer, desktop, personal digital assistant, tablet computer, mainframe computer, or any other computing device, wherein the user equipment (104) may include one or more in-built or externally coupled accessories including, but not limited to, a visual aid device such as a camera, an audio aid, a microphone, a keyboard, and input devices for receiving input from the user (102) or the entity such as touch pad, touch enabled screen, electronic pen, and the like. A person of ordinary skill in the art will appreciate that the user equipment (104) may not be restricted to the mentioned devices and various other devices may be used. The architecture (100) may include a monitoring unit (110) having a user interface that provides audio-visual indications to the user based on a set of signals transmitted by the system (108). In an embodiment, the monitoring unit (110) may be implemented on a UE (104) and may be used by operators of the system (108).
[0049] Referring to FIG. 1, the user equipment (104) may communicate
with a system (108) via a network (106). In an embodiment, the network (106) may include at least one of a Fifth Generation (5G) network, 6G network, or the like.
The network (106) may enable the user equipment (104) to communicate with other devices in the network architecture (100) and/or with the system (108). The network (106) may include a wireless card or some other transceiver connection to facilitate this communication. In another embodiment, the network (106) may be implemented as, or include any of a variety of different communication technologies such as a wide area network (WAN), a local area network (LAN), a wireless network, a mobile network, a Virtual Private Network (VPN), the Internet, the Public Switched Telephone Network (PSTN), or the like. In an embodiment, the network (106) may include one or more network entities such as base stations (112) for facilitating communication between the one or more UEs (104). The network (106) may be formed by a set of base stations (112) communicatively coupled to enable telecommunication exchanges between one or more UEs (104).
[0050] The base station (112) may be a network infrastructure that provides
wireless access to one or more terminals associated therewith. The base station may
have coverage defined to be a predetermined geographic area based on the distance
over which a signal may be transmitted. The base station may include, but not be limited to, wireless access point, evolved NodeB (eNodeB), 5G node or next generation NodeB (gNB), wireless point, transmission/reception point (TRP), and the like. In an embodiment, the base station (112) may include one or more operational units that enable telecommunication between two or more UEs (104). In an embodiment, the one or more operational units may include, but not be limited to, transceivers, baseband unit (BBU), remote radio unit (RRU), antennae, mobile switching centres, radio network control units, and one or more processors associated thereto. Each base station in the set of base stations includes a capacity value associated therewith and the set of base stations (112) may be associated with a geographical area. The capacity value may refer to a measure or metric associated with each network entity that represents the entity’s ability to handle or process network load. The capacity value may be used to determine the network entity’s contribution to the overall system capacity and may influence network expansion decisions. In an embodiment, the geographical region may indicate the signal coverage of the set of base stations (112).
[0051] In an embodiment, the network (106) may have one or more network
entities. The network entities (112) may refer to devices or systems within a network infrastructure that generate, process, or store data related to network operations. The network entities may include, but are not limited to, router, access point, network node, base stations, network switches, data centers, Mobility Management Entity (MME), Serving Gateway (S-GW), Packet Data Network (PDN) Gateway (P-GW), Self-Organizing Networks (SON) server and/or Radio Network Controller (RNC), and the like. In embodiments where the network (106) is a 5G network, the network (106) may further include a plurality of network entities such as Access and Mobility Management Function (AMF) unit, Session Management Function (SMF) unit, Network Exposure Function (NEF) units, or any custom-built functions that execute one or more processor-executable instructions, but not limited thereto. In an embodiment, the network entities forming the core network (106) may be implemented as a hardware component, software component, or any combination thereof.
[0052] Network expansion requirements may include vertical expansion or
horizontal expansion of the network (106). In an embodiment, vertical expansion of the network (106) may refer to upgrading or enhancing existing network infrastructure to increase capacity, improve performance, or add capabilities within the same geographical area. On the other hand, horizontal expansion of the network (106) may refer to the process of extending the network’s reach and coverage area. In an embodiment, network expansion requirements may include upgradation of the operational units of the set of base stations (112) or one or more network entities associated with the network (106). In such embodiments, network expansion requirements may involve installing an update to software components or replacing existing hardware components with those of higher specifications or configurations. In other embodiments, network expansion requirements may include installation of additional network entities in the network. The network (106) may be expanded based on network expansion requirements forecasted by the system (108).
[0053] In accordance with embodiments of the present disclosure, the
system (108) may be designed and configured for forecasting network expansion requirements. In an embodiment, the system (108) may be configured to predict when a set or subset of base stations (112) may require upgradation or expansion.
[0054] FIG. 2 illustrates a block diagram (200) of the proposed system
(108), in accordance with embodiments of the present disclosure.
[0055] In an aspect, the system (108) may include one or more processor(s)
(202). The one or more processor(s) (202) may be implemented as one or more microprocessors, microcomputers, microcontrollers, edge or fog microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that process data based on operational instructions. Among other capabilities, the one or more processor(s) (202) may be configured to fetch and execute computer-readable instructions stored in a memory (204) of the system (108). The memory (204) may be configured to store one or more computer-readable instructions or routines in a non-transitory computer readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory (204) may comprise any non-transitory storage device including, for example, volatile memory such as Random Access Memory (RAM), or non-volatile memory such as Erasable Programmable Read-Only Memory (EPROM), flash memory, and the like.
[0056] Referring to FIG. 2, the system (108) may include an interface(s)
(206). The interface(s) (206) may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) (206) may facilitate communication to/from the system (108). The interface(s) (206) may also provide a communication pathway for one or more components of the system (108). Examples of such components include, but are not limited to, processing unit/engine(s) (208) and a database (210).
[0057] In an embodiment, the processing unit/engine(s) (208) may be
implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) (208). In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) (208) may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) (208) may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) (208). In such examples, the system (108) may include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the system (108) and the processing resource. In other examples, the processing engine(s) (208) may be implemented by electronic circuitry.
[0058] In an embodiment, the database (210) includes data that may be
either stored or generated as a result of functionalities implemented by any of the components of the processor (202) or the processing engines (208). In an embodiment, the database (210) may be separate from the system (108). In an embodiment, the database (210) may be indicative of including, but not limited to, a relational database, a distributed database, a cloud-based database, or the like.
[0059] In an exemplary embodiment, the processing engine (208) may
include one or more engines selected from any of a data acquisition engine (212), an artificial intelligence (AI) engine (214), a determination engine (216) and other engines (218) having functions that may include, but are not limited to, testing, storage, and peripheral functions, such as wireless communication unit for remote operation, audio unit for alerts and the like.
[0060] In an embodiment, the data acquisition engine (212) may be
configured to receive the one or more input parameter values from the networking
entities. In an embodiment, the one or more input parameter values may include, but not be limited to, the CDR, the one or more performance metric values, and the like. In an embodiment, the CDR may include, but not be limited to, start time, duration, end time, time to connection, success of the call, failures of the call, parties to the call, location of parties, unique identifiers of devices of the parties, unique identifiers associated with one or more operational units of the base stations, and other details associated with a call, text messages or any telecommunication exchanges between two or more entities over the network (106). In an embodiment, the one or more CDR data may be collected by the set of base stations (112) or networking entities in the network (106) when the UEs (104) engage in such telecommunication exchange. In an embodiment, telecommunication exchanges may include, but not be limited to, text messages, phone calls, multimedia messaging, and the like. In an embodiment, the one or more performance metric values may be indicative of including, but not limited to, Customer Success Ratios (CSR), transmittal speeds, capacity, latching rate, and the like.
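By way of a non-limiting illustration, the following Python sketch shows one possible shape for the input parameter values handled by the data acquisition engine (212). The field names (cell_id, duration_s, csr, throughput_mbps, latching_rate) and the aggregation logic are editorial assumptions for illustration only, not a schema defined by the disclosure.

```python
# Illustrative sketch of CDR and performance-metric input parameters.
# Field names are assumptions, not a schema mandated by the disclosure.
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class CallDetailRecord:
    cell_id: str          # unique identifier of the cell site / base station
    start_time: datetime  # when the call or exchange started
    duration_s: float     # call duration in seconds
    success: bool         # whether the call succeeded


@dataclass
class PerformanceSample:
    cell_id: str
    csr: float            # Customer Success Ratio (0.0 - 1.0)
    throughput_mbps: float
    latching_rate: float


def aggregate_input_parameters(cdrs: List[CallDetailRecord],
                               samples: List[PerformanceSample]) -> dict:
    """Collapse raw CDRs and performance samples into per-cell parameter values."""
    per_cell: dict = {}
    for cdr in cdrs:
        cell = per_cell.setdefault(cdr.cell_id, {"calls": 0, "failures": 0})
        cell["calls"] += 1
        cell["failures"] += 0 if cdr.success else 1
    for s in samples:
        per_cell.setdefault(s.cell_id, {}).update(
            {"csr": s.csr, "throughput_mbps": s.throughput_mbps,
             "latching_rate": s.latching_rate})
    return per_cell
```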
[0061] In an embodiment, the AI engine (214) may be configured to forecast
the expected load value for the set of base stations based on the one or more input parameter values. In an embodiment, the AI engine (214) may be indicative of a pretrained machine learning model, expert system, or the like, but not limited to the same, that uses the one or more input parameter values to output the forecasted or expected load value. In an embodiment, the AI engine (214) may be configured to dynamically select one or more pretrained machine learning (ML) models having an evaluation metric above an evaluation threshold.
[0062] A pre-trained model is a machine learning (ML) model that has been
trained on a large dataset and can be fine-tuned for a specific task. Pre-trained models are often used as a starting point for developing ML models, as they provide a set of initial weights and biases that can be fine-tuned for a specific task. Pre-trained models are often trained on large, diverse datasets and have been trained to recognize a wide range of patterns and features. As a result, they can provide a strong foundation for fine-tuning and can significantly improve the performance of the model.
[0063] Pre-trained models come in a variety of forms, such as language models, object detection models, and picture classification models. Convolutional neural networks (CNNs) are frequently used as the foundation for image classification models, which are trained to categorize images into predetermined categories.
[0064] In an embodiment, the evaluation threshold may include, but not be
limited to, mean square loss, root mean square loss, accuracy, precision, recall, or any other custom evaluation metric for evaluating performance of output generated by the ML model. In an embodiment, the expected load value may be indicative of the load or traffic the networking entities are expected to experience within a predefined time range in the future. For example, the expected load value may be 60%, 70%, 75%, etc., of the maximum load experienced by a network entity such as a base station. If the base station is capable of serving traffic of a maximum of 10,000 users, then an expected load value of 70% in a predefined time range, such as within 2-3 days, indicates that 7,000 users are expected to avail services from the base station in the coming 2-3 days. The expected load values may be predefined based on different input parameter values. A lookup table may be created which includes the values of the input parameters and corresponding expected load values. The expected load values in the table may be defined as a range. For example, for a value A of a parameter, the expected load value may be defined in the table as a range of 40-50%.
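The lookup-table idea above may be sketched as follows in Python. The bucket boundaries, the particular ranges, and the function names are illustrative assumptions rather than values prescribed by the disclosure; only the 70% of 10,000 users example is taken from the paragraph above.

```python
# Illustrative sketch: map an input parameter value to a predefined
# expected-load range, and convert a load percentage into an expected
# user count from the base station's maximum capacity.

# Hypothetical lookup table: parameter bucket -> expected load range (% of capacity)
LOAD_LOOKUP = {
    "low":    (20, 30),
    "medium": (40, 50),
    "high":   (60, 75),
}


def expected_load_range(parameter_value: float) -> tuple:
    """Bucket a normalized parameter value and return its predefined load range."""
    if parameter_value < 0.3:
        return LOAD_LOOKUP["low"]
    if parameter_value < 0.7:
        return LOAD_LOOKUP["medium"]
    return LOAD_LOOKUP["high"]


def expected_users(load_percent: float, max_users: int) -> int:
    """E.g. a 70% expected load on a 10,000-user cell corresponds to 7,000 users."""
    return int(round(load_percent / 100.0 * max_users))


if __name__ == "__main__":
    print(expected_load_range(0.5))      # (40, 50)
    print(expected_users(70.0, 10_000))  # 7000
```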
[0065] In an embodiment, the determination engine (216) may determine
whether the expected load value exceeds a predetermined load threshold. In an
embodiment, the network (106) may be expanded when the expected load value exceeds the predetermined load threshold. The predetermined load threshold may be a function of the capacity value associated with each of the base stations. In an embodiment, the predetermined load threshold may be about 90% of the capacity value. Based on the determination, the determination engine (216) may generate a recommendation indicating expansion of the network (106).
[0066] In an embodiment, the determination engine (216) may transmit a
set of signals to the monitoring unit (110) that provides audio-visual indications that a set or subset of networking entities are forecasted to exceed the capacity values associated therewith. In such embodiments, manual interventions may be made to expand the network (106). In other embodiments, the determination engine (216) may trigger execution of one or more auto-scaling processor-executable instructions for upgrading the specifications or configurations of the networking entities.
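A minimal sketch of this determination step follows, assuming the predetermined load threshold is expressed as about 90% of each entity's capacity value (consistent with the example given with reference to FIG. 3). The function names and the recommendation format are illustrative assumptions.

```python
# Illustrative sketch: the threshold is a function of the capacity value,
# and a recommendation / auto-scaling trigger is produced when the
# forecast exceeds it.

THRESHOLD_FRACTION = 0.90  # predetermined load threshold as a fraction of capacity


def needs_expansion(expected_load: float, capacity_value: float,
                    fraction: float = THRESHOLD_FRACTION) -> bool:
    """Return True when the forecast load exceeds the capacity-based threshold."""
    return expected_load > fraction * capacity_value


def evaluate_entities(forecasts: dict, capacities: dict) -> list:
    """Collect a recommendation for every entity whose forecast breaches the threshold."""
    recommendations = []
    for entity_id, expected_load in forecasts.items():
        if needs_expansion(expected_load, capacities[entity_id]):
            recommendations.append(
                {"entity": entity_id,
                 "action": "expand_network",  # e.g. notify the monitoring unit (110)
                 "auto_scale": True})         # or trigger auto-scaling instructions
    return recommendations


if __name__ == "__main__":
    print(evaluate_entities({"cell-A": 9500, "cell-B": 4000},
                            {"cell-A": 10000, "cell-B": 10000}))
```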
[0067] FIG. 3 illustrates an exemplary implementation (300) of the proposed system (108), in accordance with embodiments of the present disclosure.
[0068] In an embodiment, the AI engine (214) may be pretrained using
historical data of the one or more input parameters stored in the database (210). In an embodiment, the AI engine (214) may visualize and analyze the data for preprocessing, where preprocessing may include, but not be limited to, imputing data, removing missing values, tokenization, scaling, splitting data into train and
test sets, and the like. In an embodiment, the AI engine (214) may train one or more ML models for forecasting expected load value. In an embodiment, the AI engine (214) may select one of the trained ML models based on a predefined evaluation metric. In an embodiment, the ML model may be retrained periodically as the data acquisition engine (212) collects and stores one or more input parameters in the database (210). As shown, the one or more input parameter values may be continuously generated by the one or more UEs (104) connecting and interacting with the network (106).
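The training and selection described above may be sketched as follows, assuming historical input parameters are available as a feature matrix X and observed load values y. The use of scikit-learn and the particular candidate models are editorial assumptions; the disclosure does not prescribe a specific library or model family.

```python
# Illustrative sketch: train candidate ML models on historical data,
# select one by a predefined evaluation metric (MSE here), and retrain
# periodically on newly collected input parameter values.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split


def train_and_select(X: np.ndarray, y: np.ndarray):
    """Train candidate models and keep the one with the lowest test error."""
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)
    candidates = [LinearRegression(), RandomForestRegressor(random_state=0)]
    best_model, best_mse = None, float("inf")
    for model in candidates:
        model.fit(X_train, y_train)
        mse = mean_squared_error(y_test, model.predict(X_test))
        if mse < best_mse:  # predefined evaluation metric: mean squared error
            best_model, best_mse = model, mse
    return best_model, best_mse


def retrain_periodically(model, X_new: np.ndarray, y_new: np.ndarray):
    """Refit the selected model on newly collected input parameter values."""
    model.fit(X_new, y_new)
    return model
```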
[0069] In an embodiment, the AI engine (214) may predict the load capacity
of the network as indicated by the block (302). Based on the predicted load capacity, the AI engine (214) may generate a forecasted or expected load value for the set of networking entities based on the one or more input parameter values. The system (108) further determines at step (302) whether the expected load value exceeds a predetermined load threshold, based on which the network (106) may be expanded or contracted. In an example, if the expected load value exceeds 90% of the capacity value associated with the set of base stations (112), the operators (304) of the system (108) may expand the network (106) in anticipation of the increased load. The network team may install new sites, i.e., base stations (306), based on the expected load value.
[0070] In an embodiment, the AI engine (214) may transmit a set of signals to the monitoring unit (110) such that the monitoring unit (110) provides audio-visual indications that a set or subset of networking entities are forecasted to exceed the capacity values associated therewith. In such embodiments, manual interventions may be made to expand the network (106). In an embodiment, on providing the audio-visual indications, the operators of the network may form network expansion plans and install more, and/or upgrade existing, networking entities. In other embodiments, the system (108) may trigger execution of one or more auto-scaling processor-executable instructions for upgrading the specifications or configurations of the networking entities.
[0071] FIG. 4 illustrates a flowchart (400) for forecasting network
expansion requirements, in accordance with embodiments of the present disclosure.
[0072] At step (402), the method (400) may include providing a set of
network entities associated with a network, such as base station (112) and network (106) of FIG. 3 and FIG. 1 respectively, where each networking entity includes a capacity value associated therewith.
[0073] At step (404), the method (400) includes receiving, by a processor
such as the processor (202) of FIG. 2, one or more input parameter values from a set of network entities such as base stations communicatively connected via a network, such as the network (106), the set of network entities being associated with a geographical area. The input parameter value is indicative of call records and performance data of a cell site of the network.
[0074] At step (406), the method (400) includes forecasting, by the
processor, an expected load value for the set of network entities based on the one
or more input parameter values. The expected load value is indicative of a load or traffic the networking entities (112) are expected to experience within a predefined time range in the future. In an example, forecasting the expected load value comprises dynamically selecting one or more pretrained machine learning (ML) models having an evaluation metric above an evaluation threshold. The evaluation threshold is indicative of any evaluation metric for evaluating performance of output generated by the ML model. Evaluation metrics are quantitative measures used to assess the performance and effectiveness of a statistical or machine learning model. These metrics provide insights into how well the model is performing and help in comparing different models or algorithms. Accuracy, efficiency, precision, time consumption, complexity, etc., are examples of the evaluation metrics. The accuracy indicates a proportion of correct predictions among the total number of cases examined, and the precision indicates the ratio of true positive predictions to the total number of positive predictions. The evaluation threshold represents a minimum or maximum value of an evaluation metric defined for an ML model. For example, the accuracy of an ML model may be in the range of 20-40%, which indicates that the specific model can provide a maximum of 40% accurate output. Thus, if the evaluation threshold for accuracy defined by the user is 60%, then a pretrained machine learning (ML) model which can provide more than 60% accuracy will be selected. If there are multiple models which can provide accuracy of more than 60%, then the model providing the highest accuracy is selected.
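The threshold-based selection described above may be sketched as follows. The model names and accuracy figures are illustrative placeholders; only the 60% accuracy threshold is taken from the example in the preceding paragraph.

```python
# Illustrative sketch: keep only the pretrained models whose accuracy
# exceeds the user-defined evaluation threshold and pick the best one.

def select_model(model_accuracies: dict, threshold: float = 0.60):
    """Return the name of the most accurate model above the threshold, or None."""
    eligible = {name: acc for name, acc in model_accuracies.items() if acc > threshold}
    if not eligible:
        return None                          # no pretrained model meets the threshold
    return max(eligible, key=eligible.get)   # highest accuracy wins


if __name__ == "__main__":
    accuracies = {"model_a": 0.40, "model_b": 0.65, "model_c": 0.72}
    print(select_model(accuracies))          # prints "model_c"
```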
[0075] At step (408), the method (400) includes determining, by the
processor, whether the expected load value exceeds a predetermined load threshold. In an embodiment, the predetermined load threshold is a function of the capacity value associated with each of the network entities.
[0076] The method (400) further includes, upon determination, generating a
recommendation indicating expansion of the network (106).
[0077] The method (400) also includes transmitting a set of signals to a
monitoring unit (110) for providing audio-visual indications that a set or subset of networking entities (112) are forecasted to exceed the capacity values associated therewith.
[0078] In accordance with embodiments of the present disclosure, initially
a user equipment latches to the nearest cell site and registers with the network to avail required services. Load capacity of a given cell is calculated on a timely basis and is notified to the Network Operations Team for load analysis. Call records and performance data of a given cell are extracted from the site, cleaned and normalized. Pre-processed data is then fed to train and finalize the best-fit Machine Learning model. The optimized model is used to predict the near-future load traffic of each cell. The forecasted load data is visualized to check whether it exceeds 90% of the load capacity. The network planning team is notified in advance about new site installation whenever the forecast values are more than 90%.
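By way of a non-limiting illustration, the per-cell workflow of this paragraph may be sketched as follows. The column names, the use of pandas/scikit-learn, and the dummy model are editorial assumptions; the 90% check mirrors the threshold stated above.

```python
# Illustrative sketch: normalize extracted call/performance data, predict
# near-future load with the finalized model, and flag cells above 90% of
# their load capacity so the planning team can be notified in advance.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler


def flag_overloaded_cells(raw: pd.DataFrame, model, capacity: dict) -> list:
    """Return cell ids whose forecast load exceeds 90% of their load capacity."""
    cleaned = raw.dropna(subset=["calls", "csr", "throughput_mbps"])  # drop missing values
    features = StandardScaler().fit_transform(
        cleaned[["calls", "csr", "throughput_mbps"]])                 # normalize
    cleaned = cleaned.assign(forecast=model.predict(features))        # near-future load
    return [row.cell_id for row in cleaned.itertuples()
            if row.forecast > 0.9 * capacity[row.cell_id]]


if __name__ == "__main__":
    class _DummyModel:  # stands in for the finalized ML model of the earlier sketches
        def predict(self, X):
            return np.full(len(X), 9500.0)

    frame = pd.DataFrame({"cell_id": ["c1", "c2"],
                          "calls": [1200, 300],
                          "csr": [0.97, 0.99],
                          "throughput_mbps": [85.0, 40.0]})
    print(flag_overloaded_cells(frame, _DummyModel(), {"c1": 10000, "c2": 10000}))
```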
[0079] The method (400) further includes triggering execution of one or
more auto-scaling instructions for upgrading the specifications or configurations of
the networking entities (112).
[0080] A user equipment (UE) (102) for forecasting network expansion requirements comprises a processor (202), a data acquisition engine (212) coupled to the processor (202), an Artificial Intelligence (AI) engine (214) coupled to the processor (202), and a determination engine (216) coupled to the processor (202). The data acquisition engine (212) is configured to receive one or more input parameter values from a set of networking entities (112). The input parameter value is indicative of call records and performance data of a cell site of a network (106). The Artificial Intelligence (AI) engine (214) is configured to forecast an expected load value for the set of networking entities (112) based on the one or more input parameter values. The expected load value is indicative of a load or traffic the networking entities (112) are expected to experience within a predefined time range in the future. The determination engine (216) is configured to determine whether the expected load value exceeds a predetermined load threshold. The predetermined load threshold is a function of a capacity value associated with each of the network entities (112). Based upon the determination, a recommendation is generated indicating an expansion of the network (106).
[0081] FIG. 5 illustrates an exemplary computer system (500) in which or
with which embodiments of the present disclosure may be implemented. As shown
in FIG. 5, the computer system (500) may include an external storage device (510), a bus (520), a main memory (530), a read only memory (540), a mass storage device (550), a communication port (560), and a processor (570). A person skilled in the art will appreciate that the computer system (500) may include more than one processor (570) and communication ports (560). Processor (570) may include various modules associated with embodiments of the present disclosure.
[0082] In an embodiment, the communication port (560) may be any of an
RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. The communication port (560) may be chosen
depending on a network, such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which the computer system (500) connects.
[0083] In an embodiment, the memory (530) may be Random Access
Memory (RAM), or any other dynamic storage device commonly known in the art. Read-only memory (540) may be any static storage device(s), e.g., but not limited to, Programmable Read Only Memory (PROM) chips for storing static information, e.g., start-up or Basic Input/Output System (BIOS) instructions for the processor (570).
[0084] In an embodiment, the mass storage (550) may be any current or
future mass storage solution, which may be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces), one or more optical discs, Redundant Array of Independent Disks (RAID) storage, e.g., an array of disks (e.g., SATA arrays).
[0085] In an embodiment, the bus (520) communicatively couples the
processor(s) (570) with the other memory, storage and communication blocks. The bus (520) may be, e.g., a Peripheral Component Interconnect (PCI)/PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), Universal Serial Bus (USB) or the like, for connecting expansion cards, drives and other subsystems as well as other buses, such as a front side bus (FSB), which connects the processor (570) to the computer system (500).
[0086] Optionally, operator and administrative interfaces, e.g., a display,
keyboard, joystick, and a cursor control device, may also be coupled to the bus (520) to support direct operator interaction with the computer system (500). Other operator and administrative interfaces may be provided through network connections connected through the communication port (560). Components described above are meant only to exemplify various possibilities. In no way should
the aforementioned exemplary computer system (500) limit the scope of the present disclosure.
[0087] Accordingly, by anticipating the need for a new site and proactively performing the installation, the present disclosure facilitates reducing the time and resource consumption associated with installation of a cell tower and enables the installation to be completed on short notice. As a result, the present disclosure ensures a seamless end user experience.
[0088] The present disclosure provides technical advancement related to the use of machine learning technologies to accurately forecast the traffic load of each cell site for the near future, thus allowing the network planning team to act in a prompt and precise way. Installation of a new cell site requires significant time and resources and is difficult to complete on short notice. The use of AI/ML anticipates the need for a new site a priori, and the network team performs proactive installation, thus ensuring a seamless end user experience.
[0089] While considerable emphasis has been placed herein on the preferred
embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the disclosure. These and other changes in the preferred embodiments of the disclosure will be apparent to those skilled in the art from the
disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be implemented merely as illustrative of the disclosure and not as a limitation.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0090] The present disclosure provides a system and a method for
forecasting network expansion.
[0091] The present disclosure provides a system and a method that reduces
time and cost of setting up base stations.
[0092] The present disclosure provides a system and a method that predicts
network expansion requirements.
[0093] The present disclosure provides a system and a method that
dynamically selects and retrains machine learning models used for forecasting network expansion requirements.
[0094] The present disclosure provides a system and a method that allows
for proactive installation of base stations based on forecasted load.
We claim:
1. A system (108) for forecasting network expansion requirements, the system (108)
comprising:
a processor (202);
a data acquisition engine (212), coupled to the processor (202), the data acquisition engine (212) is configured to:
receive one or more input parameter values from a set of networking entities (112), wherein an input parameter value is indicative of call records and performance data of a cell site of a network (106);
an Artificial Intelligence (AI) engine (214), coupled to the processor (202), the AI engine (214) is configured to:
forecast an expected load value for the set of networking entities (112) based on the one or more input parameter values, the expected load value is indicative of a load to be experienced within a predefined time;
a determination engine (216), coupled to the processor (202), the determination engine (216) is configured to:
determine whether the expected load value exceeds a predetermined load threshold, wherein the predetermined load threshold is a function of a capacity value associated with each of the network entities (112); and
upon determination, generate a recommendation indicating an expansion of the network (106).
2. The system (108) as claimed in claim 1, wherein the network entities (112) at
least comprise one or more of base stations, access points, network nodes, servers,
routers, switches.
3. The system (108) as claimed in claim 2, wherein the recommendation for
expansion of the network (106) comprises a recommendation for at least one of:
a vertical expansion of the network (106);
a horizontal expansion of the network (106);
an upgradation of operational units of the set of base stations (112);
an upgradation of one or more network entities (112) associated with the network (106);
installing an update to software components; and
replacing existing hardware components with those of higher specifications or configurations; or
installation of additional network entities (112) in the network (106).
4. The system (108) as claimed in claim 1, wherein to forecast the expected load
value, the AI engine is configured to:
dynamically select one or more pretrained machine learning (ML) models having an evaluation metric above an evaluation threshold, wherein the evaluation threshold is indicative of any evaluation metric for evaluating performance of an output generated by the selected one or more pretrained ML models.
5. The system (108) as claimed in claim 1, wherein the determination engine (216)
is further configured to:
transmit a set of signals to a monitoring unit (110) for providing audio-visual indications that a set or subset of network entities (112) are forecasted to exceed the capacity values.
6. The system (108) as claimed in claim 1, wherein the determination engine (216)
is further configured to:
trigger execution of one or more auto-scaling instructions for upgrading the specifications or configurations of the network entities (112).
7. The system (108) as claimed in claim 1, wherein prior to forecasting, the AI engine
(214) is configured to:
train one or more ML models for forecasting the expected load value; and
select one of the trained ML models based on a predefined evaluation metric.
8. The system (108) as claimed in claim 1, wherein the AI engine (214) is
configured to:
retrain the ML model periodically based on the one or more input parameter values.
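Claims 7 and 8 together describe training several candidate ML models, selecting one by a predefined evaluation metric, and retraining it periodically on fresh input parameter values. The sketch below, assuming scikit-learn estimators and mean absolute error as the metric, is one possible reading; the model families and the metric are illustrative choices, not those of the specification.

```python
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error


def train_and_select(X_train, y_train, X_val, y_val):
    """Train candidate models and keep the one with the lowest validation error."""
    candidates = [LinearRegression(), RandomForestRegressor(n_estimators=100)]
    for model in candidates:
        model.fit(X_train, y_train)
    return min(candidates, key=lambda m: mean_absolute_error(y_val, m.predict(X_val)))


def retrain(model, X_recent, y_recent):
    """Periodic retraining on the latest call records / performance data."""
    model.fit(X_recent, y_recent)
    return model
```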
9. A method (400) for forecasting network expansion requirements, the method
comprising:
receiving (404), by a data acquisition engine (212), one or more input parameter values from a set of network entities (112), wherein an input parameter value is indicative of call records and performance data of a cell site of a network (106);
forecasting (406), by an Artificial Intelligence (AI) engine (214), an expected load value for the set of network entities (112) based on the one or more input parameter values, the expected load value is indicative of a load expected to be experienced within a predefined time;
determining (408), by a determination engine (216), whether the expected load value exceeds a predetermined load threshold, wherein the predetermined load threshold is a function of a capacity value associated with each of the network entities (112); and
upon determination, generating, by the determination engine (216), a recommendation indicating expansion of the network (106).
10. The method (400) as claimed in claim 9, wherein the recommendation for
expansion of the network (106) comprises a recommendation for at least one of:
a vertical expansion of the network (106);
a horizontal expansion of the network (106);
an upgradation of operational units of the set of base stations (112);
an upgradation of one or more network entities (112) associated with the network (106);
installing an update to software components;
replacing existing hardware components with those of higher specifications or configurations; and
installation of additional network entities (112) in the network (106).
11. The method (400) as claimed in claim 9, wherein forecasting the expected load
value further comprises:
dynamically selecting, by the AI engine (214), one or more pretrained machine learning (ML) models having an evaluation metric above an evaluation threshold, wherein the evaluation threshold is indicative of a minimum value of an evaluation metric for evaluating performance of an output generated by the selected one or more pretrained ML models.
12. The method (400) as claimed in claim 9, wherein the method further comprises:
transmitting, by the determination engine (216), a set of signals to a monitoring unit (110) for providing audio-visual indications that a set or a subset of the network entities (112) is forecasted to exceed the capacity values.
13. The method (400) as claimed in claim 9, wherein the method further comprises:
triggering, by the determination engine (216), execution of one or more auto-scaling instructions for upgrading the specifications or configurations of the network entities (112).
14. The method (400) as claimed in claim 9, wherein prior to forecasting, the
method comprises:
training, by the AI engine (214), one or more ML models for forecasting an expected load value; and
selecting, by the AI engine (214), one of the trained ML models based on a predefined evaluation metric.
15. The method (400) as claimed in claim 9, the method further comprising:
retraining, by the AI engine (214), the ML model periodically based on the
one or more input parameter values.
16. A user equipment (UE) (102) for forecasting network expansion requirements, the user equipment (UE) (102) comprising:
a processor (202);
a data acquisition engine (212), coupled to the processor (202), the data acquisition engine (212) is configured to:
receive one or more input parameter values from a set of network entities (112), wherein an input parameter value is indicative of call records and performance data of a cell site of a network (106);
an Artificial Intelligence (AI) engine (214), coupled to the processor (202), the AI engine (214) is configured to:
forecast an expected load value for the set of network entities (112) based on the one or more input parameter values, the expected load value is indicative of a load expected to be experienced within a predefined time;
a determination engine (216), coupled to the processor (202), the determination engine (216) is configured to:
determine whether the expected load value exceeds a predetermined load threshold, wherein the predetermined load threshold is a function of a capacity value associated with each of the network entities (112); and
upon determination, generate a recommendation indicating an expansion of the network (106).