
Systems And Methods For Optimization Of Radio Nodes By Sequential Reinforcement Neural Network (SRNN) Technique

Abstract: Systems and methods for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique are provided. Traditional systems and methods implementing conventional neural network models generate an output only when there is a derived relationship between input values and output labels. Embodiments of the proposed disclosure provide for correlating unrelated data entities and/or time series data generated from one or more Multi-Radio Access Technologies (Multi-RATs) nodes by implementing the SRNN technique, wherein the SRNN technique comprises obtaining a plurality of datasets; creating a neuron for each of the plurality of datasets; identifying an optimal dataset from the plurality of datasets; creating a base layer using the optimal dataset; creating a SRNN model using the base layer; identifying a plurality of optimal neurons by implementing the SRNN model; and optimizing the radio nodes using the plurality of optimal neurons identified.


Patent Information

Application #
Filing Date
04 September 2018
Publication Number
10/2020
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai - 400021, Maharashtra, India

Inventors

1. KIRAN, Kotaru
Tata Consultancy Services Limited, Synergy Park Unit 1 - Phase I, Premises No.2-56/1/36, Survey No.26, Gachibowli, Serilingampally Mandal, R R District, Hyderabad - 500019, Telangana, India
2. NAGULAVANCHA, Suresh
Tata Consultancy Services Limited, Synergy Park Unit 1 - Phase I, Premises No.2-56/1/36, Survey No.26, Gachibowli, Serilingampally Mandal, R R District, Hyderabad - 500019, Telangana, India
3. DASANTHU, Shiva Kumar
Tata Consultancy Services Limited, Synergy Park Unit 1 - Phase I, Premises No.2-56/1/36, Survey No.26, Gachibowli, Serilingampally Mandal, R R District, Hyderabad - 500019, Telangana, India
4. KOTHA, Geetha
Tata Consultancy Services Limited, Synergy Park Unit 1 - Phase I, Premises No.2-56/1/36, Survey No.26, Gachibowli, Serilingampally Mandal, R R District, Hyderabad - 500019, Telangana, India
5. SAHAY, Pranay Ranjan
Tata Consultancy Services Limited, Synergy Park Unit 1 - Phase I, Premises No.2-56/1/36, Survey No.26, Gachibowli, Serilingampally Mandal, R R District, Hyderabad - 500019, Telangana, India

Specification

Claims:
1. A method for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique, the method comprising processor-implemented steps of:
obtaining, by one or more hardware processors, a plurality of datasets comprising data records and configuration parameters corresponding to one or more Multi-Radio Access Technologies (Multi-RATs) nodes from a plurality of sources (201);
performing, based upon the plurality of datasets, a plurality of steps by implementing the SRNN technique, wherein the plurality of steps comprise (202):
(i) creating a neuron for each of the plurality of datasets, wherein the neuron comprises a record of a plurality of values and time intervals corresponding to at least one dataset amongst the plurality of datasets (202(i)); and
(ii) identifying an optimal dataset corresponding to the one or more Multi-RATs nodes, wherein the optimal dataset comprises optimal non-linked data records for creating a base layer (202(ii));
creating, from the identified optimal dataset, the base layer corresponding to a SRNN model by implementing a Manhattan technique, wherein the base layer comprises a plurality of optimal base layer neurons (203);
creating, using the base layer, the SRNN model comprising a plurality of neurons created for each of the plurality of datasets, wherein the plurality of neurons comprise the plurality of optimal base layer neurons and a plurality of non-base layer neurons (204);
identifying, using the SRNN model, a plurality of optimal neurons by implementing the Manhattan technique (205); and
optimizing the radio nodes using the plurality of optimal neurons identified, wherein the optimization is performed by feeding a set of data values identified from the plurality of optimal neurons to the one or more Multi-RATs nodes (206).

2. The method as claimed in claim 1, wherein the SRNN model facilitates correlating a set of unstructured information by implementing the SRNN technique, and wherein the set of unstructured information comprises unrelated data entries and time series data corresponding to a plurality of Multi-RATs nodes.

3. The method as claimed in claim 1, wherein the SRNN model creation comprises:
(i) selecting, by the one or more hardware processors, a dataset other than the optimal dataset from the plurality of datasets, wherein the selected dataset comprises a plurality of data records;
(ii) computing, by the SRNN technique, a distance between each of the plurality of data records in the selected dataset and each of the plurality of neurons in the SRNN model;
(iii) identifying, using the distance computed, a plurality of closest matching neurons with each of the plurality of neurons by implementing the Manhattan technique;
(iv) adding, based upon the distance computed, a child neuron to a neuron amongst the plurality of neurons, wherein the child neuron is identified from each of the plurality of closest matching neurons; and
(v) iteratively repeating steps (i) to (iv) until each of the plurality of datasets is selected to create the SRNN model.

4. The method as claimed in claim 3, wherein each of the plurality of closest matching neurons comprises a data record with a minimum distance from a neuron, wherein the data record corresponds to the plurality of data records from the selected dataset, and wherein the neuron corresponds to the plurality of neurons.

5. The method as claimed in claim 3, wherein the step of the adding the child neuron comprises:
(i) creating, based upon a time sequence of each of the plurality of data records, a time sequence link between one or more datasets amongst the plurality of datasets by implementing the SRNN technique;
(ii) creating, based upon a logical classification of the time sequence of each of the plurality of data records, a sequence link amongst a plurality of logically classified datasets, wherein each of the plurality of logically classified datasets correspond to at least one of the plurality of datasets; and
(iii) creating, based upon one or more pre-defined set of conditions, a transition link between the SRNN model and a transitional model, wherein the transitional model comprises one or more configuration parameters other than the configuration parameters of the plurality of datasets.

6. The method as claimed in claim 1, wherein the step of optimizing the radio nodes using the plurality of optimal neurons is preceded by:
(i) receiving a set of input records corresponding to the one or more Multi-RATs nodes into the SRNN model;
(ii) adding, based upon a comparison of one or more Key Performance Indicator (KPI) values and one or more threshold values, one or more input records amongst the set of input records as a neuron into the SRNN model upon determining the one or more KPI values to be less than or equal to the one or more threshold values, wherein the one or more KPI values correspond to the set of input records, and wherein the one or more threshold values correspond to the Multi-RATs nodes; and
(iii) parsing, by implementing the SRNN technique, a closest matching neuron with the neuron corresponding to the one or more input records upon determining the one or more KPI values to be greater than the one or more threshold values.

7. The method as claimed in claim 1, wherein the SRNN model is traversed in a direction of a series of time sequence upon determining a non-identification of the plurality of optimal neurons by the SRNN model.

8. The method as claimed in claim 7, wherein the traversing is performed iteratively by implementing the SRNN technique until the plurality of optimal neurons are identified to optimize the radio nodes.

9. The method as claimed in claim 6, wherein the step of adding the one or more input records into the SRNN model comprises parsing one or more child nodes corresponding to the closest matching neuron by either one of a forward propagation technique or a backward propagation technique.

10. A system (100) for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique, the system (100) comprising:
a memory (102) storing instructions;
one or more communication interfaces (106); and
one or more hardware processors (104) coupled to the memory (102) via the one or more communication interfaces (106), wherein the one or more hardware processors (104) are configured by the instructions to:
obtain a plurality of datasets comprising configuration parameters corresponding to one or more Multi-Radio Access Technologies (Multi-RATs) nodes from a plurality of sources;
perform, based upon the plurality of datasets, a plurality of steps by implementing the SRNN technique, wherein the plurality of steps comprise:
(i) create a neuron for each of the plurality of datasets, wherein the neuron comprises a record of a plurality of values and time intervals corresponding to at least one dataset amongst the plurality of datasets, and
(ii) identify an optimal dataset corresponding to the one or more Multi-RAT nodes, wherein the optimal dataset comprises optimal non-linked data records for creating a base layer;
create, from the identified optimal dataset, the base layer corresponding to a SRNN model by implementing a Manhattan technique, wherein the base layer comprises a plurality of optimal base layer neurons;
create, using the base layer, the SRNN model comprising a plurality of neurons created for each of the plurality of datasets, wherein the plurality of neurons comprise the plurality of optimal base layer neurons and a plurality of non-base layer neurons;
identify, using the SRNN model, a plurality of optimal neurons by implementing the Manhattan technique; and
optimize the radio nodes using the plurality of optimal neurons identified, wherein the optimization is performed by feeding a set of data values identified from the plurality of optimal neurons to the one or more Multi-RAT nodes.

11. The system (100) as claimed in claim 10, wherein the SRNN model facilitates correlating a set of unstructured information by implementing the SRNN technique, and wherein the set of unstructured information comprises unrelated data entries and time series data corresponding to a plurality of Multi-RATs nodes.

12. The system (100) as claimed in claim 10, wherein the one or more hardware processors (104) are configured to create the SRNN model by:
(i) selecting, a dataset other than the optimal dataset from the plurality of datasets, wherein the selected dataset comprises a plurality of data records;
(ii) computing, by the SRNN technique, a distance between each of the plurality of data records in the selected dataset and each of the plurality of neurons in the SRNN model;
(iii) identifying, using the distance computed, a plurality of closest matching neurons with each of the plurality of neurons by implementing the Manhattan technique;
(iv) adding, based upon the distance computed, a child neuron to a neuron amongst the plurality of neurons, wherein the child neuron is identified from each of the plurality of closest matching neurons; and
(v) iteratively repeating steps (i) to (iv) until each of the plurality of datasets is selected to create the SRNN model.

13. The system (100) as claimed in claim 12, wherein each of the plurality of closest matching neurons comprises a data record with a minimum distance from a neuron, wherein the data record corresponds to the plurality of data records from the selected dataset, and wherein the neuron corresponds to the plurality of neurons.

14. The system (100) as claimed in claim 12, wherein the one or more hardware processors (104) are configured to add the child neuron by:
(i) creating, based upon a time sequence of each of the plurality of data records, a time sequence link between one or more datasets amongst the plurality of datasets by implementing the SRNN technique;
(ii) creating, based upon a logical classification of the time sequence of each of the plurality of data records, a sequence link amongst a plurality of logically classified datasets, wherein each of the plurality of logically classified datasets correspond to at least one of the plurality of datasets; and
(iii) creating, based upon one or more pre-defined set of conditions, a transition link between the SRNN model and a transitional model, wherein the transitional model comprises one or more configuration parameters other than the configuration parameters of the plurality of datasets.

15. The system (100) as claimed in claim 10, wherein the one or more hardware processors (104) are configured to optimize the radio nodes using the plurality of optimal neurons by:
(i) receiving a set of input records corresponding to the one or more Multi-RATs nodes into the SRNN model;
(ii) adding, based upon a comparison of one or more Key Performance Indicator (KPI) values and one or more threshold values, one or more input records amongst the set of input records as a neuron into the SRNN model upon determining the one or more KPI values to be less than or equal to the one or more threshold values, wherein the one or more KPI values correspond to the set of input records, and wherein the one or more threshold values correspond to the Multi-RATs nodes; and
(iii) parsing, by implementing the SRNN technique, a closest matching neuron with the neuron corresponding to the one or more input records upon determining the one or more KPI values to be greater than the one or more threshold values.

16. The system (100) of claim 10, wherein the one or more hardware processors (104) are configured to traverse the SRNN model in a direction of a series of time sequence upon determining a non-identification of the plurality of optimal neurons by the SRNN model.

17. The system (100) as claimed in claim 16, wherein the one or more hardware processors (104) are configured to perform the traversing iteratively by implementing the SRNN technique until the plurality of optimal neurons are identified to optimize the radio nodes.

18. The system (100) as claimed in claim 15, wherein the one or more hardware processors (104) are configured to add the one or more input records into the SRNN model by parsing one or more child nodes corresponding to the closest matching neuron by implementing either one of a forward propagation technique or a backward propagation technique.
Description:

FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:

SYSTEMS AND METHODS FOR OPTIMIZATION OF RADIO NODES BY SEQUENTIAL REINFORCEMENT NEURAL NETWORK (SRNN) TECHNIQUE

Applicant

Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman point, Mumbai 400021,
Maharashtra, India

The following specification particularly describes the invention and the manner in which it is to be performed.


TECHNICAL FIELD
[001] The disclosure herein generally relates to Neural Networks, and, more particularly, to systems and methods for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique.

BACKGROUND
[002] An Artificial Neural Network (also known as a Neural Network) is a computational model based on the structure and functions of biological neural networks. It is composed of many interconnected units called artificial neurons (also referred to as neurons hereinafter), which are organized in layers. A typical Neural Network comprises three kinds of layers: an input layer, a hidden layer, and an output layer. The data is fed to the network through the input layer and the result is obtained at the output layer. The intermediate set of neurons in the middle layer initially maps the input space to a linearly separable representation in which the final decision is taken. The hidden layer makes the neural network capable of solving non-linear problems.
[003] To train the neural network, a training data set is used. Once a neural network is trained, the neural network can be used to perform pattern recognition or other tasks on a target data set, which contains the target pattern or object to be processed by the neural network. In many commonly-used architectures, neural networks are trained with a set of input signals and a corresponding set of desired output signals. The neural networks “learn” the relationships between input and output signals, and thereafter these networks can be applied to a new input signal set to predict corresponding output signals.
[004] Widely used for data classification, neural networks process past and current data to estimate future values by discovering complex correlations hidden in the data, in a way analogous to that employed by the human brain. The true power and advantage of neural networks lies in their ability to represent both linear and non-linear relationships, and in their ability to learn these relationships directly from the data being modeled. However, traditional systems and methods implementing neural network(s) for modelling data and/or learning relationships and other purposes fail to offer viable solutions in the case of multi-technology or complex datasets.

SUMMARY
[005] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one embodiment, a method for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique is provided, the method comprising: obtaining, by one or more hardware processors, a plurality of datasets comprising data records and configuration parameters corresponding to one or more Multi-Radio Access Technologies (Multi-RATs) nodes from a plurality of sources; performing, based upon the plurality of datasets, a plurality of steps by implementing the SRNN technique, wherein the plurality of steps comprise: (i) creating a neuron for each of the plurality of datasets, wherein the neuron comprises a record of a plurality of values and time intervals corresponding to at least one dataset amongst the plurality of datasets; and (ii) identifying an optimal dataset corresponding to the one or more Multi-RATs nodes, wherein the optimal dataset comprises optimal non-linked data records for creating a base layer; creating, from the identified optimal dataset, the base layer corresponding to a SRNN model by implementing a Manhattan technique, wherein the base layer comprises a plurality of optimal base layer neurons; creating, using the base layer, the SRNN model comprising a plurality of neurons created for each of the plurality of datasets, wherein the plurality of neurons comprise the plurality of optimal base layer neurons and a plurality of non-base layer neurons; identifying, using the SRNN model, a plurality of optimal neurons by implementing the Manhattan technique; optimizing the radio nodes using the plurality of optimal neurons identified, wherein the optimization is performed by feeding a set of data values identified from the plurality of optimal neurons to the one or more Multi-RATs nodes; creating, based upon a time sequence of each of the plurality of data records, a time sequence link between one or more datasets amongst the plurality of datasets by implementing the SRNN technique; creating, based upon a logical classification of the time sequence of each of the plurality of data records, a sequence link amongst a plurality of logically classified datasets, wherein each of the plurality of logically classified datasets correspond to at least one of the plurality of datasets; creating, based upon one or more pre-defined set of conditions, a transition link between the SRNN model and a transitional model, wherein the transitional model comprises one or more configuration parameters other than the configuration parameters of the plurality of datasets; receiving a set of input records corresponding to the one or more Multi-RATs nodes into the SRNN model; adding, based upon a comparison of one or more Key Performance Indicator (KPI) values and one or more threshold values, one or more input records amongst the set of input records as a neuron into the SRNN model upon determining the one or more KPI values to be less than or equal to the one or more threshold values, wherein the one or more KPI values correspond to the set of input records, and wherein the one or more threshold values correspond to the Multi-RATs nodes; parsing, by implementing the SRNN technique, a closest matching neuron with the neuron corresponding to the one or more input records upon determining the one or more KPI values to be greater than the one or more threshold values;
traversing the SRNN model in a direction of a series of time sequence upon determining a non-identification of the plurality of optimal neurons by the SRNN model; and parsing one or more child nodes corresponding to the closest matching neuron by either one of a forward propagation technique or a backward propagation technique.
[006] In another aspect, there is provided a system for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique, the system comprising a memory storing instructions; one or more communication interfaces; and one or more hardware processors coupled to the memory via the one or more communication interfaces, wherein the one or more hardware processors are configured by the instructions to: obtain a plurality of datasets comprising configuration parameters corresponding to one or more Multi-Radio Access Technologies (Multi-RATs) nodes from a plurality of sources; perform, based upon the plurality of datasets, a plurality of steps by implementing the SRNN technique, wherein the plurality of steps comprise: (i) create a neuron for each of the plurality of datasets, wherein the neuron comprises a record of a plurality of values and time intervals corresponding to at least one dataset amongst the plurality of datasets, and (ii) identify an optimal dataset corresponding to the one or more Multi-RAT nodes, wherein the optimal dataset comprises optimal non-linked data records for creating a base layer; create, from the identified optimal dataset, the base layer corresponding to a SRNN model by implementing a Manhattan technique, wherein the base layer comprises a plurality of optimal base layer neurons; create, using the base layer, the SRNN model comprising a plurality of neurons created for each of the plurality of datasets, wherein the plurality of neurons comprise the plurality of optimal base layer neurons and a plurality of non-base layer neurons; identify, using the SRNN model, a plurality of optimal neurons by implementing the Manhattan technique; optimize the radio nodes using the plurality of optimal neurons identified, wherein the optimization is performed by feeding a set of data values identified from the plurality of optimal neurons to the one or more Multi-RAT nodes; add a child neuron by: (i) creating, based upon a time sequence of each of the plurality of data records, a time sequence link between one or more datasets amongst the plurality of datasets by implementing the SRNN technique; (ii) creating, based upon a logical classification of the time sequence of each of the plurality of data records, a sequence link amongst a plurality of logically classified datasets, wherein each of the plurality of logically classified datasets correspond to at least one of the plurality of datasets; and (iii) creating, based upon one or more pre-defined set of conditions, a transition link between the SRNN model and a transitional model, wherein the transitional model comprises one or more configuration parameters other than the configuration parameters of the plurality of datasets; optimize the radio nodes using the plurality of optimal neurons by: (i) receiving a set of input records corresponding to the one or more Multi-RATs nodes into the SRNN model; (ii) adding, based upon a comparison of one or more Key Performance Indicator (KPI) values and one or more threshold values, one or more input records amongst the set of input records as a neuron into the SRNN model upon determining the one or more KPI values to be less than or equal to the one or more threshold values, wherein the one or more KPI values correspond to the set of input records, and wherein the one or more threshold values correspond to the Multi-RATs nodes; and (iii) parsing, by implementing the SRNN technique, a closest matching neuron with the neuron corresponding to the one or more input
records upon determining the one or more KPI values to be greater than the one or more threshold values; perform the traversing iteratively by implementing the SRNN technique until the plurality of optimal neurons are identified to optimize the radio nodes; and parsing one or more child nodes corresponding to the closest matching neuron by implementing either one of a forward propagation technique or a backward propagation technique.
[007] In yet another aspect, there is provided one or more non-transitory machine readable information storage mediums comprising one or more instructions which, when executed by one or more hardware processors, cause the one or more hardware processors to perform a method for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique, the method comprising: obtaining a plurality of datasets comprising data records and configuration parameters corresponding to one or more Multi-Radio Access Technologies (Multi-RATs) nodes from a plurality of sources; performing, based upon the plurality of datasets, a plurality of steps by implementing the SRNN technique, wherein the plurality of steps comprise: (i) creating a neuron for each of the plurality of datasets, wherein the neuron comprises a record of a plurality of values and time intervals corresponding to at least one dataset amongst the plurality of datasets; and (ii) identifying an optimal dataset corresponding to the one or more Multi-RATs nodes, wherein the optimal dataset comprises optimal non-linked data records for creating a base layer; creating, from the identified optimal dataset, the base layer corresponding to a SRNN model by implementing a Manhattan technique, wherein the base layer comprises a plurality of optimal base layer neurons; creating, using the base layer, the SRNN model comprising a plurality of neurons created for each of the plurality of datasets, wherein the plurality of neurons comprise the plurality of optimal base layer neurons and a plurality of non-base layer neurons; identifying, using the SRNN model, a plurality of optimal neurons by implementing the Manhattan technique; optimizing the radio nodes using the plurality of optimal neurons identified, wherein the optimization is performed by feeding a set of data values identified from the plurality of optimal neurons to the one or more Multi-RATs nodes; creating, based upon a time sequence of each of the plurality of data records, a time sequence link between one or more datasets amongst the plurality of datasets by implementing the SRNN technique; creating, based upon a logical classification of the time sequence of each of the plurality of data records, a sequence link amongst a plurality of logically classified datasets, wherein each of the plurality of logically classified datasets correspond to at least one of the plurality of datasets; creating, based upon one or more pre-defined set of conditions, a transition link between the SRNN model and a transitional model, wherein the transitional model comprises one or more configuration parameters other than the configuration parameters of the plurality of datasets; receiving a set of input records corresponding to the one or more Multi-RATs nodes into the SRNN model; adding, based upon a comparison of one or more Key Performance Indicator (KPI) values and one or more threshold values, one or more input records amongst the set of input records as a neuron into the SRNN model upon determining the one or more KPI values to be less than or equal to the one or more threshold values, wherein the one or more KPI values correspond to the set of input records, and wherein the one or more threshold values correspond to the Multi-RATs nodes; parsing, by implementing the SRNN technique, a closest matching neuron with the neuron corresponding to the one or more input records upon determining the one or more KPI values to be greater than the one or more threshold values; traversing the SRNN
model in a direction of a series of time sequence upon determining a non-identification of the plurality of optimal neurons by the SRNN model; and parsing one or more child nodes corresponding to the closest matching neuron by either one of a forward propagation technique or a backward propagation technique.
[008] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.

BRIEF DESCRIPTION OF THE DRAWINGS
[009] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
[010] FIG. 1 illustrates a block diagram of a system for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique, in accordance with some embodiments of the present disclosure.
[011] FIGS. 2A and 2B illustrate a flow diagram of the steps involved in the process of the optimization of radio nodes based upon the SRNN technique, in accordance with some embodiments of the present disclosure.
[012] FIG. 3 illustrates an example of a SRNN neuron created for a dataset by implementing the SRNN technique, in accordance with some embodiments of the present disclosure.
[013] FIG. 4 illustrates an example of a base layer created using an optimal dataset, in accordance with some embodiments of the present disclosure.
[014] FIGS. 5 and 6 illustrate an example of a SRNN model created, time sequence links created, and child links created between neurons in the SRNN model, in accordance with some embodiments of the present disclosure.
[015] FIG. 7 illustrates a general architecture of the SRNN model created, in accordance with some embodiments of the present disclosure.
[016] FIG. 8 illustrates an example of a sequence link created amongst a plurality of logically classified datasets, in accordance with some embodiments of the present disclosure.
[017] FIG. 9 illustrates an example of a transitional model, wherein the transitional model is created based upon one or more pre-defined set of conditions, in accordance with some embodiments of the present disclosure.
[018] FIG. 10 illustrates an example of a traversing of the SRNN model and identification of optimal neurons, in accordance with some embodiments of the present disclosure.
[019] FIG. 11 illustrates an example of a reinforcement learning by the SRNN model, in accordance with some embodiments of the present disclosure.

DETAILED DESCRIPTION OF EMBODIMENTS
[020] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims.
[021] Embodiments of the present disclosure provide systems and methods for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique. For large scale data analysis, neural networks have replaced traditional methods of analysis in many fields. Neural networks are better than conventional methods at discovering and identifying, from datasets (and even from hidden datasets), dependencies between individual input data that may not be immediately evident. However, in the era of digital technologies and the Internet-of-Things (IoT), data is generated from multiple entities to accomplish a particular task. For example, in fifth generation (5G) wireless networks, configuration and performance data may be generated from multiple Radio Access Technologies (RATs). It is challenging to derive a relationship among such data entities.
[022] Multiple datasets collected from different technologies or sources may need to be fed to a model to provide the output. As it would be difficult to derive a justification or relationship between these datasets, traditional systems and networks implementing neural networks and related models fail to provide the appropriate output. In some scenarios, a series of events needs to be analyzed to provide the appropriate output. It is difficult to implement such traditional models for multiple datasets along with data corresponding to time-series events.
[023] The proposed disclosure provides for methodology that overcomes the above limitations of the traditional systems and methods. For example, the proposed methodology provides for correlating multiple datasets from various technologies or sources, and selecting optimal datasets, learning from all data sources and data sets and gaining modeling experience to create and train a model.
[024] Referring now to the drawings, and more particularly to FIG. 1 through 11, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[025] FIG. 1 illustrates an exemplary block diagram of a system 100 for optimization of radio nodes based upon a Sequential Reinforcement Neural Network (SRNN) technique, in accordance with an embodiment of the present disclosure. In an embodiment, the system 100 includes one or more processors 104, communication interface device(s) or input/output (I/O) interface(s) 106, and one or more data storage devices or memory 102 operatively coupled to the one or more processors 104. The one or more processors 104 that are hardware processors can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) is configured to fetch and execute computer-readable instructions stored in the memory 102. In an embodiment, the system 100 can be implemented in a variety of computing systems, such as laptop computers, notebooks, hand-held devices, workstations, mainframe computers, servers, a network cloud and the like.
[026] The I/O interface device(s) 106 can include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like and can facilitate multiple communications within a wide variety of networks N/W and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. In an embodiment, the I/O interface device(s) can include one or more ports for connecting a number of devices to one another or to another server.
[027] The memory 102 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[028] FIGS. 2A and 2B, with reference to FIG. 1, illustrate an exemplary flow diagram of a method for the optimization of radio nodes based upon the SRNN technique, in accordance with an embodiment of the present disclosure. In an embodiment, the system 100 comprises one or more data storage devices of the memory 102 operatively coupled to the one or more hardware processors 104 and is configured to store instructions for execution of steps of the method by the one or more processors 104. The steps of the method of the present disclosure will now be explained with reference to the components of the system 100 as depicted in FIG. 1 and the flow diagram. In the embodiments of the present disclosure, the hardware processors 104, when configured with the instructions, perform one or more methodologies described herein.
[029] According to an embodiment of the present disclosure, at step 201, the one or more hardware processors 104 obtain a plurality of datasets corresponding to one or more Multi-Radio Access Technologies (Multi-RATs) nodes from a plurality of sources. Each of the plurality of datasets obtained comprises data records, configuration parameters, radio optimization parameters, performance counters, performance parameters or Key Performance Indicator (KPI) values corresponding to the RATs and / or Multi-RATs, for example, a set of time intervals, any information element facilitating distribution of load and transmission of traffic in the radio access network and the like.
[030] In an embodiment, the plurality of sources (for obtaining the plurality of datasets) may comprise (but is not limited to) radio nodes of any deployed RATs or Multi-RATs (for example, Global System for Mobile communications (GSM) etc.) or any related technologies thereof. Further, as the plurality of datasets may be obtained from the plurality of sources, wherein the plurality of sources comprise the Multi-RATs technologies or any kind of networks, the plurality of datasets may comprise unrelated and/or non-correlated data and/or time-series data, or any combination thereof.
[031] As is known in the art, neural networks comprise an input layer, hidden layers (multiple layers for deep learning) and an output layer. Neural network models are trained to provide the appropriate labeled output. Neural networks use biology as inspiration for the mathematical model. A neural network has inputs, which carry the values of the variables of interest to the network, and outputs, which carry the predictions. Neurons receive signals from previous neurons, generate signals according to their inputs, and pass signals on to the next neurons. The input, hidden and output neurons need to be connected together. The combination of input, hidden and output neurons and their interconnections determines the neural network type (e.g., multi-layer perceptron, radial basis, linear networks).
[032] The traditional systems and methods work only when there is a derived relationship between one or more input values and one or more output values. In real life, data may be generated from the plurality of sources and from a plurality of entities to accomplish one or more tasks to engineer a RAT or Multi-RATs network. The traditional systems and methods fail to provide for deriving a relationship between unrelated and complex data entities. Further, in some scenarios, a series of events (a time series) is required to be analyzed to provide the appropriate output. Correlating unrelated data entities along with the time series is a major challenge while implementing traditional neural networks.
[033] The proposed methodology overcomes the limitations of the traditional systems and methods discussed above by providing for the SRNN technique, wherein initially, the plurality of datasets corresponding to the one or more RATs nodes are obtained from the plurality of sources. This approach or technique is called the Sequential Reinforcement Neural Network (SRNN) technique, wherein the model does not depend on the input neurons (parameters) to derive the output. The implementation of the SRNN technique via the plurality of datasets (comprising the unrelated data entities and the time series events) obtained, for overcoming the challenges faced by the traditional systems and methods, is discussed in detail in subsequent steps and paragraphs. Considering an example scenario, the plurality of datasets may be obtained as:
Dataset 1 (10, 20, 30, 01-Jan-18, 9:00AM, 12, 23, 31, 01-Jan-18, 9:15AM, 11, 22, 33, 01-Jan-18, 9:30AM, 14, 21, 35, 01-Jan-18, 9:45AM, 12, 22, 31, 01-Jan-18, 10:00AM);
Dataset 2 (15, 24, 29, 01-Jan-18, 8:00AM, 16, 21, 29, 01-Jan-18, 8:15AM, 14, 24, 32, 01-Jan-18, 8:30AM, 17, 23, 30, 01-Jan-18, 8:45AM, 15, 22, 33, 01-Jan-18, 9:00AM);
Dataset N (12, 13, 14, 01-March-18, 7:00AM, 12, 12, 16, 01-March-18, 7:15AM, 12, 15, 13, 01-March-18, 11, 11, 09, 01-Jan-18, 8:45AM, 13, 12, 11, 01-Jan-18, 9:00AM)
[034] According to an embodiment of the present disclosure, at step 202, the one or more hardware processors 104 perform a plurality of steps based upon the plurality of datasets obtained by implementing the SRNN technique. At step 202(i), the one or more hardware processors 104 create a neuron for each of the plurality of datasets, wherein the neuron comprises a record of a plurality of values and time intervals corresponding to at least one dataset amongst the plurality of datasets. In general, neural networks receive information, which is processed by a network of neurons through what are known as training sessions. During these training sessions, the network processes or “solves” the problem or detects the patterns. Once learning is complete, the network may be used to predict outputs based upon a given set of inputs.
[035] In an embodiment, initially, the SRNN technique creates SRNN neuron(s) corresponding to each of the plurality of datasets based upon the data record(s) in each dataset (amongst the plurality of datasets). Each neuron created represents a set of data records contained in a dataset (amongst the plurality of datasets), wherein the set of data records comprises values and/or time intervals corresponding to the dataset for which the SRNN neuron(s) have been created. In an example implementation, FIG. 3 may be referred to for the SRNN neuron created for Dataset 1 (amongst the plurality of datasets obtained).
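For illustration only (not forming part of the specification), the following minimal Python sketch shows one way the flat datasets of paragraph [033] could be grouped into per-interval data records and wrapped as SRNN neurons. The Neuron class, its field names, and the five-element record layout (three values, a date, a time) are assumptions of this sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Neuron:
    """One SRNN neuron: the data records (values plus time stamps) of one dataset."""
    records: list                                   # [([v1, v2, v3], date, time), ...]
    children: list = field(default_factory=list)    # child neurons, added later (step 204)

def parse_dataset(flat):
    """Group a flat dataset into (values, date, time) records, five items per record."""
    return [(list(flat[i:i + 3]), flat[i + 3], flat[i + 4])
            for i in range(0, len(flat), 5)]

dataset_1 = [10, 20, 30, "01-Jan-18", "9:00AM",
             12, 23, 31, "01-Jan-18", "9:15AM",
             11, 22, 33, "01-Jan-18", "9:30AM",
             14, 21, 35, "01-Jan-18", "9:45AM",
             12, 22, 31, "01-Jan-18", "10:00AM"]

neuron_1 = Neuron(records=parse_dataset(dataset_1))   # one neuron per dataset (step 202(i))
print(neuron_1.records[0])                            # ([10, 20, 30], '01-Jan-18', '9:00AM')
```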
[036] According to an embodiment of the present disclosure, at step 202(ii), the one or more hardware processors 104 identify an optimal dataset corresponding to the one or more Multi-RATs nodes. In an embodiment, the optimal dataset comprises optimal non-linked data records for creating a base layer. The optimal dataset is identified from the plurality of datasets obtained. A dataset amongst the plurality of datasets which has a high distribution spread, that is, a high standard deviation, may be identified as optimal.
[037] The dataset having the high distribution spread is identified because such a dataset may accommodate data mapping with the data records of the maximum possible number of neurons. A high standard deviation or high distribution spread facilitates a high degree of data mapping. Considering an example scenario, suppose Dataset 1, Dataset 2, and Dataset 3 are obtained from the plurality of sources with the records below:
Dataset 1 (10, 20, 30, 01-Jan-18, 9:00AM, 12, 23, 31, 01-Jan-18, 9:15AM, 11, 22, 33, 01-Jan-18, 9:30AM, 14, 21, 35, 01-Jan-18, 9:45AM, 12, 22, 31, 01-Jan-18, 10:00AM);
Dataset 2 (15, 24, 29, 01-Jan-18, 8:00AM, 16, 21, 29, 01-Jan-18, 8:15AM, 14, 24, 32, 01-Jan-18, 8:30AM, 17, 23, 30, 01-Jan-18, 8:45AM, 15, 22, 33, 01-Jan-18, 9:00AM); and
Dataset 3 (12, 13, 14, 01-March-18, 7:00AM, 12, 12, 16, 01-March-18, 7:15AM, 12, 15, 13, 01-March-18, 11, 11, 09, 01-Jan-18, 8:45AM, 13, 12, 11, 01-Jan-18, 9:00AM)
It may be noted that Dataset 1 has the highest distribution spread, that is, the highest standard deviation, with values ranging from 10 to 35, and thus may be identified as the optimal dataset.
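A minimal sketch of this selection step, under the assumption (stated above) that "optimal" means the highest standard deviation over a dataset's numeric values; the data below are the example values from paragraph [037] with the time stamps omitted.

```python
import statistics

# Numeric values of each example dataset from paragraph [037] (time stamps omitted).
datasets = {
    "Dataset 1": [10, 20, 30, 12, 23, 31, 11, 22, 33, 14, 21, 35, 12, 22, 31],
    "Dataset 2": [15, 24, 29, 16, 21, 29, 14, 24, 32, 17, 23, 30, 15, 22, 33],
    "Dataset 3": [12, 13, 14, 12, 12, 16, 12, 15, 13, 11, 11, 9, 13, 12, 11],
}

# The optimal dataset is the one with the highest distribution spread
# (standard deviation), as it can map to the largest number of neurons.
optimal = max(datasets, key=lambda name: statistics.pstdev(datasets[name]))
print(optimal)   # Dataset 1
```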
[038] According to an embodiment of the present disclosure, at step 203, the one or more hardware processors 104 create, using the optimal dataset, the base layer corresponding to a Sequential Reinforcement Neural Network (SRNN) model. Thus, the base layer is created from the optimal dataset identified from the plurality of datasets. The base layer serves as an input layer to the SRNN model. The process of creating the base layer may now be discussed in detail.
[039] In an embodiment, initially, the one or more hardware processors 104 implement a hierarchical clustering technique to combine clusters, that is, data records in the optimal dataset, that are nearest to each other. Generally, the hierarchical clustering technique is a methodology applied to a set of sample data patterns for forming a list of function approximation node candidates, and for incrementally applying function approximation nodes from the list of function approximation node candidates to form a model with an accuracy at or above a selected accuracy level. Hierarchical clustering of neurons may be performed either by a top-down or by a bottom-up approach. The most similar pair of clusters is consolidated at each level, starting from one singleton cluster for each compound in the set.
[040] In an embodiment, the proposed disclosure implements the bottom-up approach for hierarchical clustering of each data record in the optimal dataset. Each data record in the optimal dataset may be considered as a cluster for the purpose of the hierarchical clustering. Considering the same example scenario as in step 202 above, suppose the optimal dataset is identified as Dataset 1, wherein Dataset 1 comprises a set of five data records as:
Dataset 1 (10, 20, 30, 01-Jan-18, 9:00AM, 12, 23, 31, 01-Jan-18, 9:15AM, 11, 22, 33, 01-Jan-18, 9:30AM, 14, 21, 35, 01-Jan-18, 9:45AM, 12, 22, 31, 01-Jan-18, 10:00AM)
[041] Referring to FIG. 4, it may be noted that, by implementing the hierarchical clustering technique, the data records have been hierarchically clustered at the first level as (10, 20, 30, and 11, 22, 33), (12, 23, 31, and 12, 22, 31), and (14, 21, 35). Referring to FIG. 4 again, it may be noted that the data record (14, 21, 35) is not clustered as it is not in close proximity to any other data record from the optimal dataset.
[042] Upon hierarchically clustering each of the data records in the optimal dataset, the one or more hardware processors 104 implement one or more conventional techniques (for example, a Manhattan technique) for creating the base layer, wherein the base layer to be created comprises a plurality of optimal base layer neurons, since the optimal dataset was identified (amongst the plurality of datasets) for creating the base layer. Each data record from the optimal dataset is considered individually for creating the base layer.
[043] In an embodiment, the SRNN technique starts by computing a distance between each of the hierarchically clustered data records by implementing the one or more conventional techniques. Referring to FIG. 4 yet again, it may be noted that, by implementing the one or more conventional techniques, the data records from the optimal dataset have been hierarchically clustered at the second level as (10.5, 21, 31.5), (12, 22.5, 31), and (14, 21, 35).
[044] The one or more hardware processors 104 then compute an average between each pair of hierarchically clustered data records, and compare the minimum distance between each of the hierarchically clustered data records with a threshold value. If the minimum distance of a hierarchically clustered data record is less than the threshold value, the one or more hardware processors 104 combine the hierarchically clustered data records into a new cluster. Considering an example scenario, suppose the threshold value is 7. Referring to FIG. 4 yet again, it may be noted that the distances between the hierarchically clustered data records are 2.5, 1.5, and 0.5 (each less than the threshold value 7), obtained as the distance between 10 and 12.5, 21 and 22.5, and 31 and 31.5 by implementing the one or more conventional techniques.
[045] The one or more hardware processors 104 create a mean value of the new cluster to create a final new cluster at a final level. The process is repeated by the one or more hardware processors 104 until the minimum distances of each hierarchically clustered data record reach the threshold value. Referring to FIG. 4 yet again, it may be noted that at the final level, a hierarchical cluster may be obtained as (11.25, 21.75, 31.25) and (14, 21, 35), wherein 11.25 is obtained as a mean of 10.5 and 12, 21.75 is obtained as a mean of 21 and 22.5, and the like. The hierarchical cluster (11.25, 21.75, 31.25) and (14, 21, 35) obtained at the final level, that is, upon hierarchical clustering of each of the data records in the optimal dataset, finally forms the base layer. Referring to FIG. 4 yet again, an example of the base layer created using the optimal dataset may be referred to.
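The base layer creation just described may be sketched as follows. This is a simplified greedy variant, assuming Manhattan distances and component-wise means: it repeatedly merges the closest pair of clusters until no pair is closer than the threshold, whereas FIG. 4 consolidates one full level at a time, so the intermediate mean values differ slightly; the threshold of 7 is taken from the example in paragraph [044].

```python
def manhattan(a, b):
    """Manhattan distance: the sum of absolute component-wise differences."""
    return sum(abs(x - y) for x, y in zip(a, b))

def mean_of(a, b):
    """Component-wise mean of two clusters."""
    return [(x + y) / 2 for x, y in zip(a, b)]

def build_base_layer(records, threshold=7):
    """Bottom-up clustering (paragraphs [039]-[045]): merge the closest pair of
    clusters while their Manhattan distance stays below the threshold."""
    clusters = [list(r) for r in records]
    while len(clusters) > 1:
        # Find the closest pair of clusters.
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: manhattan(clusters[ij[0]], clusters[ij[1]]))
        if manhattan(clusters[i], clusters[j]) >= threshold:
            break                                   # no pair is close enough to merge
        merged = mean_of(clusters[i], clusters[j])
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(merged)
    return clusters                                 # surviving clusters = base layer neurons

records = [[10, 20, 30], [12, 23, 31], [11, 22, 33], [14, 21, 35], [12, 22, 31]]
print(build_base_layer(records))                    # base layer neurons for Dataset 1
```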
[046] According to an embodiment of the present disclosure, at step 204, the one or more hardware processors 104 create, using the base layer, the SRNN model comprising a plurality of neurons created for each of the plurality of datasets, wherein the plurality of neurons comprise the plurality of optimal base layer neurons and a plurality of non-base layer neurons. The SRNN model overcomes the limitations of the traditional systems and methods discussed above.
[047] The SRNN model learns and supports multiple data sets / entities from multiple sources, for example, Multi-RATs / fifth-generation (5G) networks, and is best suited for time-series data. The SRNN model thus facilitates correlating a set of unstructured information by implementing the SRNN technique, wherein the set of unstructured information comprises unrelated data entries and time series data corresponding to a plurality of Multi-RATs nodes. The implementation of the SRNN model to overcome the challenges faced by the traditional systems and methods may now be considered in detail.
[048] Sequential Reinforcement Neural Network (SRNN) Model - As compared to the traditional systems and methods, the SRNN model learns from all data sources and data sets (that is, corresponding to a plurality of radio access and related technologies) and gains experience based upon the learning. The SRNN model learns from the sequence of events that happened over a time series. Still further, when live data is fed into the SRNN model, the SRNN model provides the best matching neurons from the available neurons for optimizing the radio nodes. The major technical improvement of the SRNN model over the traditional systems and methods comprises an ability to correlate multiple data sets from multiple sources and to gain experience, wherein a new data set or a new data source may be easily added to or deleted from the model. The creation and working of the SRNN model may now be discussed in detail.
[049] In an embodiment, initially, the one or more hardware processors 104 select a dataset other than the optimal dataset from the plurality of datasets, wherein the selected dataset comprises a plurality of data records. Upon selecting the dataset (other than the optimal dataset), the one or more hardware processors 104 implement the Manhattan technique to compute a distance between each of the plurality of data records in the selected dataset and each of the plurality of neurons created. As is known in the art, the Manhattan technique computes the distance that would be traveled to get from one data point to the other if a grid-like path is followed. The Manhattan distance between two items is the sum of the differences of their corresponding components.
[050] Further, the one or more hardware processors 104 then identify, using the distance computed, a plurality of closest matching neurons with each of the plurality of neurons by implementing the Manhattan technique, wherein each of the plurality of closest matching neurons comprises a data record with a minimum distance from a neuron, wherein the data record corresponds to the plurality of data records from the selected dataset, and wherein the neuron corresponds to the plurality of neurons already created.
[051] Upon obtaining the distance computed, the one or more hardware processors 104, by implementing the SRNN technique, add a child neuron to a neuron amongst the plurality of neurons, wherein the child neuron is identified from each of the plurality of closest matching neurons, and wherein the child neuron is added to the neuron (amongst the plurality of neurons) which it most closely matches. The one or more hardware processors 104 select each of the plurality of datasets to create the SRNN model, and repeat the steps until each record in each of the plurality of datasets has been considered for the SRNN model creation.
[052] Considering an example scenario, suppose the base layer is created initially with the optimal dataset comprising the plurality of optimal base layer neurons (10, 20, 30), (14, 21, 35), and (12, 22, 31), the plurality of datasets obtained comprise a total of ten datasets, and Dataset 4 and Dataset 5 are selected initially amongst the total of ten datasets, wherein Dataset 4 and Dataset 5 comprise a set of five records each, as:
Dataset 4 (15, 24, 31, 01-Jan-18, 8:00AM, 16, 21, 29, 01-Jan-18, 8:15AM, 14, 24, 32, 01-Jan-18, 8:30AM, 17, 23, 30, 01-Jan-18, 8:45AM, 15, 22, 33, 01-Jan-18, 9:00AM); and
Dataset 5 (09, 08, 10, 01-Jan-18, 8:00AM, 12, 21, 30, 01-Jan-18, 8:15AM, 14, 22, 31, 01-Jan-18, 8:30AM, 11, 21, 29, 01-Jan-18, 9:45AM, 13, 22, 31, 01-Jan-18, 9:00AM)
The one or more hardware processors 104 repeat all the steps (discussed above in paragraphs [049] through [051]) for each of the set of five records corresponding to Dataset 4 and Dataset 5 to create the SRNN model. Referring to FIGS. 5 and 6, a sample of the SRNN model created may be referred to. Further referring to FIG. 7, a general architecture and working of the SRNN model may be referred to.
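A sketch of the model-creation loop of paragraphs [049] through [051], under the assumption that the closest matching neuron is searched over all neurons currently in the model; the SRNNNode class and the function names are illustrative only, and the data reuse the example above.

```python
def manhattan(a, b):
    """Manhattan distance: the sum of absolute component-wise differences."""
    return sum(abs(x - y) for x, y in zip(a, b))

class SRNNNode:
    """Illustrative SRNN neuron: one value record plus links to its child neurons."""
    def __init__(self, values, stamp=None):
        self.values, self.stamp = values, stamp
        self.children = []

    def all_nodes(self):
        """Yield this neuron and, recursively, every child neuron below it."""
        yield self
        for child in self.children:
            yield from child.all_nodes()

# Optimal base layer neurons from the example scenario in paragraph [052].
base_layer = [SRNNNode([10, 20, 30]), SRNNNode([14, 21, 35]), SRNNNode([12, 22, 31])]

def add_dataset(base_layer, records):
    """Steps (i)-(iv) of claim 3: for every record of the selected dataset, find
    the closest matching neuron in the model and attach the record as its child."""
    for values, stamp in records:
        nodes = [n for root in base_layer for n in root.all_nodes()]
        closest = min(nodes, key=lambda n: manhattan(n.values, values))
        closest.children.append(SRNNNode(values, stamp))

dataset_4 = [([15, 24, 31], "8:00AM"), ([16, 21, 29], "8:15AM"),
             ([14, 24, 32], "8:30AM"), ([17, 23, 30], "8:45AM"),
             ([15, 22, 33], "9:00AM")]
add_dataset(base_layer, dataset_4)
print(len(list(base_layer[2].all_nodes())))   # neurons now linked under (12, 22, 31)
```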
[053] Upon adding child neurons, the one or more hardware processors 104 create, based upon a time sequence of each of the plurality of data records, a time sequence link between one or more datasets amongst the plurality of datasets (as explained above) by implementing the SRNN technique. In an embodiment, the time sequence link may be created with a previous record, based upon a time sequence or a time stamp in the two records (that is, the child neuron and the previous record). The creation of the time sequence link facilitates identification of the time sequence or the time stamp, providing details of when the KPIs corresponding to a network under consideration went out of range, and of the sequence of changes made to restore the configuration of the network.
[054] Considering an example scenario, referring to FIG. 5 yet again, the time sequence links created between the first and second record, the second and third record, the third and fourth record, and the fourth and fifth record may be referred to, wherein the first and second record comprise time stamps of 09:00 AM and 09:15 AM respectively, the second and third record comprise time stamps of 09:15 AM and 09:30 AM respectively, and so on.
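A minimal sketch of time sequence link creation, assuming a link is simply a pair of chronologically adjacent records; the date/time format matches the example records above, and the helper name is illustrative.

```python
from datetime import datetime

def time_sequence_links(records):
    """Create a time sequence link from each record to the next one in time
    (paragraphs [053]-[054]). Each record here is a ("date", "time") pair."""
    stamped = sorted(records,
                     key=lambda r: datetime.strptime(" ".join(r), "%d-%b-%y %I:%M%p"))
    return list(zip(stamped, stamped[1:]))          # (earlier, later) pairs

records = [("01-Jan-18", "9:15AM"), ("01-Jan-18", "9:00AM"),
           ("01-Jan-18", "9:45AM"), ("01-Jan-18", "9:30AM")]
for earlier, later in time_sequence_links(records):
    print(earlier[1], "->", later[1])               # 9:00AM -> 9:15AM, and so on
```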
[055] In an embodiment, the one or more hardware processors 104 create, based upon a logical classification of the time sequence of each of the plurality of data records, a sequence link amongst a plurality of logically classified datasets, wherein each of the plurality of logically classified datasets correspond to at least one of the plurality of datasets. Once data is logically classified, a sequence link will be created to make the SRNN model understand the continuity of all busy-hour data as well as non-busy-hour data. Logical classification comprises classification on the basis of logic(s), for example, on the basis of time, for generating a desired output corresponding to the one or more RATs by the SRNN model. For example, if call drops are a concern, optimizing the network to improve the call drop ratio may comprise a logical classification on the basis of busy hours and non-busy hours.
[056] Considering an example scenario for understanding the sequence link, referring to FIG. 8, it may be noted that a sequence link has been created between the 10AM data and the 5PM data so that the model understands, via this link, the connectivity of one busy hour to the next busy hour. As the data in FIG. 8 corresponds to time-series data, it may be assumed that one or more time sequence links have been created (as explained above) between the interval 7AM to 8AM.
[057] As may be noted referring to FIG. 8 again, 10AM is the last busy-hour time stamp and busy-hour data starts again at 5PM, while 4PM is the last non-busy-hour data collected and non-busy-hour data may be collected again at 11PM. Similarly, referring to FIG. 8 yet again, it may be noted that a sequence link between the 4PM and 11PM data may be created to make it explicit that there is a sequence link which represents a sequence of complete busy-hour or non-busy-hour nodes.
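A minimal sketch of the logical classification and sequence links of FIG. 8, assuming (for illustration only) busy hours of 7AM to 10AM and 5PM to 10PM, with records collected hourly from 7AM to 11PM:

# assumed busy-hour windows; actual windows depend on the network concerned
BUSY_HOURS = set(range(7, 11)) | set(range(17, 23))   # 7AM-10AM, 5PM-10PM

hours = list(range(7, 24))                            # hourly records, 7AM..11PM
busy = [h for h in hours if h in BUSY_HOURS]
non_busy = [h for h in hours if h not in BUSY_HOURS]

# sequence links connect consecutive records of the same logical class
busy_links = list(zip(busy, busy[1:]))                # includes (10, 17): 10AM -> 5PM
non_busy_links = list(zip(non_busy, non_busy[1:]))    # includes (16, 23): 4PM -> 11PM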
[058] In an embodiment, the one or more hardware processors 104 also create a transition link between the SRNN model and a transitional model based upon one or more pre-defined sets of conditions, wherein the transitional model comprises one or more configuration parameters other than the configuration parameters of the plurality of datasets. By providing for the transitional model, the proposed disclosure facilitates handling of exceptional situations, like natural calamities, wherein neurons may comprise configuration parameters different from normal-day configuration parameters. In such scenarios, the transition link may be provided so that data records or neurons in the SRNN model may transit to the transitional model to handle the exceptional situations. In an example implementation, referring to FIG. 9, an example of the transitional model may be referred to.
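A minimal sketch of the transition link, assuming a hypothetical pre-defined condition that flags exceptional records; the condition, the limits, and the model containers below are illustrative assumptions only:

# hypothetical exceptional condition: any value outside a normal-day limit
def is_exceptional(values, normal_day_limit=(50, 50, 50)):
    return any(v > limit for v, limit in zip(values, normal_day_limit))

srnn_model, transitional_model = [], []

def place_record(record):
    """Route a record to the transitional model via the transition link if the
    pre-defined condition holds, otherwise keep it in the SRNN model."""
    if is_exceptional(record["values"]):
        transitional_model.append(record)   # record transits to the transitional model
    else:
        srnn_model.append(record)

place_record({"values": (15, 24, 31)})      # normal-day record -> SRNN model
place_record({"values": (90, 80, 75)})      # exceptional record -> transitional model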
[059] In an embodiment, the steps for creating time sequence links, sequence links and transition links are repeated every time a new dataset (amongst the plurality of datasets) is selected for creating the SRNN model, and whenever new datasets are considered to identify optimal neurons using the SRNN model (explained below). Further, the SRNN model facilitates reinforcement learning, wherein a data record from the plurality of datasets may be added back to the SRNN model as a child neuron (in the same manner as explained above) by adding a time sequence link and a child link. Recall from paragraph 48 above that the SRNN model gains experience based upon this learning.
[060] As is known in the art, reinforcement learning is a variant of machine learning wherein an agent learns how to behave in an environment by performing actions and seeing the results. Referring to FIG. 11, an example of the reinforcement learning by the SRNN model may be referred to, wherein a data record is added back into the SRNN model. Referring to FIG. 11 again, it may be noted that the sixth record (15, 24, 32, 01-Jan-18, 10:15 AM) is added back into the SRNN model.
[061] According to an embodiment of the present disclosure, at step 205, the one or more hardware processors 104 identify, using the SRNN model, a plurality of optimal neurons by implementing the Manhattan technique. The plurality of optimal neurons comprise neurons with optimal configuration parameters or optimal configuration data, and thus may be used to optimize the radio nodes (the optimization is explained in step 206 below). The process of identification of the plurality of optimal neurons may now be considered in detail.
[062] According to an embodiment of the present disclosure, upon creating the SRNN model, the SRNN model becomes ready to take live input data or a set of input records from new datasets (that is, other than the plurality of datasets). Initially, the one or more hardware processors 104 feed the set of input records corresponding to the one or more Multi-RATs nodes into the SRNN model. Since the set of input records is to be identified as optimal or non-optimal, there must be one or more threshold values provided by each vendor of the RATs or Multi-RATs, against which one or more KPI values corresponding to the live data or the set of input records must be compared (so as to identify whether the set of input records is optimal or non-optimal). For example, the call drop KPI should generally be < 0.02%.
[063] In an embodiment, the one or more hardware processors 104 first perform a comparison of the one or more KPI values and the one or more threshold values, and then add one or more input records amongst the set of input records as a neuron into the SRNN model using the SRNN technique. Thus, based upon the comparison, if the one or more KPI values corresponding to the set of input records are found to be in range with (that is, less than or equal to) the one or more threshold values, an input record (amongst the set of input records to which the one or more KPI values correspond) is added as a neuron to the existing architecture of the SRNN model.
[064] In an embodiment, the input record is added as a neuron, that is, as a child neuron to the closest matching neuron, wherein the closest matching neuron may be identified using the Manhattan technique, as explained in the creation of the SRNN model above. Further, the one or more hardware processors 104 also create a time sequence link with the previous neuron (or previous record) based upon a time sequence or a time stamp in the two records (that is, in the input record and in the previous neuron). Considering an example scenario, if the call drop KPI for a record A (amongst the set of input records) is identified to be < 0.01% based upon the comparison, the record A is added as the child neuron.
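A minimal sketch of the threshold comparison of paragraphs [062] through [064]; the threshold dictionary and the KPI name below are illustrative assumptions:

# vendor-provided thresholds (exemplary): call drop KPI should be < 0.02%
THRESHOLDS = {"call_drop_percent": 0.02}

def is_in_range(kpis, thresholds=THRESHOLDS):
    """True if every KPI value is in range with (<=) its threshold value."""
    return all(kpis[name] <= limit for name, limit in thresholds.items())

# record A from the example above: a call drop KPI of 0.01% is in range, so the
# record is added as a child neuron of its closest matching neuron (Manhattan
# technique), with a time sequence link to the previous neuron
record_a = {"values": (15, 22, 33), "kpis": {"call_drop_percent": 0.01}}
print("add as child neuron" if is_in_range(record_a["kpis"]) else "traverse for optimal neurons")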
[065] According to an embodiment of the present disclosure, if the one or more KPI values of the input record (or the neuron) are identified to be greater than the one or more threshold values, the one or more hardware processors 104 initially compute a distance between the neuron whose one or more KPI values are identified to be greater than the one or more threshold values (based upon the comparison) and a closest matching neuron in the SRNN model, by implementing the Manhattan technique. The one or more hardware processors 104 parse one or more child neurons corresponding to the closest matching neuron by either a forward propagation technique or a backward propagation technique.
[066] In an embodiment, the SRNN technique further parses the time sequence link corresponding to the one or more child neurons until at least one optimal configuration neuron (or at least one optimal configuration child neuron) whose one or more KPI values are less than or equal to the one or more threshold values is identified. Further, a plurality of optimal configuration neurons or a plurality of optimal configuration child neurons may be identified, wherein each of the plurality of optimal configuration neurons or the plurality of optimal configuration child neurons comprise neurons whose one or more KPI values are less than or equal to the one or more threshold values.
[067] In an embodiment, each of the plurality of optimal configuration neurons or the plurality of optimal configuration child neurons may then be added as a neuron to its closest matching neuron in the SRNN model, and a time sequence link with its previous record is created.
[068] Thus, the SRNN model is traversed in the direction of a series of time sequences upon determining a non-identification of the plurality of optimal neurons by the SRNN model. The one or more hardware processors 104 perform the step of traversing iteratively by implementing the SRNN technique until the plurality of optimal neurons are identified to optimize the radio nodes. In an embodiment, referring to FIG. 10, a forward traversing may be performed in the direction of the best matching neuron, which is (14, 22, 35).
[069] Referring to FIG. 10 again, an example implementation of the traversing of the SRNN model may be referred to, wherein the traversing is performed to generate the plurality of optimal neurons. Referring to FIG. 10 again, it may be noted that the traversing generated (15, 22, 33) and (11, 21, 29) as the plurality of optimal records, wherein the data record (15, 24, 23) is fed as the set of input records, that is, as the live data, into the SRNN model.
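A minimal sketch of the traversal of paragraphs [065] through [069], using the values described above for FIG. 10; the model structure is an illustrative assumption, with the best matching neuron taken as the starting point per FIG. 10:

# each neuron: values, whether its KPIs are in range, and the next neuron in
# the time sequence (None terminates the traversal)
model = [
    {"values": (14, 22, 35), "in_range": False, "next": 1},   # best matching neuron
    {"values": (15, 22, 33), "in_range": True,  "next": 2},
    {"values": (11, 21, 29), "in_range": True,  "next": None},
]

def traverse_for_optimal(model, start_index):
    """Traverse forward along time sequence links, collecting in-range neurons."""
    optimal, index = [], start_index
    while index is not None:
        if model[index]["in_range"]:
            optimal.append(model[index]["values"])
        index = model[index]["next"]
    return optimal

# live input record (15, 24, 23): traversal starts at the best matching neuron
print(traverse_for_optimal(model, 0))   # -> [(15, 22, 33), (11, 21, 29)]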
[070] According to an embodiment of the present disclosure, at step 206, the one or more hardware processors 104 optimize the radio nodes using the plurality of optimal neurons. The optimization is performed by feeding a set of data values, configuration parameters, performance counters or performance parameters (or any combination thereof) identified from the plurality of optimal neurons to the one or more Multi-RATs nodes. As is known in the art, every neuron comprises a strength value, wherein the strength value may be computed while designing a network. The strength value may be pre-defined as how much time a neuron is able to sustain within a range. The longest-sustaining neurons, that is, neurons which sustain for a long time without going out of range, will have a high strength value.
[071] Based upon the strength value of each of the plurality of optimal neurons, a neuron (amongst the plurality of optimal neurons identified) with the highest strength value may be selected by the one or more hardware processors 104 initially for optimizing the radio nodes. Thus, the set of data values or the configuration parameters corresponding to the neuron with the highest strength value may be fed into a network, and the radio nodes may thus be optimized with the best configuration parameters identified from the neuron with the highest strength value. Similarly, the one or more hardware processors 104 may then select a neuron with the second highest strength value from the plurality of optimal neurons, then a neuron with the third highest strength value, and so on.
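A minimal sketch of the strength-ordered selection described above; the strength values shown are illustrative assumptions:

# the plurality of optimal neurons with exemplary strength values
optimal_neurons = [
    {"values": (11, 21, 29), "strength": 0.7},
    {"values": (15, 22, 33), "strength": 0.9},    # sustained longest within range
]

# feed configuration parameters to the network in decreasing order of strength
for neuron in sorted(optimal_neurons, key=lambda n: n["strength"], reverse=True):
    print("feeding configuration parameters:", neuron["values"])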
[072] According to an embodiment of the present disclosure, some of the advantages of the proposed methodology may be considered in detail. In digital-era technologies, like 5G cellular technologies, the Internet of Things (IoT) and the like, data will be generated from multi-technology nodes. It may be difficult to derive the relationship between the input neurons which drive the output neurons. In such cases, traditional systems and methods may fail to provide the desired output. Another technological gap is that all these different technology nodes generate data in time series.
[073] The proposed disclosure solves these problems faced by the traditional systems and methods via the non-traditional neural network model (that is, the SRNN model as shown and discussed above), in which the model does not depend on the input neurons (parameters) to derive the output. This approach is called the Sequential Reinforcement Neural Network (SRNN). The proposed disclosure, via the SRNN model, solves the problem of deriving a relationship between input and output neurons and also works for time series data.
[074] The proposed methodology provides for automating the process of the optimization of radio nodes. As is known in the art, radio nodes are configured with various configuration parameters. A radio engineer generally collects performance counters of each node, which are used to calculate the KPIs for the node. Performance counters may have a direct, an indirect, or no impact on various KPIs. When the KPIs are degraded, the radio engineer changes a set of configurations to bring the KPIs back to a normal state. This is a manual process that requires sound knowledge and experience on the part of the radio engineer. In some cases, the radio engineer has to change configuration parameters by trial and error to figure out the correct configuration set for the degradation.
[075] Traditional machine learning algorithms help radio engineers to predict the KPI degradation but cannot suggest the optimal configuration changes required. The SRNN model is built with Multi-RATs datasets and with events happening in the time series. The SRNN model provides the plurality of optimal configuration parameters required, as datasets are collected from multiple entities (wherein the datasets may comprise non-correlated data, time series data, or any combination of the two) and then fed into the SRNN model for obtaining the optimal neurons comprising the optimal configuration parameters to optimize the radio nodes.
[076] In an embodiment, the memory 102 can be configured to store any data that is associated with the optimization of radio nodes based upon the SRNN technique. In an embodiment, the information pertaining to the plurality of datasets obtained, the optimal dataset identified, the base layer, the SRNN model, the plurality of optimal neurons identified, and the optimizing of the radio nodes using the plurality of optimal neurons identified is stored in the memory 102. Further, all information (inputs, outputs and so on) pertaining to the optimization of radio nodes based upon the SRNN technique may also be stored in the database, as history data, for reference purposes.
[077] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[078] The embodiments of the present disclosure herein address the unresolved problem of the optimization of radio nodes by correlating unrelated datasets or data series from the Multi-RATs, the 5G networks, and other related technologies. The embodiments thus provide the SRNN model for identification of the plurality of optimal neurons for optimizing the radio nodes. Moreover, the embodiments herein further provide for creating the base layer from the optimal dataset, wherein the created base layer facilitates creation of the SRNN model for the identification of the plurality of optimal neurons.
[079] It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[080] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[081] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[082] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[083] It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.

Documents

Application Documents

# Name Date
1 201821033151-STATEMENT OF UNDERTAKING (FORM 3) [04-09-2018(online)].pdf 2018-09-04
2 201821033151-REQUEST FOR EXAMINATION (FORM-18) [04-09-2018(online)].pdf 2018-09-04
3 201821033151-FORM 18 [04-09-2018(online)].pdf 2018-09-04
4 201821033151-FORM 1 [04-09-2018(online)].pdf 2018-09-04
5 201821033151-FIGURE OF ABSTRACT [04-09-2018(online)].jpg 2018-09-04
6 201821033151-DRAWINGS [04-09-2018(online)].pdf 2018-09-04
7 201821033151-COMPLETE SPECIFICATION [04-09-2018(online)].pdf 2018-09-04
8 201821033151-Proof of Right (MANDATORY) [26-09-2018(online)].pdf 2018-09-26
9 Abstract1.jpg 2018-10-22
10 201821033151-FORM-26 [25-10-2018(online)].pdf 2018-10-25
11 201821033151-ORIGINAL UR 6(1A) FORM 1-011018.pdf 2019-02-18
12 201821033151-ORIGINAL UR 6(1A) FORM 26-021118.pdf 2019-04-09
13 201821033151-CLAIMS [02-06-2021(online)].pdf 2021-06-02
14 201821033151-COMPLETE SPECIFICATION [02-06-2021(online)].pdf 2021-06-02
15 201821033151-FER_SER_REPLY [02-06-2021(online)].pdf 2021-06-02
16 201821033151-OTHERS [02-06-2021(online)].pdf 2021-06-02
17 201821033151-FER.pdf 2021-10-18
18 201821033151-US(14)-HearingNotice-(HearingDate-13-03-2024).pdf 2024-02-13
19 201821033151-FORM-26 [07-03-2024(online)].pdf 2024-03-07
20 201821033151-Correspondence to notify the Controller [07-03-2024(online)].pdf 2024-03-07
21 201821033151-FORM-26 [11-03-2024(online)].pdf 2024-03-11
22 201821033151-Written submissions and relevant documents [27-03-2024(online)].pdf 2024-03-27

Search Strategy

1 2020-12-0414-15-04E_04-12-2020.pdf