Abstract: METHOD AND SYSTEM OF IDENTIFYING AND REPORTING FOR MUTING EVENTS IN A CORE NETWORK The present disclosure relates to a method of identifying and reporting for muting events in a core network by a processor (202). Further, the method includes obtaining a record corresponding to a timestamp flow creation and a session identification from an SMF module (420) for each of the muting events. Further, the method includes obtaining a cell identification of each UE (102) participating in the voice call from the SMF module (420). Further, the method includes obtaining party details from an IMS for each identified muting event. Further, the method includes determining a muting duration and party KPIs with muting times for each identified muting event. Further, the method includes generating a report comprising a region summary and a user level summary based on the timestamp flow creation, the one or more muting timestamps, the session identification, the cell identification, and the determination of the muting duration and the party KPIs. Ref. FIG. 5
DESC:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. TITLE OF THE INVENTION
METHOD AND SYSTEM OF IDENTIFYING AND REPORTING FOR MUTING EVENTS IN A CORE NETWORK
2. APPLICANT(S)
NAME NATIONALITY ADDRESS
JIO PLATFORMS LIMITED INDIAN OFFICE-101, SAFFRON, NR. CENTRE POINT, PANCHWATI 5 RASTA, AMBAWADI, AHMEDABAD 380006, GUJARAT, INDIA
3. PREAMBLE TO THE DESCRIPTION
THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE NATURE OF THIS INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.
FIELD OF THE INVENTION
[0001] The present invention relates to wireless communications, and more particularly, to a system and a method for automatically identifying and mitigating call muting in a core network for seamless and uninterrupted voice call experiences in fifth generation (5G) networks.
BACKGROUND OF THE INVENTION
[0002] The emergence of 5th Generation (5G) networks has increased the demand for high-speed, low-latency communication of user data packets. To achieve this high speed and low latency, efficient handling of user data traffic is required, which in turn necessitates significant changes to the network architecture. The User Plane Function (UPF) is configured to handle the user data traffic within 5G and beyond network architectures. The evolution of the UPF therefore plays an important role, since the UPF acts as a crucial component within a 5G core network, responsible for various tasks including packet forwarding, traffic management, quality of service (QoS) enforcement, and network slicing. Further, the UPF enables efficient data handling, supports diverse service requirements, and ensures a seamless user experience in highly dynamic and heterogeneous network environments.
[0003] Further, the UPF of a service provider (e.g., JIO or the like) is configured to host multiple user/UE sessions and applies different policies (e.g., rate limiting, barring, quota, forwarding policies, etc.) to the user packets flowing through the network. In such a deployment, the UPF performs inter-networking between the 5G network and a data network, and may have an N3 interface defined towards the eNodeB (eNB) and an N6 interface defined towards the data network. When the UPF encounters problems such as congestion, misconfiguration, or insufficient capacity, packet loss or delays in the transmission of voice data can result. Consequently, call participants may experience muted audio or disruptions during the call for a period, which is referred to as “call muting”. Due to call muting, customers may experience deterioration of voice services, typically indicating compromised network conditions. Call muting occurs due to various common factors, including packet loss at the air interface, eNB (evolved NodeB), or core network levels including the UPF; packet delay at these levels; frequent radio link failures leading to packet loss; and frequent handovers resulting from handover decisions between serving and neighbouring cells based on lower signal delta thresholds (B2 thresholds).
[0004] Further, addressing call muting in the core network requires network operators to identify call muting, monitor and optimize the core network's performance, ensure adequate resources to handle voice traffic, and promptly resolve any technical issues that arise. Conventionally, Voice Measurement and Analysis (VoMA), an automated solution, is used for detecting packets of mute calls. The VoMA monitors network quality and user experience in real time for enhancing the user experience. However, during handovers, the complete VoMA analysis may not be available due to sampling.
[0005] Thus, there exists a need for a method and system for detecting and mitigating call muting in a core network to overcome the above-said limitations of the prior art and to provide seamless and uninterrupted voice call experiences in 5G networks.
SUMMARY OF THE INVENTION
[0006] One or more embodiments of the present disclosure provide a system and a method for automatically identifying and mitigating call muting events in a core network.
[0007] In one aspect of the present invention, a method of identifying and reporting for muting events in a core network is disclosed. The method includes detecting, by a processor, one or more muting events from a plurality of voice calls passing through a user plane function (UPF) in a core network. Further, the method includes obtaining, by the processor, one or more muting timestamps for each of the one or more muting events in the voice call of the plurality of voice calls. Further, the method includes obtaining, by the processor, a timestamp flow creation and a session identification for the one or more muting events from the UPF for each of the muting events. Further, the method includes obtaining, by the processor, a record (e.g., Procedure Detail Records (PDR) or the like) corresponding to the timestamp flow creation and the session identification from a Session Management Function (SMF) module for each of the muting events. Further, the method includes obtaining, by the processor, a cell identification of each user equipment (UE) participating in the voice call from the SMF module. Further, the method includes obtaining, by the processor, party details from an Internet Protocol (IP) Multimedia Subsystem (IMS) for each identified muting event. Further, the method includes determining, by the processor, a muting duration and party key performance indicators (KPIs) with muting times for each identified muting event. Further, the method includes generating, by the processor, a report comprising a region summary and a user level summary based on the timestamp flow creation, the one or more muting timestamps, the session identification, the cell identification, and the determination of the muting duration and the party KPIs.
[0008] In an embodiment, the method includes identifying, by the processor, a call hold status of the voice call based on the direction of the muting detected from the UPF.
[0009] In an embodiment, the method includes transmitting, by the processor, streaming data records (SDR) upon termination of the voice call.
[0010] In an embodiment, the method includes correlating a direction of the streaming data records (SDR) from each of the user equipment for the muting events in the voice calls.
[0011] In an embodiment, the method includes determining, by the processor, a threshold for detecting the muting events in the voice call.
[0012] In an embodiment, the method includes obtaining, by the processor, the timestamp flow creation and a timestamp flow deletion from the SMF module, the timestamp flow creation indicating a timestamp at which a dedicated Service Data Flow (SDF) is created, the timestamp flow deletion indicating a timestamp at which the dedicated Service Data Flow (SDF) is deleted.
[0013] In an embodiment, detecting the muting events includes detecting, by the processor, packet loss, packet delay, frequent radio link failures, high Radio Resource Control (RRC) connection reestablishment, a handover oscillation, or frequent handovers.
[0014] In another aspect of the present invention, a system for identifying and reporting for muting events in a core network is disclosed. The system includes a processor coupled to a memory. The processor is configured to detect one or more muting events from a plurality of voice calls passing through a UPF in a core network. Further, the processor is configured to obtain one or more muting timestamps for each of the one or more muting events in the voice call of the plurality of voice calls. Further, the processor is configured to obtain a timestamp flow creation and a session identification for the one or more muting events from the UPF for each of the muting events. Further, the processor is configured to obtain a record corresponding to the timestamp flow creation and the session identification from an SMF module for each of the muting events. Further, the processor is configured to obtain a cellular identification of each UE participating in the voice call from the SMF module. Further, the processor is configured to obtain party details from an IMS for each identified muting event. Further, the processor is configured to determine a cell identity, a muting duration, and party KPIs with muting times for each identified muting event. Further, the processor is configured to generate a report comprising a region summary and a user level summary based on the timestamp flow creation, the one or more muting timestamps, the session identification, the cellular identification, and the determination of the cell identity, the muting duration, and the party KPIs.
[0015] In another aspect of the present invention, a non-transitory computer-readable medium having stored thereon computer-readable instructions that, when executed by a processor, cause the processor to: detect one or more muting events from a plurality of voice calls passing through a UPF in a core network; obtain one or more muting timestamps for each of the one or more muting events in the voice call of the plurality of voice calls; obtain a timestamp flow creation and a session identification for the one or more muting events from the UPF for each of the muting events; obtain a record corresponding to the timestamp flow creation and the session identification from an SMF module for each of the muting events; obtain a cellular identification of each UE participating in the voice call from the SMF module; obtain party details from an IMS for each identified muting event; determine a cell identity, a muting duration, and party KPIs with muting times for each identified muting event; and generate a report comprising a region summary and a user level summary based on the timestamp flow creation, the one or more muting timestamps, the session identification, the cellular identification, and the determination of the cell identity, the muting duration, and the party KPIs.
[0016] Other features and aspects of this invention will be apparent from the following description and the accompanying drawings. The features and advantages described in this summary and in the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the relevant art, in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the inventive subject matter, resort to the claims being necessary to determine such inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
[0018] FIG. 1 is an exemplary block diagram of an environment for automatically identifying and mitigating call muting events in a core network, according to various embodiments of the present disclosure.
[0019] FIG. 2 is a block diagram of a system of FIG. 1, according to various embodiments of the present disclosure.
[0020] FIG. 3 is an example schematic representation of the system of FIG. 1 in which the operations of various entities are explained, according to various embodiments of the present disclosure.
[0021] FIG. 4 shows a block diagram of an example system for automatically identifying and mitigating the call muting events in the core network, in accordance with an exemplary embodiment of the present disclosure.
[0022] FIG. 5 shows a sequence flow diagram illustrating a method for automatically identifying and mitigating call muting events in the core network, according to various embodiments of the present disclosure.
[0023] FIG. 6 illustrates an example flow chart of a method for automatically identifying and mitigating the call muting events in a core network, in accordance with another embodiment of the present disclosure.
[0024] FIG. 7 is an example sequence diagram illustrating a method for automatically identifying and mitigating the call muting events in the core network, in accordance with another embodiment of the present disclosure.
[0025] Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not necessarily have been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved, to help improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention, so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
[0026] The foregoing shall be more apparent from the following detailed description of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[0027] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
[0028] Various modifications to the embodiment will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure, including the definitions listed herein below, is not intended to be limited to the embodiments illustrated, but is to be accorded the widest scope consistent with the principles and features described herein.
[0029] A person of ordinary skill in the art will readily ascertain that the illustrated steps detailed in the figures and here below are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0030] Before discussing example embodiments in more detail, it is to be noted that the drawings are to be regarded as schematic representations, and elements are not necessarily shown to scale. Rather, the various elements are represented such that their function and general purpose become apparent to a person skilled in the art. Any connection or coupling between functional blocks, devices, components, or other physical or functional units shown in the drawings or described herein may also be implemented by an indirect connection or coupling. A coupling between components may also be established over a wireless connection. Functional blocks may be implemented in hardware, firmware, software, or a combination thereof.
[0031] Further, the flowcharts provided herein describe the operations as sequential processes. Many of the operations may be performed in parallel, concurrently, or simultaneously. In addition, the order of operations may be re-arranged. The processes may be terminated when their operations are completed, but may also have additional steps not included in the figures. It should be noted that, in some alternative implementations, the functions/acts/steps noted may occur out of the order noted in the figures. For example, two operations shown in succession may, in fact, be executed substantially concurrently, or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0032] Further, although the terms first, second, etc. may be used herein to describe various elements, components, regions, layers, and/or sections, it should be understood that these elements, components, regions, layers, and/or sections should not be limited by these terms. These terms are used only to distinguish one element, component, region, layer, or section from another region, layer, or section. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the scope of the example embodiments.
[0033] Spatial and functional relationships between elements (for example, between modules) are described using various terms, including “connected,” “engaged,” “interfaced,” and “coupled.” Unless explicitly described as being “direct,” when a relationship between first and second elements is described in the description below, that relationship encompasses a direct relationship where no other intervening elements are present between the first and second elements, and also an indirect relationship where one or more intervening elements are present (either spatially or functionally) between the first and second elements. In contrast, when an element is referred to as being "directly” connected, engaged, interfaced, or coupled to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., "between," versus "directly between," "adjacent," versus "directly adjacent," etc.).
[0034] The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0035] As used herein, the singular forms “a,” “an,” and “the,” are intended to include the plural forms as well, unless the context clearly indicates otherwise. As used herein, the terms “and/or” and “at least one of” include any and all combinations of one or more of the associated listed items. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0036] Unless specifically stated otherwise, or as is apparent from the description, terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device/hardware, that manipulates and transforms data represented as physical, electronic quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission, or display devices.
[0037] The following abbreviations are used in the patent disclosure.
- 5G: 5th Generation technology standard for broadband cellular networks. Further, the N3 interface is defined between the UPF and the eNodeB in the 5G architecture.
- eNB: eNodeB
- eNodeB: Evolved Node B
- N3: The interface defined between the UPF and the eNodeB in the 5G architecture, carrying user plane traffic.
- NIC: Network Interface Card.
- N6: The N6 interface (also called N6 reference point or N6 LAN) is the demarcation point for traffic between mobile devices and the internet in the 5G network. It provides connectivity between the User Plane Function (UPF) and any other external (or internal) networks or service platforms, such as the Internet, the public cloud, or private clouds.
- SMF: Session Management Function
- PCF: Policy Control Function
- IMS: IP Multimedia Subsystem
- AI/ML: Artificial Intelligence (AI) and Machine Learning (ML)
[0038] Various embodiments of the invention provide a method of identifying and reporting for muting events in a core network. The method includes detecting, by a processor, one or more muting events from a plurality of voice calls passing through a UPF in a core network. Further, the method includes obtaining, by the processor, the one or more muting timestamps for each of the one or more muting events in the voice call of the plurality of voice calls. Further, the method includes obtaining, by the processor, a timestamp flow creation and a session identification for the one or more muting events from the UPF for each of the muting events. Further, the method includes obtaining, by the processor, a record corresponding to the timestamp flow creation and the session identification from an SMF module for each of the muting events. Further, the method includes obtaining, by the processor, a cell identification of each UE participating in the voice call from the SMF module. Further, the method includes obtaining, by the processor, party details from an IMS for each identified muting event. Further, the method includes determining, by the processor, a muting duration and party KPIs with muting times for each identified muting event. Further, the method includes generating, by the processor, a report comprising a region summary and a user level summary based on the timestamp flow creation, the one or more muting timestamps, the session identification, the cell identification, and the determination of the muting duration and the party KPIs.
[0039] Various embodiments of the invention provide a system and a method for automatically identifying and mitigating call muting in a core network. The present invention provides a comprehensive method for detecting a muting event in the core network for an ongoing call and mitigating the same. The method comprises identifying all “Muting Timestamps” (e.g., the Muting Timestamp Array) to recover the number of muting events for all calls, and capturing key performance indicators (KPIs) of the network comprising the timestamp, cell identity, muting duration, and sender/receiver party for each muting call event. Further, the system automatically generates a report on the call muting event and provides a user level summary to the end user.
[0040] FIG. 1 illustrates an exemplary block diagram of an environment (100) for automatically identifying and mitigating call muting events in a core network, according to various embodiments of the present disclosure. The environment (100) comprises a plurality of user equipments (UEs) 102-1, 102-2, ..., 102-n. At least one UE (102-n) from the plurality of UEs (102-1, 102-2, ..., 102-n) is configured to connect to a User Plane Function (UPF) (108) and a system (110) via the communication network (106). Hereafter, the plurality of UEs or one or more UEs are labelled 102.
[0041] In accordance with yet another aspect of the exemplary embodiment, the plurality of UEs (102) may be wireless devices or communication devices that may be a part of the environment (100). The wireless device or the UE (102) may include, but is not limited to, a handheld wireless communication device (e.g., a mobile phone, a smart phone, a phablet device, and so on), a wearable computer device (e.g., a head-mounted display computer device, a head-mounted camera device, a wristwatch computer device, and so on), a laptop computer, a tablet computer, or another type of portable computer, a media playing device, a portable gaming system, and/or any other type of computer device with wireless communication or VoIP capabilities. In an embodiment, the UEs may include, but are not limited to, any electrical, electronic, electro-mechanical equipment, or a combination of one or more of the above devices, such as virtual reality (VR) devices, augmented reality (AR) devices, a laptop, a general-purpose computer, a desktop, a personal digital assistant, a tablet computer, a mainframe computer, or any other computing device, wherein the computing device may include one or more in-built or externally coupled accessories including, but not limited to, a visual aid device such as a camera, an audio aid, a microphone, a keyboard, and input devices for receiving input from a user such as a touch pad, a touch-enabled screen, an electronic pen, and the like. It may be appreciated that the UEs may not be restricted to the mentioned devices, and various other devices may be used. A person skilled in the art will appreciate that the plurality of UEs (102) may include a fixed landline, or a landline with an assigned extension, within the communication network (106).
[0042] The plurality of UEs (102) may comprise a memory such as a volatile memory (e.g., RAM), a non-volatile memory (e.g., disk memory, FLASH memory, EPROMs, etc.), an unalterable memory, and/or other types of memory. In one implementation, the memory might be configured or designed to store data. The data may pertain to attributes and access rights specifically defined for the plurality of UEs (102). The UE (102) may be accessed by the user to receive the requests related to an order determined by the system (110) or the UPF (108). The communication network (106) may use one or more communication interfaces/protocols such as, for example, Voice over Internet Protocol (VoIP), 802.11 (Wi-Fi), 802.15 (including Bluetooth™), 802.16 (Wi-Max), 802.22, cellular standards such as Code Division Multiple Access (CDMA), CDMA2000, Wideband CDMA (WCDMA), Radio Frequency Identification (e.g., RFID), infrared, laser, near field magnetics, etc.
[0043] The UE (102) may also have an application embedded in a memory of the UE (102). The application may send or extract/retrieve information from the User Plane Function (108), which is connected to the UE (102) through the communication network (106).
[0044] The UPF (108) is communicatively coupled to a server (104) and the system (110) via the communication network (106). The server (104) can be, for example, but not limited to, a standalone server, a server blade, a server rack, an application server, a bank of servers, a business telephony application server (BTAS), a server farm, a cloud server, an edge server, a home server, a virtualized server, one or more processors executing code to function as a server, or the like. In an implementation, the server (104) may operate at various entities or a single entity (including, but not limited to, a vendor side, a service provider side, a network operator side, a company side, an organization side, a university side, a lab facility side, a business enterprise side, a defence facility side, or any other facility) that provides service.
[0045] The communication network (106) includes, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, or some combination thereof. The communication network (106) may include, but is not limited to, a Third Generation (3G), a Fourth Generation (4G), a Fifth Generation (5G), a Sixth Generation (6G), a New Radio (NR), a Narrow Band Internet of Things (NB-IoT), an Open Radio Access Network (O-RAN), and the like.
[0046] The communication network (106) may also include, by way of example but not limitation, at least a portion of one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, process, or a combination thereof, one or more messages, packets, signals, waves, voltage or current levels, some combination thereof, or so forth.
[0047] One or more network elements can be, for example, but not limited to, a base station that is located in the fixed or stationary part of the communication network (106). The base station may correspond to a remote radio head, a transmission point, an access point or access node, a macro cell, a small cell, a micro cell, a femto cell, or a metro cell. The base station enables transmission of radio signals to the UE or mobile transceiver. Such a radio signal may comply with radio signals as, for example, standardized by the 3GPP or, generally, in line with one or more of the above listed systems. Thus, a base station may correspond to a NodeB, an eNodeB, a Base Transceiver Station (BTS), an access point, a remote radio head, or a transmission point, which may be further divided into a remote unit and a central unit.
[0048] 3GPP: The term “3GPP” refers to the 3rd Generation Partnership Project, a collaborative project between a group of telecommunications associations with the initial goal of developing globally applicable specifications for Third Generation (3G) mobile systems. The 3GPP specifications cover cellular telecommunications technologies, including radio access, core network, and service capabilities, which provide a complete system description for mobile telecommunications. The 3GPP specifications also provide hooks for non-radio access to the core network, and for networking with non-3GPP networks.
[0049] The system (110) may include one or more processors (202) coupled with a memory (204), wherein the memory (204) may store instructions which, when executed by the one or more processors (202), may cause the system (110) to execute requests in the communication network (106) or the server (104). An exemplary representation of the system (110) for such purpose, in accordance with embodiments of the present disclosure, is shown in FIG. 2 as the system (110). In an embodiment, the system (110) may include one or more processor(s) (202). The one or more processor(s) (202) may be implemented as one or more microprocessors, microcomputers, microcontrollers, edge or fog microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that process data based on operational instructions. Among other capabilities, the one or more processor(s) (202) may be configured to fetch and execute computer-readable instructions stored in the memory (204) of the system (110). The memory (204) may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service.
[0050] The system (110) is communicably coupled to the remote server (104) and each UE of the plurality of UEs (102) via the communication network (106). The remote server (104) is configured to execute the requests in the communication network (106).
[0051] The system (110) is adapted to be embedded within the remote server (104) or embedded as an individual entity. The system (110) is designed to provide a centralized and unified view of data and facilitate efficient business operations. The system (110) is authorized to update/create/delete one or more parameters of the relationship between the requests for the workflow, which is reflected in real time independent of the complexity of the network.
[0052] In another embodiment, the system (110) may include an enterprise provisioning server (for example), which may connect with the remote server (104). The enterprise provisioning server provides flexibility for enterprises, e-commerce, and finance to update/create/delete information related to the requests in real time as per their business needs. A user with administrator rights can access and retrieve the requests for the workflow and perform real-time analysis in the system (110).
[0053] The system (110) may include, by way of example but not limitation, one or more of a standalone server, a server blade, a server rack, a bank of servers, a business telephony application server (BTAS), a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, one or more processors executing code to function as a server, one or more machines performing server-side functionality as described herein, at least a portion of any of the above, or some combination thereof. In an implementation, the system (110) may operate at various entities or a single entity (for example, including, but not limited to, a vendor side, a service provider side, a network operator side, a company side, an organization side, a university side, a lab facility side, a business enterprise side, an e-commerce side, a finance side, a defence facility side, or any other facility) that provides service.
[0054] However, for the purpose of description, the system (110) is described as an integral part of the remote server (104), without deviating from the scope of the present disclosure. Operational and construction features of the system (110) will be explained in detail with respect to the following figures.
[0055] FIG. 2 illustrates a block diagram of the system (110) provided for automatically identifying and mitigating call muting events in the core network, according to one or more embodiments of the present invention. The system (110) can be, for example, but not limited to, a troubleshooting platform (450) (as explained in FIG. 4). As per the illustrated embodiment, the system (110) includes the one or more processors (202), the memory (204), an input/output interface unit (206), a display (208), an input device (210), and a centralized database (or database) (214). The one or more processors (202), hereinafter referred to as the processor (202), may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions. As per the illustrated embodiment, the system (110) includes one processor. However, it is to be noted that the system (110) may include multiple processors as per the requirement, without deviating from the scope of the present disclosure.
[0056] The information related to the request may be provided or stored in the memory (204) of the system (110). Among other capabilities, the processor (202) is configured to fetch and execute computer-readable instructions stored in the memory (204). The memory (204) may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory (204) may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0057] In an embodiment, the system (110) may include an interface(s). The interface(s) may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as input/output (I/O) devices, storage devices, and the like. The interface(s) may facilitate communication for the system. The interface(s) may also provide a communication pathway for one or more components of the system. Examples of such components include, but are not limited to, processing unit/engine(s) and a database. The processing unit/engine(s) may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s).
[0058] The information related to the requests may further be configured to render on the user interface (206). The user interface (206) may include functionality similar to at least a portion of functionality implemented by one or more computer system interfaces such as those described herein and/or generally known to one having ordinary skill in the art. The user interface (206) may be rendered on the display (208), implemented using Liquid Crystal Display (LCD) display technology, Organic Light-Emitting Diode (OLED) display technology, and/or other types of conventional display technology. The display (208) may be integrated within the system (110) or connected externally. Further the input device(s) (210) may include, but not limited to, keyboard, buttons, scroll wheels, cursors, touchscreen sensors, audio command interfaces, magnetic strip reader, optical scanner, etc.
[0059] The centralized database (214) may be communicably connected to the processor (202) and the memory (204). The centralized database (214) may be configured to store and retrieve the requests pertaining to features, services, or workflows of the system (110), access rights, attributes, an approved list, and authentication data provided by an administrator. Further, the remote server (104) may allow the system (110) to update/create/delete one or more parameters of the information related to the request, which provides flexibility to roll out multiple variants of the request as per business needs. In another embodiment, the centralized database (214) may be outside the system (110) and communicate through a wired medium or a wireless medium.
[0060] Further, the processor (202), in an embodiment, may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor (202). In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor (202) may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor (202) may comprise a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the memory (204) may store instructions that, when executed by the processing resource, implement the processor (202). In such examples, the system (110) may comprise the memory (204) storing the instructions and the processing resource to execute the instructions, or the memory (204) may be separate but accessible to the system (110) and the processing resource. In other examples, the processor (202) may be implemented by electronic circuitry.
[0061] In order for the system (110) to identify and report muting events in the core network, the processor (202) includes a muting events controller (216). The muting events controller (216) may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor (202). In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor (202) may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor (202) may comprise a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the memory (204) may store instructions that, when executed by the processing resource, implement the processor. In such examples, the system (110) may comprise the memory (204) storing the instructions and the processing resource to execute the instructions, or the memory (204) may be separate but accessible to the system (110) and the processing resource. In other examples, the processor (202) may be implemented by electronic circuitry.
[0062] In an example embodiment, the processor (202) detects one or more muting events from a plurality of voice calls passing through the UPF (108) in the core network. The muting events are detected based on packet loss, packet delay, frequent radio link failures, high Radio Resource Control (RRC) connection reestablishment, a handover oscillation, or frequent handovers. In an example, the processor (202) determines the “Muting Timestamp Array” to recover the number of muting events for that call, as well as each individual muting timestamp. This individual muting timestamp from the UPF (108) becomes an outer limit for the timestamp used when searching the SMF Packet Detection Rules (PDRs) for a particular Subscription Permanent Identifier (SUPI). A session ID from the UPF (108) and the SMF module (420) are matched, and the PDRs of the SMF are fetched by the system (110).
[0063] The processor (202) checks the “Request Timestamp” of subsequent PDRs until it reaches a PDR whose “Request Timestamp” is greater than the individual muting timestamp recovered from the UPF SDR “Muting Timestamp Array”. The processor (202) then selects the previous PDR for the same SUPI (the PDR for the same SUPI which is immediately before in sequence, where the Request Timestamp is less than or equal to the muting timestamp from the UPF (108)). The processor (202) records the “UE Location Present” field from this PDR, which denotes the Cell ID on which the user of the UE was present when the muting was detected.
[0064] In an example, consider the following for SUPI value “A” at both the UPF (108) and the SMF module (420) (only limited fields shown).
UPF SDR – “Session-ID: ABC”, “Timestamp Flow Creation = 13:23:15.170”, “Muting Timestamp Array = [13:23:57.345, 13:24:19.544]”
SMF PDR1 – “Session-ID: ABC”, “Request Timestamp = 13:23:15.123”, “UE Location Present = 0007b33”
SMF PDR2 – “Session-ID: ABC”, “Request Timestamp = 13:23:45.918”, “UE Location Present = 0007b34”
[0065] In the above example, the 1st muting at 13:23:57.345 will match SMF PDR2 with Cell ID “0007b34”, and the 2nd muting at 13:24:19.544 will match Cell ID “0007b35” in a subsequent SMF PDR3 (not shown in the limited fields above) for the call of SUPI A in an uplink direction, where the call originating cell is “0007b33” for matching with total calls. Basically, the processor (202) acts as a correlation point that receives muting data from multiple sources. The multiple sources are combined so as to improve the accuracy of identifying and reporting the muting events.
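By way of a non-limiting illustration, the correlation of paragraphs [0062]-[0065] may be summarised in executable form. The following is a minimal Python sketch under stated assumptions: the record classes and field names (UpfSdr, SmfPdr, ue_location_present, and so on) are illustrative stand-ins, not actual UPF or SMF schemas.

from dataclasses import dataclass
from datetime import time
from typing import List, Optional

@dataclass
class SmfPdr:
    session_id: str
    request_timestamp: time   # "Request Timestamp" of the PDR
    ue_location_present: str  # Cell ID on which the UE was present

@dataclass
class UpfSdr:
    session_id: str
    timestamp_flow_creation: time
    muting_timestamp_array: List[time]

def cells_for_muting_events(sdr: UpfSdr, pdrs: List[SmfPdr]) -> List[Optional[str]]:
    """For each muting timestamp in the UPF SDR, select the latest SMF PDR
    of the same Session-ID whose Request Timestamp is less than or equal to
    the muting timestamp, and report its UE Location Present (Cell ID)."""
    session_pdrs = sorted(
        (p for p in pdrs if p.session_id == sdr.session_id),
        key=lambda p: p.request_timestamp,
    )
    cells: List[Optional[str]] = []
    for mute_ts in sdr.muting_timestamp_array:
        candidate = None
        for pdr in session_pdrs:
            if pdr.request_timestamp <= mute_ts:
                candidate = pdr   # latest PDR not newer than the muting event
            else:
                break             # PDRs are sorted; the rest are newer
        cells.append(candidate.ue_location_present if candidate else None)
    return cells

# Reproducing the example of paragraph [0064] (PDR3 omitted, as above):
sdr = UpfSdr("ABC", time(13, 23, 15, 170000),
             [time(13, 23, 57, 345000), time(13, 24, 19, 544000)])
pdrs = [SmfPdr("ABC", time(13, 23, 15, 123000), "0007b33"),
        SmfPdr("ABC", time(13, 23, 45, 918000), "0007b34")]
print(cells_for_muting_events(sdr, pdrs))  # ['0007b34', '0007b34']
# With a PDR3 whose Request Timestamp lies between the two mutings, the
# second entry would instead resolve to that PDR's cell, e.g. "0007b35".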
[0066] Further, the processor (202) obtains the one or more muting timestamps, such as, for example, the muting timestamp arrays, for each of the one or more muting events in the voice call of the plurality of voice calls from the UPF (108). The UPF (108) detects the muting events locally, and when requested by the system (110) (e.g., the troubleshooting platform or the like), the system (110) receives the “muting timestamp array” of calls in a Streaming Data Record (SDR). The muting timestamp arrays refer to a data structure or array that stores timestamps indicating when muting events occurred. The muting timestamp array includes a muting event, a timestamp, and a timestamp array. The muting event refers to an event where audio or video transmission is intentionally muted, often initiated by a user action or automated system response. The timestamp is a record of the exact date and time when a particular event occurs. It helps in tracking and synchronizing events in chronological order. The timestamp array is a data structure that stores multiple timestamps, typically in a sequential order, allowing for easy retrieval and manipulation of time-related data. The muting timestamp array is the array of timestamps at which the muting events occurred in the call. The muting timestamp array field is optional and present only if “muting detected” is set to True. Further, the number of timestamps present will vary from call to call depending upon the number of mutings detected, and is limited to a maximum given by a configurable field in the UPF (108).
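A minimal sketch of such a record structure is given below, assuming illustrative field names and an arbitrary cap; per the paragraph above, the array is present only when muting is detected, and its length is limited by a configurable field in the UPF (108).

from dataclasses import dataclass
from typing import List, Optional

MAX_MUTINGS_PER_CALL = 16  # illustrative value; in practice a configurable UPF field

@dataclass
class MutingRecord:
    session_id: str
    muting_detected: bool = False
    # Optional field: present only if muting_detected is set to True.
    muting_timestamp_array: Optional[List[str]] = None

    def record_muting(self, timestamp: str) -> None:
        if self.muting_timestamp_array is None:
            self.muting_detected = True
            self.muting_timestamp_array = []
        if len(self.muting_timestamp_array) < MAX_MUTINGS_PER_CALL:
            self.muting_timestamp_array.append(timestamp)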
[0067] Further, the processor (202) obtains the timestamp flow creation and the session identification for the one or more muting events from the UPF (108) for each of the muting events. The timestamp flow creation means the timestamp at which the dedicated SDF was created (in the form MM dd yyyy HH mm ss SSS). This field is also present in the UPF's SDR along with the timestamp array. The session identification refers to the ability to uniquely identify and associate a specific communication session where muting or unmuting actions occur. This identification is crucial for various purposes, including analytics, troubleshooting, and managing user interactions within the communication system. The session identification is determined by a session identifier (ID) assignment and event logging details. Further, the processor (202) obtains the record (e.g., the PDR or the like) corresponding to the timestamp flow creation and the session identification from the SMF module (420) for each of the muting events. Further, the processor (202) obtains the cellular identification of each UE (102) participating in the voice call from the SMF module (420).
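Since the timestamp flow creation is stated to be in the form MM dd yyyy HH mm ss SSS, it may be parsed as in the following sketch, which assumes a space-separated rendering of that pattern; the sample value is hypothetical.

from datetime import datetime

def parse_flow_creation(ts: str) -> datetime:
    # %f accepts the 3-digit millisecond part, right-padding it to microseconds.
    return datetime.strptime(ts, "%m %d %Y %H %M %S %f")

print(parse_flow_creation("06 14 2024 13 23 15 170"))
# 2024-06-14 13:23:15.170000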
[0068] Further, the processor (202) obtains party details from the IMS (440) for each identified muting event. In an example, the system (110) receives party details (e.g., call detail records (CDR) or the like) from the Converged Telephony Application Server (CTAS) via the IMS (440). The CDR includes the call receiver party number and details. Further, the CDR includes the number (e.g., IMSI, IMEI, MSISDN, or the like), a call start time, a call end time, a call status (e.g., answered, disconnected, busy, or the like), cell IDs (e.g., start of call, end of call, or the like), and a call release reason (CRR). Further, the processor (202) determines the cell identity, the muting duration, and the party KPIs with muting times for each identified muting event. The party KPIs can be, for example, but not limited to, muting occurrence, worst cells, handover analysis, region-wise information, spectrum-wise information, or the like. Further, the processor (202) generates the report including a region summary and a user level summary based on the timestamp flow creation, the muting timestamp array, the session identification, the cellular identification, and the determination of the cell identity, the muting duration, and the party KPIs. The region summary typically refers to a concise overview or report that summarizes the muting events or activities across different geographical regions or areas within the communication network (106). In other words, the region summary aggregates data related to muting events from different geographical regions. It provides a high-level view of muting activities across these regions, often in a tabular or graphical format. The user level summary refers to a detailed overview or report that summarizes muting events or activities for individual users within the communication network (106). The user level summary provides insights into how each user interacts with muting features during their participation in voice or video calls. The mitigation for the muting events is performed with the help of the analytics report generated by the troubleshooting platform.
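A minimal sketch of the report generation is shown below; the event keys (region, user, cell_id, muting_duration_ms) are illustrative assumptions about the correlated data, not a defined schema.

from collections import defaultdict

def build_report(events):
    """Aggregate correlated muting events into the region summary and the
    user level summary described above."""
    region_summary = defaultdict(lambda: {"events": 0, "total_mute_ms": 0})
    user_summary = defaultdict(list)
    for e in events:
        region = region_summary[e["region"]]
        region["events"] += 1
        region["total_mute_ms"] += e["muting_duration_ms"]
        user_summary[e["user"]].append(
            {"cell_id": e["cell_id"], "duration_ms": e["muting_duration_ms"]}
        )
    return {"region_summary": dict(region_summary),
            "user_level_summary": dict(user_summary)}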
[0069] Further, the processor (202) identifies a call hold status of the voice call based on a direction of the muting detected. The direction of muting can be, for example, but not limited to, an uplink direction and a downlink direction. The direction of muting is stored in a field in the SDR. The call hold status is identified because muting event detection will not be performed for held calls, to avoid false indications.
[0070] Further, the processor (202) transmits streaming data records (SDR) upon termination of the voice call. The SDR also includes fields such as, for example, a user identity (e.g., Subscription Permanent Identifier (SUPI), Generic Public Subscription Identifier (GPSI), Permanent Equipment Identifier (PEI), or the like), a UPF ID (an identifier of the UPF instance), a timestamp flow deletion, a call hold counter, a call hold timestamp, and a threshold which determines the maximum number of muting events that can be detected in the call.
[0071] Further, the processor (202) correlates a direction of the streaming data records (SDR) from each of the UEs (102) for the muting events in the voice calls.
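Paragraphs [0069]-[0071] may be illustrated with the following sketch; the SDR is modelled as a plain dictionary with assumed keys (call_hold, direction), and the pairing rule is a simplified reading of the correlation, not a definitive implementation.

def correlate_sdr_directions(sdrs):
    """Discard SDRs flagged as call hold (to avoid false muting indications)
    and group the remainder by session and muting direction, so that the
    uplink record of one UE can be matched with the downlink record of the
    other party."""
    grouped = {}
    for sdr in sdrs:
        if sdr.get("call_hold"):
            continue  # muting detection is not performed for held calls
        key = (sdr["session_id"], sdr["direction"])  # direction: "UL" or "DL"
        grouped.setdefault(key, []).append(sdr)
    return grouped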
[0072] Further, the processor (202) determines a threshold for detecting the muting events in the voice call. The threshold determines the maximum number of muting events that can be detected in the call.
[0073] Further, the processor (202) obtains a timestamp flow deletion from the SMF module (420), wherein the timestamp flow deletion indicates a timestamp at which the dedicated SDF is deleted. The Service Data Flow (SDF) describes a structured pathway through which data related to muting events or actions (or any service that is provided to a subscriber and is present in the SDF) is processed, stored, and utilized within communication or telecommunication services. In other words, the SDF describes a path and processes the data packets within the communication network (106). The SDF provides a structured representation of how data is transmitted and processed between network elements or network functions to deliver a specific service or an application. The network elements or network functions (NFs) can be, for example, but not limited to, the UPF (108), the SMF module (420), the PCF module (430), etc. It ensures effective management of the muting event data for operational insights and service optimization.
[0074] The above operations are explained from the perspective of the processor (202), but the same operations may be handled by the muting events controller (216).
[0075] FIG. 3 is an example schematic representation of the system (300) of FIG. 1 in which the operations of various entities are explained, according to various embodiments of the present disclosure. It is to be noted that the embodiment with respect to FIG. 3 will be explained with respect to the first UE (102-1) and the system (110) for the purpose of description and illustration, and should nowhere be construed as limiting the scope of the present disclosure.
[0076] As mentioned earlier, the first UE (102-1) includes one or more primary processors (305) communicably coupled to the one or more processors (202) of the system (110). The one or more primary processors (305) are coupled with a memory (310) storing instructions which are executed by the one or more primary processors (305). Execution of the stored instructions by the one or more primary processors (305) enables the UE (102-1) to execute the requests in the communication network (106).
[0077] As mentioned earlier, the one or more processors (202) are configured to transmit a response content related to the request to the UE (102-1). More specifically, the one or more processors (202) of the system (110) are configured to transmit the response content from a kernel (315) to at least one of the UEs (102-1). The kernel (315) is a core component serving as the primary interface between hardware components of the UE (102-1) and the system (110). The kernel (315) is configured to provide the plurality of response contents hosted on the system (110) to access resources available in the communication network (106). The resources include one or more of a Central Processing Unit (CPU) and memory components such as Random Access Memory (RAM) and Read Only Memory (ROM).
[0078] As per the illustrated embodiment, the system (110) includes the one or more processors (202), the memory (204), the input/output interface unit (206), the display (208), and the input device (210). The operations and functions of the one or more processors (202), the memory (204), the input/output interface unit (206), the display (208), and the input device (210) are already explained in FIG. 2. Further, the processor (202) includes the muting events controller (216). The operations and functions of the muting events controller (216) are also already explained in FIG. 2. For the sake of brevity, the repeated information is not explained again in the patent disclosure.
[0079] FIG. 4 shows a block diagram of an example system (400) for automatically identifying and mitigating the call muting events in the core network, in accordance with an exemplary embodiment of the present disclosure. The system (400) comprises the UE (102) of a user and an Evolved Node B (eNB) (410) connecting the user equipment via an air interface to the core network, including a User Plane Function (UPF) (108), via the N3 interface. The User Plane Function (UPF) (108) is responsible for packet forwarding, traffic management, and quality of service (QoS) enforcement. The system further comprises a Session Management Function (SMF) module (420) connected with the core network through an N4 interface, a Policy Control Function (PCF) module (430) connected with the SMF module (420) through an N7 interface, an IP Multimedia Subsystem (IMS) (440) connected with the core network through an N6 interface, and a troubleshooting platform (450) communicably connected with the UPF (108), the SMF module (420), and the IP Multimedia Subsystem (IMS) (440) through integration with a Northbound Interface (NBI) (460).
[0080] In an embodiment of the present subject matter, the troubleshooting platform (450) identifies a plurality of “Muting Timestamp Arrays” to recover the number of muting events for all calls from the UPF (108). The muting timestamp array refers to a data structure or array that stores timestamps indicating when muting events occurred. The muting timestamp array includes a muting event, a timestamp, and a timestamp array. The muting event refers to an event where audio or video transmission is intentionally muted, often initiated by a user action or automated system response. The timestamp is a record of the exact date and time when a particular event occurs. It helps in tracking and synchronizing events in chronological order. The timestamp array is a data structure that stores multiple timestamps, typically in a sequential order, allowing for easy retrieval and manipulation of time-related data. The troubleshooting platform (450) further obtains the “Timestamp Flow Creation” and the session ID of the mute call from the UPF (108) for each identified muting call. The troubleshooting platform (450) further obtains a record corresponding to the “Timestamp Flow Creation” and the session ID from the SMF module (420) for each identified muting call. The troubleshooting platform (450) further obtains the cell ID from the SMF module (420) for each identified muting call. The troubleshooting platform (450) further obtains information of the B-party for each identified muting call, which includes details such as the telecom operator, the B-party number, and the equipment model, from a Converged Telephony Application Server (CTAS) via the IMS (440). Further, the troubleshooting platform (450) determines the cell identity, the muting duration, and the sender/receiver party KPIs with muting times for each identified muting call, and generates a report of call muting for each cell identity. The troubleshooting platform (450) further generates a presence across nation (PAN) summary and a user level summary and provides them to the end user.
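The overall correlation performed by the troubleshooting platform (450) over the NBI (460) may be sketched as follows; the client objects and their methods (upf.get_sdr, smf.get_pdrs, ims.get_cdr) are hypothetical placeholders for the respective integrations, not a real API.

def correlate_muting_call(upf, smf, ims, session_id):
    sdr = upf.get_sdr(session_id)            # muting timestamp array, flow creation
    pdrs = smf.get_pdrs(session_id, sdr.timestamp_flow_creation)
    cells = cells_for_muting_events(sdr, pdrs)  # per-muting Cell IDs (sketch above)
    b_party = ims.get_cdr(session_id)        # operator, B-party number, equipment model
    return {"session_id": session_id,
            "mutings": sdr.muting_timestamp_array,
            "cells": cells,
            "b_party": b_party}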
[0081] In another embodiment of the present subject matter, the system may comprise an AI/ML module for automatically identifying the call muting event and mitigating the same. The AI/ML module may implement all functions of the troubleshooting platform (450) in the above embodiments and generate the report of call muting for each cell identity. The AI/ML module may automatically generate the presence across nation (PAN) summary and the user level summary and provide them to the end user. The AI/ML module aggregates data from call muting events across different geographical regions (nations). This involves collecting information such as the number of mute/unmute actions, the duration of muting, and the frequency of muting events from sessions originating in various locations. Using machine learning algorithms, the AI/ML module identifies patterns and trends in muting behavior across nations. For example, it may detect that users in certain regions mute their microphones more frequently during specific times of day or during particular types of meetings. Based on the aggregated data and the identified patterns, the AI/ML module generates a PAN summary. This summary may include comparative metrics between nations, highlight differences in muting behavior, and provide insights into regional preferences or cultural norms related to muting in communication sessions. The PAN summary may be presented to end users through visualizations such as charts, graphs, or geographical heatmaps, making it easier for users to understand and interpret the data at a glance.
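As a minimal sketch only, and not the disclosed implementation, the regional aggregation behind such a PAN summary can be expressed as a group-by over the hypothetical records sketched above; the region mapping (here, a cell ID prefix) is purely an assumption for illustration:

```python
from collections import defaultdict

def pan_summary(events):
    """Aggregate hypothetical MutingEventRecord objects per region.

    region_of() is a stand-in for a real cell-to-region mapping; it
    simply treats the cell ID prefix as the region label.
    """
    def region_of(cell_id: str) -> str:
        return cell_id[:3] or "unknown"

    summary = defaultdict(lambda: {"calls": 0, "mute_events": 0, "mute_seconds": 0.0})
    for ev in events:
        row = summary[region_of(ev.cell_id)]
        row["calls"] += 1
        row["mute_events"] += len(ev.mute_intervals)
        row["mute_seconds"] += ev.muting_duration_seconds()
    return dict(summary)
```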
[0082] In another embodiment of the present subject matter, the UPF (108) may implement all functions of the troubleshooting platform (450) in the above embodiments and generate the report of call muting for each cell identity. The UPF (108) may automatically generate the presence across nation (PAN) summary and the user level summary and provide them to the end user. In another embodiment of the present subject matter, the AI/ML module may be embedded in the UPF (108).
[0083] Further, in accordance with another aspect, the UPF (108) may be embedded, by way of example but not limitation, in one or more of a standalone server, a server blade, a server rack, a bank of servers, a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, one or more processors executing code to function as a server, one or more machines performing server-side functionality as described herein, at least a portion of any of the above, or some combination thereof.
[0084] In one embodiment of the present invention, the server may include, by way of example but not limitation, one or more of a standalone server, a server blade, a server rack, a bank of servers, a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, one or more processors executing code to function as a server, one or more machines performing server-side functionality as described herein, at least a portion of any of the above, or some combination thereof. In an embodiment, the entity may include, but is not limited to, a vendor, a network operator, a company, an organization, a university, a lab facility, a business enterprise, a defense facility, or any other facility that provides content.
[0085] FIG. 5 is a flow chart (500) illustrating a method for automatically identifying and mitigating call muting events in the core network, according to various embodiments of the present system.
[0086] At step 502, the method includes detecting the one or more muting events from the plurality of voice calls passing through the UPF (108) in the core network. At step 504, the method includes obtaining the muting timestamp array for each of the one or more muting events in the voice call of the plurality of voice calls. At step 506, the method includes obtaining the timestamp flow creation and the session identification for the one or more muting events from the UPF (108) for each of the muting events.
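By way of a non-limiting illustration of steps 502 to 504, one hedged sketch of a detector treats any gap in a call's media packet stream longer than a configurable threshold (compare the threshold of claim 5) as one muting event; this gap heuristic is an assumption for illustration, not the disclosed detection algorithm:

```python
from datetime import timedelta

def detect_mute_intervals(packet_times, threshold=timedelta(seconds=2)):
    """Illustrative gap-based muting detector (an assumption).

    packet_times: sorted datetime arrival times of the call's media
    packets as observed at the UPF. Any inter-packet gap exceeding
    `threshold` is recorded as one (mute start, mute end) interval.
    """
    intervals = []
    for prev, curr in zip(packet_times, packet_times[1:]):
        if curr - prev > threshold:
            intervals.append((prev, curr))
    return intervals
```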
[0087] At step 508, the method includes obtaining the record (e.g., a PDR or the like) corresponding to the timestamp flow creation and the session identification from the SMF module (420) for each of the muting events. At step 510, the method includes obtaining the cell identification of each UE (102) participating in the voice call from the SMF module (420). At step 512, the method includes obtaining the party details from the IMS (440) for each identified muting event.
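The correlation in steps 508 to 510 amounts to a join between the UPF-side events and the SMF-side session records keyed on the timestamp flow creation and the session identification. A minimal sketch follows, assuming the hypothetical record type above and an in-memory mapping in place of the real SMF interface:

```python
def correlate_with_smf(upf_events, smf_records):
    """Join UPF muting events with SMF session records on the
    (flow creation timestamp, session ID) key; the key and field
    names are illustrative assumptions.
    """
    for ev in upf_events:
        rec = smf_records.get((ev.flow_created_at, ev.session_id))
        if rec is not None:
            ev.cell_id = rec.get("cell_id", "")
    return upf_events
```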
[0088] At step 514, the method includes determining the muting duration and the party KPIs with muting times for each identified muting event. At step 516, the method includes generating the report including the region summary and the user level summary based on the timestamp flow creation, the muting timestamp array, the session identification, the cell identification, the determined muting duration, and the party KPIs. The mitigation for the muting events is performed with the help of the report.
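A hedged sketch of step 516 follows, reusing the illustrative helpers above; the report layout (a region rollup plus one row per session) is an assumption about one possible output, not the prescribed report format:

```python
def build_report(events):
    """Assemble a region summary and a user level summary from
    enriched MutingEventRecord objects (illustrative structure)."""
    user_level = [{
        "session_id": ev.session_id,
        "cell_id": ev.cell_id,
        "b_party": ev.b_party_number,
        "mute_events": len(ev.mute_intervals),
        "mute_seconds": ev.muting_duration_seconds(),
    } for ev in events]
    return {
        "region_summary": pan_summary(events),  # region/PAN level rollup
        "user_level_summary": user_level,
    }
```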
[0089] FIG. 6 illustrates an example flow chart (600) of a method for automatically identifying and mitigating the call muting events in the core network, in accordance with another embodiment of the present disclosure.
[0090] At step 602, the method includes identifying the plurality of "muting timestamp arrays" to recover the number of muting events for the plurality of calls. At step 604, the method includes obtaining the "timestamp flow creation" and the session ID of the muted call from the UPF (108) for each identified muting call. At step 606, the method includes obtaining the record corresponding to the "timestamp flow creation" and the session ID from the SMF module (420) for each identified muting call. At step 608, the method includes obtaining the cell ID from the SMF module (420) for each identified muting call.
[0091] At step 610, the method includes obtaining the B-party details, such as the B-party number, the telecom operator, and the equipment model, from the CTAS via the IMS (440) for each identified muting call. At step 612, the method includes determining the cell identity, the muting duration, and the sender/receiver party KPIs with muting times for each identified muting call. At step 614, the method includes generating a report of call muting for each cell identity. At step 616, the method may further comprise generating the PAN summary and the user level summary. At step 618, the method includes providing the summaries to the end user.
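By way of a non-limiting sketch of step 610, the B-party enrichment can be modeled as a lookup against the CTAS via the IMS; the ims_lookup callable and its return fields are hypothetical stand-ins for the real interface:

```python
def enrich_b_party(event, ims_lookup):
    """Attach B-party details (operator, number, equipment model)
    to a MutingEventRecord; ims_lookup is a placeholder for the
    actual CTAS/IMS query.
    """
    details = ims_lookup(event.session_id)
    event.b_party_number = details.get("number", "")
    event.b_party_operator = details.get("operator", "")
    event.b_party_equipment = details.get("equipment_model", "")
    return event
```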
[0092] FIG. 7 is an example sequence diagram (700) illustrating a method for automatically identifying the call muting in the core network in accordance with another embodiment of the present disclosure.
[0093] At step 1, the troubleshooting platform (450) gets “muting timestamp array” to recover the number of muting events for all the calls from the UPF (108). At step 2, the troubleshooting platform (450) gets the “timestamp flow creation” and the “session identification” for the muting call from the UPF (108).
[0094] At step 3, the troubleshooting platform (450) gets the record corresponding to the timestamp flow creation and the session identification from the SMF module (420). At step 4, the troubleshooting platform (450) gets the cell ID from the SMF module (420). At step 5, the troubleshooting platform (450) gets the party number and its details from the IMS (440).
[0095] At step 6, the troubleshooting platform (450) computes the cell identity, the muting duration, and the party KPIs with muting times for each identified muting call. Steps 2 to 6 are performed for each call in a loop.
[0096] At step 7, the troubleshooting platform (450) generates the summary including the region summary and the user level summary based on the timestamp flow creation, the muting timestamp array, the session identification, the cell identification, the determined muting duration, and the party KPIs. At step 8, the troubleshooting platform (450) shares the region summary and the user level summary with the user.
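Tying the sketches above together, a purely illustrative walkthrough of steps 1 to 8 with fabricated placeholder inputs (the values are for demonstration only and carry no significance):

```python
from datetime import datetime

t0 = datetime(2024, 1, 1, 10, 0, 0)
ev = MutingEventRecord(session_id="s-1", flow_created_at=t0)

# Steps 1-2: recover muting intervals from the call's packet stream.
packet_times = [t0, t0.replace(second=1), t0.replace(second=5)]
ev.mute_intervals = detect_mute_intervals(packet_times)  # one 4 s gap

# Steps 3-4: correlate with the SMF record to obtain the cell ID.
smf = {(t0, "s-1"): {"cell_id": "DEL-0042"}}
events = correlate_with_smf([ev], smf)

# Steps 6-8: compute durations/KPIs and emit the summaries.
print(build_report(events))
```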
[0097] The present invention provides a solution for detecting muting in all voice calls in the 5G core network and for mitigating the muting events in the calls, so as to provide seamless and uninterrupted voice call experiences in 5G networks. In the present method, the detection of call muting does not depend upon eNB/gNB logs and their analysis. Further, the system (110) reduces the delay for muting analysis when compared to prior methods, including Voice Measurement and Analysis (VoMA). Further, the non-availability of analysis due to sampling during handovers is avoided in the present invention.
[0098] The present invention relates to a system and a method for automatically identifying and mitigating the call muting in a core network. The system (110) may comprise an AI/ML module for automatically identifying the call muting event and mitigating the same.
[0099] A person of ordinary skill in the art will readily ascertain that the illustrated embodiments and steps in the description and drawings (FIGS. 1-7) are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[00101] The present invention offers multiple advantages over the prior art, and those listed above are a few examples that emphasize some of the advantageous features. The listed advantages are to be read in a non-limiting manner.
REFERENCE NUMERALS
[00102] Environment - 100
[00103] UEs – 102, 102-1 to 102-n
[00104] Server - 104
[00105] Communication network – 106
[00106] UPF – 108
[00107] System - 110
[00108] Processor – 202
[00109] Memory – 204
[00110] User Interface – 206
[00111] Display – 208
[00112] Input device – 210
[00113] Centralized Database – 214
[00114] Muting events controller – 216
[00115] System - 300
[00116] Primary processors – 305
[00117] Memory – 310
[00118] Kernel – 315
[00119] System - 400
[00120] Base station – 410
[00121] SMF module – 420
[00122] PCF – 430
[00123] IMS – 440
[00124] Troubleshooting platform – 450
[00125] Northbound Interface (NBI) - 460
CLAIMS:
We Claim:
1. A method of identifying and reporting for muting events in a core network, the method comprising the steps of:
detecting, by a processor (202), one or more muting events from a plurality of voice calls passing through a user plane function (UPF) (108) in the core network;
obtaining, by the processor (202), one or more muting timestamps for each of the one or more muting events in the voice call of the plurality of voice calls;
obtaining, by the processor (202), a timestamp flow creation and a session identification for the one or more muting events from the UPF (108) for each of the muting events;
obtaining, by the processor (202), a record corresponding to the timestamp flow creation and the session identification from a Session Management Function (SMF) module (420) for each of the muting events;
obtaining, by the processor (202), a cell identification of each user equipment (UE) (102) participating in the voice call from the SMF module (420);
obtaining, by the processor (202), party details from an Internet Protocol (IP) Multimedia Subsystem (IMS) for each identified muting event;
determining, by the processor (202), muting duration, party key performance indicators (KPIs) with muting times for each identified muting event; and
generating, by the processor (202), a report comprising a region summary and a user level summary based on the timestamp flow creation, the one or more muting timestamps, the session identification, the cell identification, the determined muting duration, and the party KPIs.
2. The method as claimed in claim 1, comprising identifying, by the processor (202), a call hold status of the voice call based on the direction of the muting detected from the UPF (108).
3. The method as claimed in claim 1, comprising transmitting, by the processor (202), streaming data records (SDR) upon termination of the voice call.
4. The method as claimed in claim 1, comprising correlating a direction of the streaming data records (SDR) from each of the user equipment (102) for the muting events in the voice calls.
5. The method as claimed in claim 1, comprising determining, by the processor (202), a threshold for detecting the muting events in the voice call.
6. The method as claimed in claim 1, comprising obtaining, by the processor (202), the timestamp flow creation and a timestamp flow deletion from the SMF module (420), the timestamp flow creation indicating a timestamp at which a dedicated Service Data Flow (SDF) is created, the timestamp flow deletion indicating a timestamp at which the dedicated Service Data Flow (SDF) is deleted.
7. The method as claimed in claim 1, the step of detecting the muting events comprising:
detecting, by the processor (202), packet loss, packet delay, frequent radio link failures, high Radio Resource Control (RRC) connection reestablishment, a handover oscillation, or frequent handovers.
8. The method as claimed in claim 1, wherein the user identity comprises one of a Subscription Permanent Identifier (SUPI), a Generic Public Subscription Identifier (GPSI), and a Permanent Equipment Identifier (PEI).
9. A system (110) for identifying and reporting for muting events in a core network, the system (110) comprising:
a memory (204); and
a processor (202) coupled to the memory (204), wherein the processor (202) is configured to execute program instructions stored in the memory (204) to:
detect one or more muting events from a plurality of voice calls passing through a user plane function (UPF) (108) in the core network;
obtain one or more muting timestamps for each of the one or more muting events in the voice call of the plurality of voice calls;
obtain a timestamp flow creation and a session identification for the one or more muting events from the UPF (108) for each of the muting events;
obtain a record corresponding to the timestamp flow creation and the session identification from a Session Management Function (SMF) module (420) for each of the muting events;
obtain a cellular identification of each user equipment (UE) (102) participating in the voice call from the SMF module (420);
obtain party details from an Internet Protocol (IP) Multimedia Subsystem (IMS) for each identified muting event;
determine cell identity, muting duration, party key performance indicators (KPIs) with muting times for each identified muting event; and
generate a report comprising a region summary and a user level summary based on the timestamp flow creation, the one or more muting timestamps, the session identification, the cellular identification, and the determination of the cell identity, the muting duration, and the party KPIs.
10. The system (110) as claimed in claim 9, wherein the UPF (108) identifies a call hold status of the voice call based on a direction of the muting detected.
11. The system (110) as claimed in claim 9, wherein the UPF (108) transmits streaming data records (SDR) upon termination of the voice call.
12. The system (110) as claimed in claim 9, wherein the UPF (108) correlates a direction of the streaming data records (SDR) from each of the UE (102) for the muting events in the voice calls.
13. The system (110) as claimed in claim 9, wherein the UPF (108) determines a threshold for detecting the muting events in the voice call.
14. The system (110) as claimed in claim 9, wherein the UPF (108) obtains a timestamp flow deletion from the SMF module (420), wherein the timestamp flow deletion indicates a timestamp at which the dedicated SDF is deleted.
15. The system (110) as claimed in claim 9, wherein the muting events are detected based on packet loss, packet delay, frequent radio link failures, high Radio Resource Control (RRC) connection reestablishment, a handover oscillation, or frequent handovers.
16. The system (110) as claimed in claim 9, wherein the user identity comprises one of a Subscription Permanent Identifier (SUPI), a Generic Public Subscription Identifier (GPSI), and a Permanent Equipment Identifier (PEI).