
Method And System For Facilitating Data Modelling In Inventory Management

Abstract: The present disclosure relates to a system (125) and a method (500) for facilitating data modelling in an Inventory Management (IM) (230). The system (125) includes an external interface (220) configured to send a data modelling request to the IM (230) by a user. The system (125) further includes an AI/ML module (225) embedded in, or communicably connected to, the IM (230). The AI/ML module (225) is configured to generate at least one recommended data model. The at least one recommended data model is forwarded to the user. The IM (230) is further configured to receive an approval on the at least one recommended data model, or an amended data model, to enhance user experience and reduce time and manual effort. Ref. Fig. 2


Patent Information

Application #: 202321047038
Filing Date: 12 July 2023
Publication Number: 03/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email:
Parent Application:

Applicants

JIO PLATFORMS LIMITED
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad

Inventors

1. Aayush Bhatnagar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
2. Ankit Murarka
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
3. Rizwan Ahmad
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
4. Kapil Gill
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
5. Rahul Verma
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
6. Arpit Jain
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
7. Shashank Bhushan
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
8. Kamal Malik
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
9. Chaitanya V Mali
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
10. Supriya De
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
11. Kumar Debashish
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
12. Tilala Mehul
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad
13. Kothagundla Vinay Kumar
Office-101, Saffron, Nr. Centre Point, Panchwati 5 Rasta, Ambawadi, Ahmedabad

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003

COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. TITLE OF THE INVENTION
METHOD AND SYSTEM FOR FACILITATING DATA MODELLING IN INVENTORY MANAGEMENT
2. APPLICANT(S)
NAME NATIONALITY ADDRESS
JIO PLATFORMS LIMITED INDIAN OFFICE-101, SAFFRON, NR. CENTRE POINT, PANCHWATI 5 RASTA, AMBAWADI, AHMEDABAD 380006, GUJARAT, INDIA
3. PREAMBLE TO THE DESCRIPTION

THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE NATURE OF THIS INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.

FIELD OF THE INVENTION
[0001] The present invention generally relates to wireless communication systems, and more particularly relates to a method and system for facilitating data modelling in Inventory Management (IM).
BACKGROUND OF THE INVENTION
[0002] Inventory Management (IM) is an open, standards-based, cloud-native application that provides an intelligent inventory of communications services and resources – from 5G to Fiber – across physical and virtual network technologies.
[0003] IM is a standards-based telecommunications inventory management application that enables a user to model and manage customers, services, and resources. IM supports complex business relationships and provides full life-cycle management of services and resources.
[0004] Data modelling is the process of creating a visual representation of either a whole information system or parts of it to communicate connections between data points and structures. It is a process of creating a conceptual representation of data objects and their relationships to one another. The process of data modelling typically involves several steps, including requirements gathering, conceptual design, logical design, physical design, and implementation. During each step of the process, data modelers work with stakeholders to understand the data requirements, define the entities and attributes, establish the relationships between the data objects, and create a model that accurately represents the data in a way that can be used by application developers, database administrators, and other stakeholders.
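For illustration only, a conceptual data model of the kind described above can be expressed as entities, attributes, and relationships, as in the following minimal Python sketch; the entity and attribute names shown are hypothetical and not defined by the present disclosure:

from dataclasses import dataclass, field
from typing import List

@dataclass
class Attribute:
    name: str                                   # e.g. "serial_number"
    data_type: str                              # e.g. "string", "integer", "date"
    constraints: List[str] = field(default_factory=list)   # e.g. ["not_null", "unique"]

@dataclass
class Entity:
    name: str
    attributes: List[Attribute]
    related_to: List[str] = field(default_factory=list)    # names of related entities

# A hypothetical inventory entity for a network device
router = Entity(
    name="Router",
    attributes=[
        Attribute("serial_number", "string", ["not_null", "unique"]),
        Attribute("port_count", "integer", ["not_null"]),
    ],
    related_to=["Site", "Service"],
)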
[0005] In the prior art, while creating a data model, the user is generally required to manually feed in all the attributes, including their data types. This is time consuming, and it becomes repetitive, creating duplication of effort and of results.
[0006] There is therefore a need for a solution that overcomes the above challenges and provides a system and method for creating models more efficiently by reducing the time and manual effort required.
BRIEF SUMMARY OF THE INVENTION
[0007] One or more embodiments of the present disclosure provide a method and system for facilitating data modelling in an Inventory Management (IM).
[0008] In one aspect of the present invention, a method for facilitating data modelling in an Inventory Management (IM) is disclosed. The method includes the step of receiving, by one or more processors, a data modelling request to the IM from a user via an external interface. The method includes the step of forwarding, by the one or more processors, the data modelling request to an Artificial Intelligence (AI)/Machine Learning (ML) module to obtain at least one recommended data model. The method further includes the step of receiving, by the one or more processors, the at least one recommended data model from the AI/ML module. The method further includes the step of forwarding, by the one or more processors, the at least one recommended data model to the user. The method further includes the step of receiving, by the one or more processors, an approval on the at least one recommended data model, or an amended data model.
[0009] In one embodiment, the method further includes receiving by the one or more processors a change request to the at least one recommended data model from the user.
[0010] In another embodiment, the method further includes amending by the one or more processors the at least one recommended data model based on the change request received from the user to obtain the amended model.
[0011] In yet another embodiment, the method further includes training by the one or more processors the AI/ML model embedded in the IM, or the AI/ML module, on data models from all kinds of network inventory to predict a model/table structure like attributes, or data types, or any constraints associated with the attributes.
[0012] In yet another embodiment, the method further includes suggesting by the one or more processors at least one of the model/table structures like attribute names, or the data types, or the constraints applied on the attribute at the time of data modelling in the IM.
[0013] In another aspect of the present invention, a system for facilitating data modelling in an Inventory Management (IM) is disclosed. The system includes an external interface configured to send a data modelling request to the IM by a user. The system includes an AI/ML module embedded in, or communicably connected to, the IM. The AI/ML module is configured to generate at least one recommended data model. The at least one recommended data model is forwarded to the user. The system includes the IM, which is further configured to receive an approval on the at least one recommended data model, or an amended data model.
[0014] Other features and aspects of this invention will be apparent from the following description and the accompanying drawings. The features and advantages described in this summary and in the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the relevant art, in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the inventive subject matter, resort to the claims being necessary to determine such inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
[0016] FIG. 1 is an exemplary block diagram of an environment for facilitating data modelling in an Inventory Management (IM), according to one or more embodiments of the present disclosure;
[0017] FIG. 2 is an exemplary block diagram of a system for facilitating data modelling in the IM, according to the one or more embodiments of the present disclosure;
[0018] FIG. 3 is a schematic representation of a workflow of the present system of FIG. 1, according to the one or more embodiments of the present disclosure;
[0019] FIG. 4 is a signal flow diagram illustrating facilitating data modelling in the IM, according to the one or more embodiments of the present disclosure; and
[0020] FIG. 5 is a flow diagram illustrating a method for facilitating data modelling in an Inventory Management (IM), according to the one or more embodiments of the present disclosure.
[0021] The foregoing shall be more apparent from the following detailed description of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[0022] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
[0023] Various modifications to the embodiment will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure, including the definitions listed herein below, is not intended to be limited to the embodiments illustrated but is to be accorded the widest scope consistent with the principles and features described herein.
[0024] A person of ordinary skill in the art will readily ascertain that the illustrated steps detailed in the figures and here below are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0025] As per various embodiments depicted, the present invention discloses a system and method for facilitating data modelling in an Inventory Management (IM), reducing the time and manual effort required to create data models.
[0026] Referring to FIG. 1, FIG. 1 illustrates an exemplary block diagram of an environment 100 for facilitating data modelling in an Inventory Management (IM) 230 (as shown in FIG.2), according to one or more embodiments of the present invention. The environment 100 includes a network 105, a UE 110, a server 115, and a system 125. The UE 110 aids a user to interact with the system 125 for transmitting a data modelling request to the IM 230 from the user via an external interface 220 (as shown in FIG.2).
[0027] For the purpose of description and explanation, the description will be explained with respect to one or more UEs 110, or to be more specific, with respect to a first UE 110a, a second UE 110b, and a third UE 110c, and should nowhere be construed as limiting the scope of the present disclosure. In one embodiment, each of the first UE 110a, the second UE 110b, and the third UE 110c is one of, but is not limited to, any electrical, electronic, or electro-mechanical equipment, or a combination of one or more such devices, such as virtual reality (VR) devices, augmented reality (AR) devices, a laptop, a general-purpose computer, a desktop, a personal digital assistant, a tablet computer, a mainframe computer, or any other computing device.
[0028] Each of the first UE 110a, the second UE 110b, and the third UE 110c is further configured to host one or more applications thereon. Each of the one or more applications is adapted to include one or more applications stacks to aid in performing certain predefined activities of each of the one or more applications. The predefined activities include, but not limited to, accessing the server 115, and transmitting the data modelling request to the one or more applications via the network 105.
[0029] A person skilled in the art will appreciate that the UE 110 may include more than one processor and communication ports. The communication port(s) may be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. The communication port(s) may be chosen depending on a network, such as, but not limited to, a Local Area Network (LAN), a Wide Area Network (WAN), or any of the network to which the computer system connects.
[0030] The network 105 includes, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, or some combination thereof. The network 105 may include, but is not limited to, a Third Generation (3G), a Fourth Generation (4G), a Fifth Generation (5G), a Sixth Generation (6G), a New Radio (NR), a Narrow Band Internet of Things (NB-IoT), an Open Radio Access Network (O-RAN), and the like.
[0031] The server 115 may include, by way of example but not limitation, one or more of a standalone server, a server blade, a server rack, a bank of servers, a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, one or more processors executing code to function as a server, one or more machines performing server-side functionality as described herein, at least a portion of any of the above, or some combination thereof. In an embodiment, the entity may include, but is not limited to, a vendor, a network operator, a company, an organization, a university, a lab facility, a business enterprise, a defence facility, or any other facility that provides content.
[0032] The environment 100 further includes the system 125 communicably coupled to the server 115 and to each of the first UE 110a, the second UE 110b, and the third UE 110c via the network 105. The system 125 is adapted to be embedded within the server 115 or to be embedded as an individual entity, as per multiple embodiments of the present invention. However, for the purpose of description, the system 125 is described as an integral part of the server 115, without thereby limiting the scope of the present disclosure.
[0033] Operational and construction features of the system 125 will be explained in detail with respect to the following figures.
[0034] Referring to FIG. 2, FIG. 2 illustrates an exemplary block diagram of the system 125 for facilitating data modelling in the IM 230, according to the one or more embodiments of the present disclosure. The system 125 includes a processor 205 and a memory 210. For the purpose of description and explanation, the description will be explained with respect to one or more processors 205, or to be more specific, with respect to the processor 205, and should nowhere be construed as limiting the scope of the present disclosure. The one or more processors 205, hereinafter referred to as the processor 205, may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions.
[0035] The information related to facilitating the data modelling in the IM 230 may be provided or stored in the memory 210. As per the illustrated embodiment, the processor 205 is configured to fetch and execute computer-readable instructions stored in the memory 210. The memory 210 may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory 210 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0036] The database 240 is configured to store the data modelling request from the UE 110 to the IM 230. The database 240 is one of, but not limited to, a centralized database, a cloud-based database, a commercial database, an open-source database, a distributed database, an end-user database, a graphical database, a No-Structured Query Language (NoSQL) database, an object-oriented database, a personal database, an in-memory database, a document-based database, a time series database, a wide column database, a key value database, a search database, a cache database, and so forth. The foregoing examples of database 240 types are non-limiting and are not necessarily mutually exclusive; e.g., a database can be both commercial and cloud-based, or both relational and open-source, etc.
[0037] Further, the processor 205, in an embodiment, may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor 205. In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor 205 may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for processor 205 may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the memory 210 may store instructions that, when executed by the processing resource, implement the processor 205. In such examples, the system 125 may comprise the memory 210 storing the instructions and the processing resource to execute the instructions, or the memory 210 may be separate but accessible to the system 125 and the processing resource. In other examples, the processor 205 may be implemented by electronic circuitry.
[0038] In order for the system 125 to facilitate data modelling in the IM 230, the processor 205 includes an external interface 220, an Artificial Intelligence/Machine Learning (AI/ML) module 225, and the IM 230 communicably coupled to each other.
[0039] The external interface 220 is configured to send the data modelling request to the IM 230 by a user. In an embodiment, the external interface 220 includes, but is not limited to, a Graphical User Interface (GUI), a web user interface, a Command Line Interface (CLI), or a third-party GUI using an Application Programming Interface (API). The API is defined as a medium of communication between the UE 110 and the server 115. The APIs define data formats that applications can use to request and exchange information. When the data modelling request is received by the IM 230, the IM 230 transmits the data modelling request to the AI/ML module 225 to obtain at least one recommended data model.
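A minimal sketch of such a data modelling request submitted through an API is given below; the endpoint path, field names, and payload structure are assumptions made purely for illustration and are not defined by the present disclosure:

import json
import urllib.request

# Hypothetical payload describing the model the user wants to create
request_payload = {
    "model_name": "FiberLink",
    "description": "Physical fiber link between two sites",
    "hints": ["source_site", "destination_site", "length_km"],
}

# Hypothetical REST endpoint exposed by the external interface 220
req = urllib.request.Request(
    "http://im.example.com/api/data-models/recommend",
    data=json.dumps(request_payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(req)  # would return the recommended data model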
[0040] The AI/ML module 225 is embedded in, or communicably connected to, the IM 230. More specifically, the AI/ML module 225 is configured to generate the at least one recommended data model. In an embodiment, the generated at least one recommended data model includes at least one of attribute names, data types, or the constraints applied on the attributes at the time of data modelling. The AI/ML module 225 is trained on a plurality of data models that exist in a network inventory, an assets inventory, and an IP and logical inventory to predict the attribute names, the data types, or the constraints applied on the attributes at the time of data modelling in the IM 230, and it provides data modelling for nearly all the components existing in an End-to-End (E2E) network. The E2E network refers to a complete and integrated network that spans the entire communication path from a source to a destination. The AI/ML module 225 is trained on the plurality of data models from all kinds of network inventory to predict the model or table structure, such as attributes, data types, or any constraints associated with the attributes.
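Purely as an example, a recommended data model returned by the AI/ML module 225 could take a form such as the following; the attribute names, data types, and constraints shown are hypothetical:

recommended_model = {
    "model_name": "FiberLink",
    "attributes": [
        {"name": "link_id",          "data_type": "string", "constraints": ["not_null", "unique"]},
        {"name": "source_site",      "data_type": "string", "constraints": ["not_null"]},
        {"name": "destination_site", "data_type": "string", "constraints": ["not_null"]},
        {"name": "length_km",        "data_type": "float",  "constraints": ["min:0"]},
        {"name": "commissioned_on",  "data_type": "date",   "constraints": []},
    ],
}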
[0041] In an embodiment, the AI/ML module 225 is simultaneously trained by receiving a new data modelling request from various other users within the network and generating the at least one recommended data model based upon the requirements of the data modelling request fed into the system 125 by another user. The at least one recommended data model is forwarded to the user by the AI/ML module 225. The user utilizes the at least one recommended data model or discards the at least one recommended data model given by the AI/ML module 225 in real time. The user may also add attributes to a new model beyond the at least one recommended data model provided. The trained AI/ML module 225 is utilized to predict the structure of a new model. The trained AI/ML module 225 is configured to predict the attributes, the data types, and the constraints for new models in real time. The AI/ML module 225 utilizes a variety of ML techniques, such as supervised learning, unsupervised learning, generative learning, and reinforcement learning.
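One possible way to picture such incremental training as new requests arrive is the sketch below, which updates a data-type predictor a batch at a time; the use of scikit-learn, the feature choices, and the label set are assumptions for illustration only:

from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(analyzer="char_wb", ngram_range=(2, 4), n_features=2**12)
classifier = SGDClassifier()
known_types = ["string", "integer", "float", "date"]

def learn_from_new_request(attribute_names, observed_types):
    # Incrementally update the predictor from attributes seen in a new modelling request
    X = vectorizer.transform(attribute_names)
    classifier.partial_fit(X, observed_types, classes=known_types)

def predict_types(attribute_names):
    # Predict data types for attributes of a new model (call after at least one update)
    return classifier.predict(vectorizer.transform(attribute_names))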
[0042] In one embodiment, supervised learning is a type of machine learning in which an algorithm is trained on a labeled dataset, that is, each training example is paired with an output label. The supervised learning algorithm learns to map inputs to the correct output. In one embodiment, unsupervised learning is a type of machine learning in which a model is trained on data without any labels. The unsupervised learning algorithm tries to learn the underlying structure or distribution in the data in order to discover patterns or groupings. In one embodiment, reinforcement learning is a type of machine learning where an agent learns to make decisions by performing actions in an environment to maximize cumulative reward. The agent receives feedback in the form of rewards or penalties based on the actions it takes, and it learns a policy that maps states of the environment to the best actions.
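As a concrete, simplified illustration of the supervised case, the sketch below trains a classifier to predict an attribute's data type from its name; the training examples and the choice of scikit-learn components are assumptions for illustration, not a prescribed implementation:

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Labelled examples drawn from existing inventory data models (hypothetical)
attribute_names = ["serial_number", "ip_address", "port_count", "install_date",
                   "vlan_id", "site_name", "commissioned_on", "capacity_gbps"]
data_types      = ["string", "string", "integer", "date",
                   "integer", "string", "date", "float"]

# Character n-grams capture naming patterns such as "_date" or "_count"
model = make_pipeline(
    CountVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(attribute_names, data_types)

print(model.predict(["decommission_date", "fiber_count"]))  # predicted data types for unseen attribute names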
[0043] The IM 230 is further configured to receive an approval on the at least one recommended data model, or an amended data model. In an embodiment, the at least one recommended data model receives a change request from the user to obtain the amended data model. The user drafts the new model based upon the at least one recommended data model and sends a request to create the data model to the IM 230. Finally, the IM 230 creates the new data model with the at least one recommended data model. By doing so, the system 125 reduces the time and manual effort required for creating data models in the IM 230, thereby reducing processing requirements and enhancing the overall user experience.
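The approval and amendment handling can be pictured with the following minimal sketch; the function names and the shape of the change request are hypothetical and serve only to illustrate the behaviour described above:

def apply_change_request(recommended_model: dict, change_request: dict) -> dict:
    # Return an amended copy of the recommended model based on the user's changes
    amended = {**recommended_model, "attributes": list(recommended_model["attributes"])}
    amended["attributes"].extend(change_request.get("add_attributes", []))
    removed = set(change_request.get("remove_attributes", []))
    amended["attributes"] = [a for a in amended["attributes"] if a["name"] not in removed]
    return amended

def handle_user_response(recommended_model: dict, response: dict) -> dict:
    # Create the new data model from either the approved or the amended model
    if response.get("approved"):
        return recommended_model
    return apply_change_request(recommended_model, response["change_request"])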
[0044] Referring to FIG. 3, FIG. 3 describes the system 125 for facilitating data modelling in the IM 230, according to the one or more embodiments of the present disclosure. It is to be noted that the embodiment with respect to FIG. 3 will be explained with respect to the first UE 110a for the purpose of description and illustration, and should nowhere be construed as limiting the scope of the present disclosure.
[0045] As mentioned earlier in FIG.1, each of the first UE 110a, the second UE 110b, and the third UE 110c may include an external storage device, a bus, a main memory, a read-only memory, a mass storage device, communication port(s), and a processor. The exemplary embodiment as illustrated in the FIG. 3 will be explained with respect to the first UE 110a. The first UE 110a includes one or more primary processors 305 communicably coupled to the one or more processors 205 of the system 125.
[0046] The one or more primary processors 305 are coupled with a memory 310 storing instructions which are executed by the one or more primary processors 305. Execution of the stored instructions by the one or more primary processors 305 enables the first UE 110a to transmit the data modelling request to the IM 230 from the user.
[0047] As mentioned earlier in FIG. 2, the one or more processors 205 of the system 125 are configured to receive the data modelling request, forward the data modelling request to the AI/ML module 225, receive the at least one recommended data model from the AI/ML module 225, forward the at least one recommended data model to the user, and receive the approval on the at least one recommended data model, or the amended data model.
[0048] As per the illustrated embodiment, the system 125 includes the one or more processors 205, and the memory 210. The operations and functions of the one or more processors 205, and the memory 210 are already explained in FIG. 2. For the sake of brevity, a similar description related to the working and operation of the system 125 as illustrated in FIG. 2 has been omitted to avoid repetition.
[0049] Further, the processor 205 includes the external interface 220, the AI/ML module 225, and the IM 230. The operations and functions of the external interface 220, the AI/ML module 225, and the IM 230 are already explained in FIG. 2. Hence, for the sake of brevity, a similar description related to the working and operation of the system 125 as illustrated in FIG. 2 has been omitted to avoid repetition. The limited description provided for the system 125 in FIG. 3 should be read with the description as provided for the system 125 in FIG. 2 above, and should not be construed as limiting the scope of the present disclosure.
[0050] FIG. 4 is a signal flow diagram illustrating facilitating the data modelling in the IM 230, according to the one or more embodiments of the present disclosure.
[0051] At step 402, the external interface 220 transmits the data modelling request to the IM 230 from the user. When the data modelling request is received by the IM 230 via the external interface 220, the IM 230 transmits the data modelling request to the AI/ML module 225 to obtain at least one recommended data model.
[0052] At step 404, the IM 230 forwards the data modelling request to the AI/ML module 225 to obtain the at least one recommended data model. The AI/ML module 225 is embedded or communicably connected to the IM 230. The at least one recommended data model is generated from the AI/ML module 225.
[0053] At step 406, the AI/ML module 225 forwards the generated at least one recommended data model to the user. The user utilizes the at least one recommended data model or discards the at least one recommended data model given by the AI/ML module 225 in real time. The user may also add attributes to a new model beyond the at least one recommended data model provided.
[0054] At step 408, the IM 230 receives an approval on the at least one recommended data model, or an amended data model, from the user. In an embodiment, the at least one recommended data model receives the change request from the user to obtain the amended data model. The user finally drafts the new model based upon the at least one recommended data model and transmits a request to create the data model to the IM 230. Finally, the IM 230 creates the new data model utilizing the at least one recommended data model.
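The four steps of the signal flow can be summarised in the sketch below; the callables passed in are hypothetical stand-ins for the behaviour of the external interface 220, the AI/ML module 225, and the IM 230:

def facilitate_data_modelling(data_modelling_request, aiml_recommend, forward_to_user, create_model):
    # Steps 402 and 404: the IM receives the request and forwards it to the AI/ML module
    recommended_model = aiml_recommend(data_modelling_request)
    # Step 406: the recommended data model is forwarded to the user, who approves or amends it
    user_response = forward_to_user(recommended_model)
    # Step 408: the IM receives the approval or the amended model and creates the new data model
    final_model = user_response.get("amended_model", recommended_model)
    return create_model(final_model)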
[0055] FIG. 5 is a flow diagram of a method 500 for facilitating data modelling in the IM 230, according to the one or more embodiments of the present disclosure. The method 500 is adapted to facilitate data modelling in the IM 230. For the purpose of description, the method 500 is described with the embodiments as illustrated in FIG. 2 and should nowhere be construed as limiting the scope of the present disclosure.
[0056] At step 505, the method 500 includes the step of receiving a data modelling request from the user to the IM 230 via an external interface 220. In an embodiment, the external interface 220 includes, but is not limited to, a Graphical User Interface (GUI), a web user interface, a Command Line Interface (CLI), or a third-party GUI using an Application Programming Interface (API). When the data modelling request is received by the IM 230, the IM 230 transmits the data modelling request to the AI/ML module 225 to obtain at least one recommended data model.
[0057] At step 510, the method 500 includes the step of forwarding the data modelling request to the AI/ML module 225 to obtain the at least one recommended data model. The AI/ML module 225 is embedded or communicably connected to the IM 230. The AI/ML module 225 is trained on the plurality of data models from all kinds of the network inventory to predict the model or table structure like attributes, or data types, or any constraints associated with the attributes.
[0058] At step 515, the method 500 includes the step of receiving the at least one recommended data model from the AI/ML module 225. In an embodiment, the received at least one recommended data model includes at least one of attribute names, data types, or the constraints applied on the attributes at the time of data modelling. The AI/ML module 225 is trained on a plurality of data models that exist in a network inventory, an assets inventory, and an IP and logical inventory to predict the attribute names, the data types, or the constraints applied on the attributes at the time of data modelling in the IM 230, and it provides data modelling for nearly all the components existing in an End-to-End (E2E) network.
[0059] In an embodiment, the AI/ML module 225 is simultaneously trained by receiving a new data modelling request by various other users within the network and generating the at least one recommended data model based upon the requirements of the data modelling request which is fed into the system 125 by another user.
[0060] At step 520, the method 500 includes the step of forwarding the at least one recommended data model to the user by the AI/ML module 225. The user utilizes the at least one recommended data model or discards the at least one recommended data model given by the AI/ML module 225 in real time. The user may also add attributes to a new model beyond the at least one recommended data model provided.
[0061] At step 525, the method 500 includes the step of receiving an approval on the at least one recommended data model, or an amended data model, by the IM 230. In an embodiment, the at least one recommended data model receives the change request from the user to obtain the amended data model. The user finally drafts the new model based upon the at least one recommended data model and sends a request to create the data model to the IM 230. Finally, the IM 230 creates the new data model with the at least one recommended data model. By doing so, the method 500 reduces the time and manual effort required for creating data models in the IM 230, thereby reducing processing requirements and enhancing the overall user experience.
[0062] The present invention further discloses a non-transitory computer-readable medium having stored thereon computer-readable instructions. The computer-readable instructions are executed by a processor 205. The processor 205 is configured to receive a data modelling request by an IM 230 from a user via an external interface 220. The processor 205 is further configured to forward the data modelling request to an AI/ML module 225 to obtain at least one recommended data model. The processor 205 is further configured to receive the at least one recommended data model from the AI/ML module 225. The processor 205 is further configured to forward the at least one recommended data model to the user. The processor 205 is further configured to receive an approval on the at least one recommended data model, or an amended data model.
[0063] A person of ordinary skill in the art will readily ascertain that the illustrated embodiments and steps in description and drawings (FIG.1-5) are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0064] The present disclosure incorporates the technical advancement of efficiently creating data models by reducing the time and manual effort required. The disclosure also provides an enhanced end-user experience based on the at least one recommended data model, since, with much less time and effort, the user can model a large amount of data accurately and with fewer mistakes. Furthermore, the disclosure also makes the experience more interactive.
[0065] The present disclosure incorporates the advantage of training on data models from all kinds of network inventory and predicting the at least one recommended model or table structure, such as attributes, data types, and any constraints that are placed on the attributes. As a result, the at least one recommended model can reduce the time and effort of the end user, thereby enhancing the overall user experience.
[0066] The present invention offers multiple advantages over the prior art, and the above are a few examples that emphasize some of the advantageous features. The listed advantages are to be read in a non-limiting manner.

REFERENCE NUMERALS
[0067] Environment – 100;
[0068] Network – 105;
[0069] User Equipment – 110;
[0070] Server – 115;
[0071] System – 125;
[0072] Processor – 205;
[0073] Memory – 210;
[0074] External Interface – 220;
[0075] AI/ML Module – 225;
[0076] IM – 230;
[0077] Database – 240;
[0078] Primary Processor – 305;
[0079] Memory Unit – 310.

CLAIMS
We Claim:
1. A method (500) for facilitating data modelling in an Inventory Management (IM) (230), the method (500) comprising the steps of:
receiving (505), by one or more processors (205), a data modelling request to the IM (230) from a user via an external interface (220);
forwarding (510), by the one or more processors (205), the data modelling request to an Artificial Intelligence/ Machine Learning (AI/ML) module (225) to obtain at least one recommended data model;
receiving (515), by the one or more processors (205), the at least one recommended data model from the AI/ML module (225);
forwarding (520), by the one or more processors (205), the at least one recommended data model to the user; and
receiving (525), by the one or more processors (205), an approval on the at least one recommended data model, or an amended data model.

2. The method (500) as claimed in claim 1, comprises receiving, by the one or more processors (205), a change request to the at least one recommended data model from the user.

3. The method (500) as claimed in claim 2, comprises amending, by the one or more processors (205), the at least one recommended data model based on the change request received from the user to obtain the amended model.

4. The method (500) as claimed in claim 2, comprises training, by the one or more processors (205), the AI/ML module (225) embedded in the IM (230), or the AI/ML module (225), on data models from all kinds of network inventory to predict a model or table structure like attributes, or data types, or any constraints associated with the attributes.
5. The method (500) as claimed in claim 1, wherein receiving comprises suggesting, by the one or more processors (205), at least one of the model or table structures like attribute names, or the data types, or the constraints applied on the attribute at the time of data modelling in the IM (230).

6. A system (125) for facilitating data modelling in an Inventory Management (IM) (230), the system (125) comprising:
an external interface (220), configured to send a data modelling request to the IM (230) by a user; and
an AI/ML module (225) embedded in, or communicably connected to, the IM (230), wherein the AI/ML module (225) is configured to generate at least one recommended data model;
wherein the at least one recommended data model is forwarded to the user;
wherein the IM (230), is further configured to receive an approval on the at least one recommended data model, or an amended data model.

7. The system (125) as claimed in claim 6, wherein the at least one recommended data model is configured to receive a change request from the user.

8. The system (125) as claimed in claim 6, wherein the at least one recommended data model is amended based on the change request received from the user to obtain the amended model.

9. The system (125) as claimed in claim 6, wherein the AI/ML model embedded in the IM (230), or the AI/ML module (225), is trained on a data model from all kinds of network inventory to predict a model/table structure like attributes, or data types, or any constraints associated with the attributes.

10. The system (125) as claimed in claim 6, wherein the generated at least one recommended model comprises at least one of attribute names, or the data types, or the constraints applied on the attribute at the time of data modelling.

11. A User Equipment (UE) (110) comprising:
one or more primary processors (305) communicatively coupled to one or more processors (205), the one or more primary processors (305) coupled with a memory (310), wherein said memory (310) stores instructions which when executed by the one or more primary processors (305) causes the UE (110) to:
send a data modelling request to an Inventory Management (230) from a user via an external interface (220), wherein the one or more processors (205) is configured to perform the steps as claimed in claim 1.

Documents

Application Documents

# Name Date
1 202321047038-STATEMENT OF UNDERTAKING (FORM 3) [12-07-2023(online)].pdf 2023-07-12
2 202321047038-PROVISIONAL SPECIFICATION [12-07-2023(online)].pdf 2023-07-12
3 202321047038-FORM 1 [12-07-2023(online)].pdf 2023-07-12
4 202321047038-FIGURE OF ABSTRACT [12-07-2023(online)].pdf 2023-07-12
5 202321047038-DRAWINGS [12-07-2023(online)].pdf 2023-07-12
6 202321047038-DECLARATION OF INVENTORSHIP (FORM 5) [12-07-2023(online)].pdf 2023-07-12
7 202321047038-FORM-26 [20-09-2023(online)].pdf 2023-09-20
8 202321047038-Proof of Right [08-01-2024(online)].pdf 2024-01-08
9 202321047038-DRAWING [02-07-2024(online)].pdf 2024-07-02
10 202321047038-COMPLETE SPECIFICATION [02-07-2024(online)].pdf 2024-07-02
11 Abstract-1.jpg 2024-08-05
12 202321047038-Power of Attorney [05-11-2024(online)].pdf 2024-11-05
13 202321047038-Form 1 (Submitted on date of filing) [05-11-2024(online)].pdf 2024-11-05
14 202321047038-Covering Letter [05-11-2024(online)].pdf 2024-11-05
15 202321047038-CERTIFIED COPIES TRANSMISSION TO IB [05-11-2024(online)].pdf 2024-11-05
16 202321047038-FORM 3 [28-11-2024(online)].pdf 2024-11-28
17 202321047038-FORM 18 [20-03-2025(online)].pdf 2025-03-20