Abstract: The present disclosure provides a system and method for dynamically ranking attributes and generating similarity measures for various entities. The system dynamically ranks premium features using a weighted distribution over the dataset and automatically derives the appropriate similarity function. Further, the system uses knowledge-aware artificial intelligence (AI) for attribute prediction, whereby complex deep learning models are converted into knowledge-aware AI models. Since only the important features are extracted while training the knowledge-aware AI models, the required computation power is reduced.
RESERVATION OF RIGHTS
[0001] A portion of the disclosure of this patent document contains material which is subject to intellectual property rights such as, but not limited to, copyright, design, trademark, integrated circuit (IC) layout design, and/or trade dress protection, belonging to Jio Platforms Limited (JPL) or its affiliates (hereinafter referred to as the owner). The owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all rights whatsoever. All rights to such intellectual property are fully reserved by the owner.
FIELD OF INVENTION
[0002] The embodiments of the present disclosure generally relate to systems and methods for generating recommendations for users based on machine learning techniques. More particularly, the present disclosure relates to a system and a method for dynamically predicting, ranking attributes, and generating similarity measures.
BACKGROUND OF INVENTION
[0003] The following description of the related art is intended to provide background information pertaining to the field of the disclosure. This section may include certain aspects of the art that may be related to various features of the present disclosure. However, it should be appreciated that this section is used only to enhance the understanding of the reader with respect to the present disclosure, and not as admissions of the prior art.
[0004] Similarity is a basic building block for activities such as recommendation engines, clustering, and various classification problems in machine learning. A recommendation system is one of the most valuable approaches for offering personalized services to a user. Recommendation systems are designed to recommend things to the user based on many different factors, and some attributes are more significant in a given application than others. Different types of similarity measures produce different outcomes in the artificial intelligence (AI) domain, especially in retail. Current systems and methods are unable to identify and deliver user preferences accurately.
[0005] There is, therefore, a need in the art to provide a system and a method that can mitigate the problems associated with the prior arts.
OBJECTS OF THE INVENTION
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are listed herein below.
[0007] It is an object of the present disclosure to provide a system and a method for automatically suggesting product features to a user and their prior distribution for a domain, and for identifying the right type of similarity function.
[0008] It is an object of the present disclosure to provide a system and a method that utilizes an attribute value weighting technique to build clusters with stronger intra-similarities to provide a better clustering performance.
[0009] It is an object of the present disclosure to provide a system and a method to identify and rank attributes automatically that will help in finding the similarity function accurately.
[0010] It is an object of the present disclosure to provide a system and a method that dynamically ranks premium features with a weighted distribution in the dataset and automatically derives the appropriate similarity function. This helps in recommending the product that the user is most likely to buy, thereby increasing the conversion rate and the overall profit.
[0011] It is an object of the present disclosure to provide a system and a method that uses a knowledge-aware artificial intelligence (AI) model for generating complex deep learning models and automatically suggesting product features to the user.
SUMMARY
[0012] This section is provided to introduce certain objects and aspects of the present disclosure in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter.
[0013] In an aspect, the present disclosure relates to a system for recommending one or more products. The system may include a processor and a memory operatively coupled to the processor that stores instructions to be executed by the processor. The processor may receive a data parameter from a computing device associated with a user. The computing device may be connected to the processor via a network. The data parameter may include one or more attributes associated with a plurality of entities. The processor may train the one or more attributes using a primary technique. The processor may rank the trained one or more attributes using a weighted distribution function for a particular domain. The processor may generate a similarity function mapping of the one or more attributes based on the ranking. The processor may generate, via an artificial intelligence (AI) engine, a trained model based on the similarity function mapping and provide, via the trained model, recommendations of the one or more products to the user.
[0014] In an embodiment, the processor may generate a mapping of the one or more attributes with the plurality of entities.
[0015] In an embodiment, the processor may normalize the mapping and generate an attribute distribution function to rank the trained one or more attributes.
[0016] In an embodiment, the primary technique may include at least one of an XGBoost model and a Random Forest model.
[0017] In an embodiment, the processor may use a Minkowski distance technique to generate the similarity mapping when the one or more attributes are numerical in nature.
[0018] In an embodiment, the processor may use a cosine similarity technique to generate the similarity mapping when the one or more attributes are textual in nature.
[0019] In an embodiment, the processor may remodel the trained model based on a variation in the one or more attributes.
[0020] In an embodiment, the processor may assess a performance of the trained model using a mean absolute error (MAE) and a root mean square error (RMSE).
[0021] In an aspect, the present disclosure relates to a method for recommending one or more products. The method may include receiving, by a processor associated with a system, a data parameter from a user. The data parameter may include one or more attributes associated with a plurality of entities. The method may include training, by the processor, the one or more attributes using a primary technique. The method may include ranking, by the processor, the trained one or more attributes using a weighted distribution function for a domain. The method may include generating, by the processor, a similarity function mapping of the one or more attributes based on the ranking. The method may include generating, by the processor, via an AI engine, a trained model based on the similarity function mapping, and providing, by the processor via the trained model, recommendations of the one or more products to the user.
[0022] In an embodiment, the method may include generating, by the processor, a mapping of the one or more attributes with the plurality of entities.
[0023] In an embodiment, the method may include normalizing, by the processor, the mapping and generating an attribute distribution function to rank the trained one or more attributes.
[0024] In an embodiment, the primary technique may include at least one of an XGBoost model and a Random Forest model.
[0025] In an embodiment, the method may include using, by the processor, a Minkowski distance technique to generate the similarity mapping when the one or more attributes are numerical in nature.
[0026] In an embodiment, the method may include using, by the processor, a cosine similarity technique to generate the similarity mapping when the one or more attributes are textual in nature.
[0027] In an embodiment, the method may include remodelling, by the processor, the trained model based on a variation in the one or more attributes.
[0028] In an embodiment, the method may include assessing, by the processor, a performance of the trained model using a MAE and a RMSE.
[0029] In an aspect, the present disclosure relates to a user equipment (UE) for receiving recommendations of one or more products. The UE may include one or more processors communicatively coupled to a processor associated with a system. The one or more processors may be coupled with a memory. The memory may store instructions to be executed by the one or more processors that may cause the one or more processors to transmit a data parameter to the processor via a network, and receive recommendations from the processor. The processor may be configured to receive the data parameter from the UE. The data parameter may include one or more attributes associated with a plurality of entities. The processor may train the one or more attributes using a primary technique. The processor may rank the trained one or more attributes using a weighted distribution function for a particular domain. The processor may generate a similarity function mapping of the one or more attributes based on the ranking. The processor may generate, via an AI engine, a trained model based on the similarity function mapping and provide, via the trained model, recommendations of the one or more products to a user associated with the UE.
BRIEF DESCRIPTION OF DRAWINGS
[0030] The accompanying drawings, which are incorporated herein and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems, in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes the disclosure of electrical components, electronic components, or circuitry commonly used to implement such components.
[0031] FIG. 1 illustrates an exemplary network architecture (100) for implementing a proposed system (108), in accordance with an embodiment of the present disclosure.
[0032] FIG. 2 illustrates an exemplary block diagram (200) of a proposed system (108), in accordance with an embodiment of the present disclosure.
[0033] FIG. 3 illustrates an exemplary flow chart of a method (300) for dynamically predicting, ranking attributes, and generating similarity measures, in accordance with an embodiment of the present disclosure.
[0034] FIG. 4 illustrates an exemplary computer system (400) in which or with which the embodiments of the present disclosure may be implemented.
[0035] The foregoing shall be more apparent from the following more detailed description of the disclosure.
DETAILED DESCRIPTION
[0036] In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address all of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.
[0037] The ensuing description provides exemplary embodiments only and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the disclosure as set forth.
[0038] Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail to avoid obscuring the embodiments.
[0039] Also, it is noted that individual embodiments may be described as a process that is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
[0040] The word “exemplary” and/or “demonstrative” is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” and/or “demonstrative” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
[0041] Reference throughout this specification to “one embodiment” or “an embodiment” or “an instance” or “one instance” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0042] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0043] The various embodiments throughout the disclosure will be explained in more detail with reference to FIGs. 1-4.
[0044] FIG. 1 illustrates an exemplary network architecture (100) for implementing a proposed system (108), in accordance with an embodiment of the present disclosure.
[0045] As illustrated in FIG. 1, the network architecture (100) may include a system (108). The system (108) may be connected to one or more computing devices (104-1, 104-2…104-N) via a network (106). The one or more computing devices (104-1, 104-2…104-N) may be interchangeably specified as a user equipment (UE) (104) and may be operated by one or more users (102-1, 102-2…102-N). Further, the one or more users (102-1, 102-2…102-N) may be interchangeably referred to as a user (102) or users (102). The system (108) may include an artificial intelligence (AI) engine (110) for generating a trained model based on a data parameter provided by the user (102).
[0046] In an embodiment, the computing devices (104) may include, but not be limited to, a mobile, a laptop, etc. Further, the computing devices (104) may include a smartphone, virtual reality (VR) devices, augmented reality (AR) devices, a general-purpose computer, desktop, personal digital assistant, tablet computer, and a mainframe computer. Additionally, input devices for receiving input from the user (102) such as a touch pad, touch-enabled screen, electronic pen, and the like may be used. A person of ordinary skill in the art will appreciate that the computing devices (104) may not be restricted to the mentioned devices and various other devices may be used.
[0047] In an embodiment, the network (106) may include, by way of example but not limitation, at least a portion of one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, process, or a combination thereof, etc. one or more messages, packets, signals, waves, voltage or current levels, some combination thereof, or so forth. The network (106) may also include, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, or some combination thereof.
[0048] In an embodiment, the system (108) may receive the data parameter from the user (102). The data parameter may include one or more attributes associated with a plurality of entities. The system (108) may train the one or more attributes using a primary technique. The primary technique may include, but is not limited to, an XGBoost model and a Random Forest model.
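By way of illustration only, a minimal Python sketch of such a primary technique is given below, assuming scikit-learn is available; the dataset, column names, and target variable are illustrative assumptions and do not form part of the disclosure.

```python
# Minimal sketch of the "primary technique": derive attribute importance
# with a random forest (an XGBoost model could be used analogously).
# Dataset and column names are hypothetical, not from the disclosure.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical entity-attribute table: one row per person entity,
# with an observed interaction score as the training target.
data = pd.DataFrame({
    "age":    [25, 32, 75, 41, 19],
    "height": [180, 165, 170, 175, 160],
    "weight": [75, 60, 68, 80, 55],
    "salary": [50_000, 72_000, 30_000, 90_000, 20_000],
    "score":  [0.9, 0.7, 0.1, 0.8, 0.6],
})
X, y = data.drop(columns="score"), data["score"]

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Impurity-based importances act as the raw attribute-ranking signal.
importance = pd.Series(model.feature_importances_, index=X.columns)
print(importance.sort_values(ascending=False))
```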
[0049] In an embodiment, the system (108) may generate a mapping of the one or more attributes with the plurality of entities. Further, the system (108) may normalize the mapping and generate an attribute distribution function to rank the trained one or more attributes.
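By way of illustration only, the following minimal Python sketch normalizes raw attribute importances into a per-domain attribute distribution function whose weights sum to one; the input values are illustrative assumptions.

```python
# Sketch: normalize raw importances into an attribute distribution
# function for a domain (weights sum to 1), in the spirit of Tables 1-4.
def attribute_distribution(importance: dict[str, float]) -> dict[str, float]:
    total = sum(importance.values())
    if total == 0:
        raise ValueError("no attribute carries any importance")
    return {attr: round(val / total, 2) for attr, val in importance.items()}

# Hypothetical raw importances for the movie domain.
movie_domain = attribute_distribution({"age": 7.0, "location": 3.0, "name": 1.0})
print(movie_domain)  # {'age': 0.64, 'location': 0.27, 'name': 0.09}
```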
[0050] In an embodiment, the system (108) may rank the trained one or more attributes using a weighted distribution function for a particular domain. The system (108) may use a Minkowski distance technique to generate the similarity mapping when the one or more attributes are numerical in nature. Further, the system (108) may use a cosine similarity technique to generate the similarity mapping when the one or more attributes are textual in nature.
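By way of illustration only, a minimal Python sketch of this type-driven selection is given below, assuming scikit-learn is available; the one-dimensional distance-to-similarity mapping and the TF-IDF vectorization of textual attributes are illustrative assumptions.

```python
# Sketch: pick the similarity technique from the attribute's type --
# Minkowski distance for numerical attributes, cosine similarity for
# textual ones. Helper logic is hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def similarity(a, b, p: int = 2) -> float:
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        # One-dimensional Minkowski distance, mapped to a similarity.
        return 1.0 / (1.0 + abs(a - b) ** p) ** (1.0 / p)
    # Textual attributes: TF-IDF vectors compared by cosine similarity.
    vecs = TfidfVectorizer().fit_transform([str(a), str(b)])
    return float(cosine_similarity(vecs[0], vecs[1])[0, 0])

print(similarity(180, 175))                  # numerical: Minkowski-based
print(similarity("horror movie", "horror"))  # textual: cosine-based
```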
[0051] In an embodiment, the system (108) may generate a similarity function mapping of the one or more attributes based on the ranking. The system (108) may generate, via the AI engine (110), a trained model based on the similarity function mapping and recommend the one or more products to the user (102). Further, the system (108) may remodel the trained model based on a variation in the one or more attributes. The system (108) may assess a performance of the trained model using a mean absolute error (MAE) and a root mean square error (RMSE).
[0052] In an embodiment, the system (108) may use dynamic attribute prediction and ranking for generating the similarity function mapping. A weight function may be applied to each of the one or more attributes based on the importance in the given context and similarity function mapping may be calculated.
[0053] In an embodiment, the system (108) may optimize the processing time of the trained model such that it runs with minimum computation power. Further, the system (108) may use causal modelling to uncover the causes and effects of different phenomena in complex systems. Further, the system (108) may change the attribute distribution function and remodel the trained model to generate updated results.
[0054] In an exemplary embodiment, the system (108) may calculate a person entity similarity in a movie domain and a grocery domain. For example, Adam and Eve may be friends, and their watched-movie history may show that both love ‘horror’ movies. If Eve watches a new horror movie, it may be highly recommended to Adam as well because both are similar users in the movie domain. In this case, ‘age’ may be an important attribute in calculating person-to-person similarity and in recommending horror movies. By contrast, the same horror movie may be an irrelevant recommendation, in the movie recommendation domain, for an otherwise similar person who is 75 years old.
[0055] In an exemplary embodiment, two people may want to purchase groceries. For example, Adam lives in a Metro area while Eve lives in a village. In this domain, ‘age’ may not be an important attribute, whereas ‘location’ may play an important role. As shown in Table 1, the system (108) may calculate the person similarity attributes based on the movie domain and the grocery domain.
| Person Entity Similarity Attributes | Movie Domain | Grocery Domain |
|---|---|---|
| Age | 0.70 | 0.10 |
| Location | 0.30 | 0.60 |
| Name | 0.10 | 0.10 |

Table 1
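By way of illustration only, the following minimal Python sketch applies the domain-specific weights of Table 1 to compute a person-to-person similarity; the per-attribute similarity functions and the person records are illustrative assumptions.

```python
# Sketch: person-to-person similarity with domain-specific attribute
# weights taken from Table 1. Per-attribute comparisons are hypothetical.
WEIGHTS = {
    "movie":   {"age": 0.70, "location": 0.30, "name": 0.10},
    "grocery": {"age": 0.10, "location": 0.60, "name": 0.10},
}

def person_similarity(p1: dict, p2: dict, domain: str) -> float:
    w = WEIGHTS[domain]
    # Age similarity decays with the gap; location/name compare exactly.
    sims = {
        "age": max(0.0, 1.0 - abs(p1["age"] - p2["age"]) / 50),
        "location": 1.0 if p1["location"] == p2["location"] else 0.0,
        "name": 1.0 if p1["name"] == p2["name"] else 0.0,
    }
    return sum(w[a] * sims[a] for a in w) / sum(w.values())

adam = {"age": 28, "location": "metro", "name": "Adam"}
eve  = {"age": 30, "location": "village", "name": "Eve"}
print(person_similarity(adam, eve, "movie"))    # age dominates: high
print(person_similarity(adam, eve, "grocery"))  # location dominates: low
```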
[0056] In an exemplary embodiment, the system (108) may calculate a person entity similarity in a sports domain and a banking domain. For example, Adam and Eve may be two people who are interested in sports. Adam may be a national-level Basketball player, while Eve may be a national-level Weightlifting champion. These two person profiles may be similar because both have a good profile in sports. Based on the sports training data set of these two people, it may be seen that the ‘height’ and ‘age’ attributes rank higher in Basketball, while the ‘weight’ and ‘age’ attributes rank higher in Weightlifting. Now, considering a situation where both have to fly internationally for the Olympic Games, both may be required to apply for an international credit card for shopping. In the banking domain, ‘height’ and ‘weight’ may carry less weight, while a ‘Credit Information Bureau India Limited (CIBIL) score’ and ‘salary’ may play an important role. Based on the application of the similarity function, results may be generated as shown in Table 2.
| Person Entity Similarity Attributes | Sports Domain (Basketball) | Sports Domain (Weightlifting) | Banking Domain |
|---|---|---|---|
| Height | 0.60 | 0.20 | 0 |
| Weight | 0.20 | 0.60 | 0 |
| Age | 0.15 | 0.15 | 0.15 |
| CIBIL Score | 0 | 0 | 0.70 |
| Salary | 0.05 | 0.05 | 0.15 |

Table 2
[0057] In an exemplary embodiment, the system (108) may calculate a person-brand entity similarity in a fashion domain. As discussed earlier, Adam may be a sports player and may want to order shoes for his Olympic tournament. From this information, a brand’s ‘name’ may be one of the most important attributes from a person’s point of view and may carry the maximum weight. The system (108) may recommend brands based on his preference. If Adam loves ‘Brand A shoes’ more than other brands, then brand affinity may also add some weightage in the recommendation. In the fashion domain, the brands may be categorized into premium sports shoe brands. For calculating brand similarity, the ‘category,’ ‘price,’ and ‘collection’ attributes may rank higher than attributes like ‘headquarter’ or ‘revenue,’ as shown in Table 3.
| Brand Similarity Attributes | Person View | Brand View |
|---|---|---|
| Name | 0.50 | 0 |
| Category | 0.20 | 0.30 |
| Price | 0.15 | 0.30 |
| Collection | 0.15 | 0.40 |
| Headquarter | 0 | 0 |
| Revenue | 0 | 0 |

Table 3
[0058] In an exemplary embodiment, the system (108) may calculate a product entity similarity in a grocery domain and a cooking domain. For example, Eve may love to cook food at home. From Eve’s past order history, the system (108) may note that when Eve buys ‘sugar,’ she also buys ‘salt.’ Hence, the weight-wise distribution may be higher between these two products than between other products sold at the grocery store, so these products may be similar in the grocery shopping domain from Eve’s perspective. While cooking, however, salt and sugar may not be substituted for each other, as they are highly dissimilar in a cooking context; both carry weights that depend on the recipe. Hence, for a dessert recipe, more weightage may be given to ‘sugar’ and zero weightage to ‘salt,’ as shown in Table 4.
| Product Entity Similarity Attributes | Grocery Shopping Domain | Cooking Domain (Dessert Recipe) |
|---|---|---|
| Sugar | 0.50 | 1.00 |
| Salt | 0.50 | 0 |

Table 4
[0059] Further, in an embodiment, the system (108) may use causality modelling to uncover the causes and effects of different phenomena in complex systems and build better solutions in diverse areas. Causality modelling may allow the system (108) to identify objects regardless of subtle changes. As discussed earlier, ‘age’ may be an important attribute in the sports domain. Going deeper, various sports focus on different attributes along with ‘age’: Basketball focuses on ‘height,’ whereas Weightlifting focuses on ‘weight.’ Further, ‘age’ may be a ratio-scaled attribute; as the ‘age’ factor increases, Adam and Eve may have to retire from playing Olympic Games, which may ultimately affect their shopping habits, product recommendations, and the like.
[0060] Although FIG. 1 shows exemplary components of the network architecture (100), in other embodiments, the network architecture (100) may include fewer components, different components, differently arranged components, or additional functional components than depicted in FIG. 1. Additionally, or alternatively, one or more components of the network architecture (100) may perform functions described as being performed by one or more other components of the network architecture (100).
[0061] FIG. 2 illustrates an exemplary block diagram (200) of a proposed system (108), in accordance with an embodiment of the present disclosure.
[0062] Referring to FIG. 2, the system (108) may comprise one or more processor(s) (202) that may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that process data based on operational instructions. Among other capabilities, the one or more processor(s) (202) may be configured to fetch and execute computer-readable instructions stored in a memory (204) of the system (108). The memory (204) may be configured to store one or more computer-readable instructions or routines in a non-transitory computer readable storage medium, which may be fetched and executed to create or share data packets over a network service. The memory (204) may comprise any non-transitory storage device including, for example, volatile memory such as random-access memory (RAM), or non-volatile memory such as erasable programmable read only memory (EPROM), flash memory, and the like.
[0063] In an embodiment, the system (108) may include an interface(s) (206). The interface(s) (206) may comprise a variety of interfaces, for example, interfaces for data input and output (I/O) devices, storage devices, and the like. The interface(s) (206) may also provide a communication pathway for one or more components of the system (108). Examples of such components include, but are not limited to, processing engine(s) (208) and a database (210), where the processing engine(s) (208) may include, but not be limited to, a data parameter engine (212), an AI engine (214), and a ranking engine (216). A person with ordinary skill in the art may understand that the AI engine (214) may be similar to the AI engine (110) of FIG. 1 in its functionality.
[0064] In an embodiment, the processing engine(s) (208) may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) (208). In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) (208) may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) (208) may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) (208). In such examples, the system (108) may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the system (108) and the processing resource. In other examples, the processing engine(s) (208) may be implemented by electronic circuitry.
[0065] In an embodiment, the processor (202) may receive a data parameter from a user (102) via the data parameter engine (212). The user (102) may provide the data parameter through the computing device (104). The data parameter may include one or more attributes associated with a plurality of entities. The processor (202) may train the one or more attributes using a primary technique. The primary technique may include, but is not limited to, an XGBoost model and a Random Forest model.
[0066] In an embodiment, the processor (202) may rank the trained one or more attributes using a weighted distribution function for a particular domain using the ranking engine (216). The processor (202) may use a Minkowski distance technique to generate the similarity mapping when the one or more attributes are numerical in nature. Further, the processor (202) may use a cosine similarity technique to generate the similarity mapping when the one or more attributes are textual in nature.
[0067] In an embodiment, the processor (202) may generate a similarity function mapping of the one or more attributes based on the ranking. The processor (202) may generate, via the AI engine (214), a trained model based on the similarity function mapping and recommend the one or more products to the user (102) via the trained model. Further, the processor (202) may remodel the trained model based on a variation in the one or more attributes. The processor (202) may assess a performance of the trained model using MAE and RMSE.
[0068] In an embodiment, the processor (202) may use dynamic attribute prediction and ranking for generating the similarity function mapping. A weight function may be applied to each of the one or more attributes based on the importance in the given context and similarity function mapping may be calculated.
[0069] Although FIG. 2 shows exemplary components of the system (108), in other embodiments, the system (108) may include fewer components, different components, differently arranged components, or additional functional components than depicted in FIG. 2. Additionally, or alternatively, one or more components of the system (108) may perform functions described as being performed by one or more other components of the system (108).
[0070] FIG. 3 illustrates an exemplary flow chart of a method (300) for dynamically predicting, ranking attributes, and generating similarity measures, in accordance with an embodiment of the present disclosure.
[0071] As illustrated in FIG. 3, in an embodiment, input data (302) may be provided to a system (e.g., 108). The system (108) may include a data ingestion and pre-processing stage. The data ingestion and pre-processing stage may include an entity attribute mapping module (304) to generate a mapping of the one or more attributes with the plurality of entities. Further, the data ingestion and pre-processing stage may include an attribute value standardization/normalization module (306) to normalize the mapping and generate an attribute distribution function to rank the trained one or more attributes.
[0072] In an embodiment, the system (108) may include a model training stage, where a feature selection module (308), an attribute weight assignment and ranking module (310), and a similarity function mapping module (312) may be included. The feature selection module (308) may select various features from the one or more attributes of the plurality of entities. The attribute weight assignment and ranking module (310) may rank the trained one or more attributes using a weighted distribution function for a particular domain. The similarity function mapping module (312) may generate a similarity function mapping of the one or more attributes based on the ranking. Further, the system (108) may generate, via an AI engine (e.g., 110 or 214), a trained model based on the similarity function mapping and recommend the one or more products to the user (102).
[0073] In an embodiment, the system (108) may include a model performance stage, where the generated trained model may be analyzed using a MAE module (314) and a RMSE module (316).
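By way of illustration only, the following minimal Python sketch evaluates held-out predictions with MAE and RMSE, in line with the modules (314, 316); the prediction values are illustrative assumptions.

```python
# Sketch of the model-performance stage: score held-out predictions
# with MAE and RMSE. The values below are hypothetical.
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

y_true = np.array([4.0, 3.5, 5.0, 2.0])  # observed relevance scores
y_pred = np.array([3.8, 3.9, 4.6, 2.5])  # scores from the trained model

mae = mean_absolute_error(y_true, y_pred)
rmse = np.sqrt(mean_squared_error(y_true, y_pred))
print(f"MAE={mae:.3f}  RMSE={rmse:.3f}")
```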
[0074] In an embodiment, the system (108) may generate the similarity-based recommendation results (318) based on the trained model and recommend the one or more products to the user (102).
[0075] In an embodiment, the system (108) may calculate the attribute distribution function as follows:
[0076] If the attributes are numerical, the system (108) may calculate a Minkowski distance as follows:

$$D(X, Y) = \left( \sum_{i=1}^{n} \left| x_i - y_i \right|^p \right)^{1/p}$$

If p = 1, the distance is treated as a Manhattan distance; if p = 2, it is treated as a Euclidean distance. Applying the attribute weights $w_i$ to each term yields the weighted norm of the vectors X and Y:

$$D_W(X, Y) = \left( \sum_{i=1}^{n} w_i \left| x_i - y_i \right|^p \right)^{1/p}$$

[0077] Further, if the attributes are textual, the system (108) may calculate the cosine similarity as follows:

$$\text{Cosine similarity} = \frac{A \cdot B}{\lVert A \rVert \, \lVert B \rVert}$$

where $A \cdot B$ denotes the dot product of vectors A and B.

[0078] Further, the system (108) may apply a weight function W in the cosine similarity:

$$\text{sim}_W(A, B) = \frac{\sum_{i=1}^{n} w_i \, a_i b_i}{\sqrt{\sum_{i=1}^{n} w_i \, a_i^2} \, \sqrt{\sum_{i=1}^{n} w_i \, b_i^2}}$$
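By way of illustration only, the following minimal Python sketch implements the weighted Minkowski distance and weighted cosine similarity given above; the weight vector and attribute values are illustrative assumptions and do not form part of the disclosure.

```python
# Sketch implementing the formulas above: weighted Minkowski distance
# and weighted cosine similarity over attribute vectors, with the
# weight vector W taken from the attribute distribution function.
import numpy as np

def weighted_minkowski(x: np.ndarray, y: np.ndarray,
                       w: np.ndarray, p: float = 2.0) -> float:
    # (sum_i w_i * |x_i - y_i|^p)^(1/p); p=1 Manhattan, p=2 Euclidean.
    return float(np.sum(w * np.abs(x - y) ** p) ** (1.0 / p))

def weighted_cosine(a: np.ndarray, b: np.ndarray, w: np.ndarray) -> float:
    # Weighted dot product divided by the product of weighted norms.
    num = np.sum(w * a * b)
    den = np.sqrt(np.sum(w * a * a)) * np.sqrt(np.sum(w * b * b))
    return float(num / den)

w = np.array([0.7, 0.3])  # e.g. movie-domain weights from Table 1
x, y = np.array([28.0, 1.0]), np.array([30.0, 0.0])
print(weighted_minkowski(x, y, w))
print(weighted_cosine(x, y, w))
```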
[0079] FIG. 4 illustrates an exemplary computer system (400) in which or with which embodiments of the present disclosure may be implemented.
[0080] As shown in FIG. 4, the computer system (400) may include an external storage device (410), a bus (420), a main memory (430), a read-only memory (440), a mass storage device (450), a communication port(s) (460), and a processor (470). A person skilled in the art will appreciate that the computer system (400) may include more than one processor and communication ports. The processor (470) may include various modules associated with embodiments of the present disclosure. The communication port(s) (460) may be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. The communication port(s) (460) may be chosen depending on a network, such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which the computer system (400) connects.
[0081] In an embodiment, the main memory (430) may be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. The read-only memory (440) may be any static storage device(s) e.g., but not limited to, a Programmable Read Only Memory (PROM) chip for storing static information e.g., start-up or basic input/output system (BIOS) instructions for the processor (470). The mass storage device (450) may be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces).
[0082] In an embodiment, the bus (420) may communicatively couple the processor(s) (470) with the other memory, storage, and communication blocks. The bus (420) may be, e.g., a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), universal serial bus (USB), or the like, for connecting expansion cards, drives, and other subsystems, as well as other buses, such as a front side bus (FSB), which connects the processor (470) to the computer system (400).
[0083] In another embodiment, operator and administrative interfaces, e.g., a display, keyboard, and cursor control device may also be coupled to the bus (420) to support direct operator interaction with the computer system (400). Other operator and administrative interfaces can be provided through network connections connected through the communication port(s) (460). Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system (400) limit the scope of the present disclosure.
[0084] While considerable emphasis has been placed herein on the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the disclosure. These and other changes in the preferred embodiments of the disclosure will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be implemented merely as illustrative of the disclosure and not as a limitation.
ADVANTAGES OF THE INVENTION
[0085] The present disclosure provides a system and a method for automatically suggesting product features to a user and their prior distribution for a domain, and for identifying the right type of similarity function.
[0086] The present disclosure provides a system and a method that utilizes an attribute value weighting technique to build clusters with stronger intra-similarities to provide a better clustering performance.
[0087] The present disclosure provides a system and a method to identify and rank attributes automatically that will help in finding the similarity function accurately.
[0088] The present disclosure provides a system and a method that dynamically ranks premium features with weighted distribution in the dataset and automatically derives the appropriate similarity function. This helps in recommending the product that the user is most likely to buy, thereby increasing the conversion rate and the overall profit.
[0089] The present disclosure provides a system and a method that uses a knowledge-aware artificial intelligence (AI) model for generating complex deep learning models and automatically suggesting product features to the user.
[0090] The present disclosure provides a system and a method that uses causal modelling to uncover cause-effect relationships, whereby the attribute distribution function will change and the model will be run again to update the results for better similarity matching.
[0091] The present disclosure provides a system and a method that minimizes dependency on complex AI systems to drive similarity measures while increasing the model performance and efficiency.
[0092] The present disclosure provides a system and a method that increases predictive performance and resource optimization by learning prior weighted distribution of the important features.
CLAIMS
1. A system (108) for recommending one or more products, the system (108) comprising:
a processor (202); and
a memory (204) operatively coupled with the processor (202), wherein said memory (204) stores instructions which, when executed by the processor (202), cause the processor (202) to:
receive a data parameter from a computing device (104) associated with a user (102), wherein the computing device (104) is connected to the processor (202) via a network (106), and wherein the data parameter comprises one or more attributes associated with a plurality of entities;
train the one or more attributes using a primary technique;
rank the trained one or more attributes using a weighted distribution function for a domain;
generate a similarity function mapping of the one or more attributes based on the ranking;
generate, via an artificial intelligence (AI) engine (110), a trained model based on the similarity function mapping; and
provide, via the trained model, recommendations of the one or more products to the user (102).
2. The system (108) as claimed in claim 1, wherein the processor (202) is to generate a mapping of the one or more attributes with the plurality of entities.
3. The system (108) as claimed in claim 2, wherein the processor (202) is to normalize the mapping and generate an attribute distribution function to rank the trained one or more attributes.
4. The system (108) as claimed in claim 1, wherein the primary technique comprises at least one of: an XGBoost model and a Random Forest model.
5. The system (108) as claimed in claim 1, wherein the processor (202) is to use a Minkowski distance technique to generate the similarity mapping when the one or more attributes are numerical in nature.
6. The system (108) as claimed in claim 1, wherein the processor (202) is to use a cosine similarity technique to generate the similarity mapping when the one or more attributes are textual in nature.
7. The system (108) as claimed in claim 1, wherein the processor (202) is to remodel the trained model based on a variation in the one or more attributes.
8. The system (108) as claimed in claim 1, wherein the processor (202) is to assess a performance of the trained model using at least one of: a mean absolute error (MAE) and a root mean square error (RMSE).
9. A method for recommending one or more products, the method comprising:
receiving, by a processor (202) associated with a system (108), a data parameter from a user (102), wherein the data parameter comprises one or more attributes associated with a plurality of entities;
training, by the processor (202), the one or more attributes using a primary technique;
ranking, by the processor (202), the trained one or more attributes using a weighted distribution function for a domain;
generating, by the processor (202), a similarity function mapping of the one or more attributes based on the ranking;
generating, by the processor (202), via an artificial intelligence (AI) engine (110), a trained model based on the similarity function mapping; and
providing, by the processor (202) via the trained model, recommendations of the one or more products to the user (102).
10. The method as claimed in claim 9, comprising generating, by the processor (202), a mapping of the one or more attributes with the plurality of entities.
11. The method as claimed in claim 10, comprising normalizing, by the processor (202), the mapping and generating an attribute distribution function to rank the trained one or more attributes.
12. The method as claimed in claim 9, wherein the primary technique comprises at least one of: an XGBoost model and a Random Forest model.
13. The method as claimed in claim 9, comprising using, by the processor (202), a Minkowski distance technique to generate the similarity mapping when the one or more attributes are numerical in nature.
14. The method as claimed in claim 9, comprising using, by the processor (202), a cosine similarity technique to generate the similarity mapping when the one or more attributes are textual in nature.
15. The method as claimed in claim 9, comprising remodelling, by the processor (202), the trained model based on a variation in the one or more attributes.
16. The method as claimed in claim 9, comprising assessing, by the processor (202), a performance of the trained model using a mean absolute error (MAE) and a root mean square error (RMSE).
17. A user equipment (UE) (104) for receiving recommendations of one or more products, the UE (104) comprising:
one or more processors communicatively coupled to a processor (202) of a system (108), wherein the one or more processors are coupled with a memory, and wherein said memory stores instructions which, when executed by the one or more processors, cause the one or more processors to:
transmit a data parameter to the processor (202) via a network (106); and
receive recommendations from the processor (202),
wherein the processor (202) is configured to:
receive the data parameter from the UE (104), wherein the data parameter comprises one or more attributes associated with a plurality of entities;
train the one or more attributes using a primary technique;
rank the trained one or more attributes using a weighted distribution function for a particular domain;
generate a similarity function mapping of the one or more attributes based on the ranking;
generate, via an artificial intelligence (AI) engine (110), a trained model based on the similarity function mapping; and
provide, via the trained model, recommendations of the one or more products to a user (102) associated with the UE (104).