
System And Method For Processing Search Queries

Abstract: The present disclosure relates to a system (100) and a method (400) for processing search queries. The method (400) includes the steps of: receiving a search query from a user via a User Equipment (UE) (205); modifying, utilizing a Machine Learning (ML) unit, characters of the search query based on one or more pre-defined rules; extracting at least one of one or more categories, sub-categories and attributes from the modified characters of the search query; creating, utilizing an embedding model, a vector embedding of the extracted categories, sub-categories and attributes; generating one or more metadata filters for the extracted categories, sub-categories and attributes; storing the generated one or more metadata filters along with the created vector embedding in a cache data store and a vector database (120); and, in response to receiving one or more subsequent search queries from the user, displaying via the UE (205) one or more relevant search results pertaining to one or more items. Ref. FIG. 1


Patent Information

Filing Date
02 August 2024
Publication Number
16/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE

Applicants

TATA DIGITAL PVT. LTD
Army & Navy Building, 148 M.G. Road, Opp. Kala Ghoda, Fort, Mumbai – 400001, India

Inventors

1. Akash Anandan
Army & Navy Building, 148 M.G. Road, Opp. Kala Ghoda, Fort, Mumbai – 400001, India
2. Vikash Kumar
Army & Navy Building, 148 M.G. Road, Opp. Kala Ghoda, Fort, Mumbai – 400001, India
3. Vikrant Sharma
Army & Navy Building, 148 M.G. Road, Opp. Kala Ghoda, Fort, Mumbai – 400001, India
4. Nagarajan Karuppiah
Army & Navy Building, 148 M.G. Road, Opp. Kala Ghoda, Fort, Mumbai – 400001, India

Specification

Description:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003

COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. TITLE OF THE INVENTION
SYSTEM AND METHOD FOR PROCESSING SEARCH QUERIES

2. APPLICANT(S)
NAME: Tata Digital Pvt. Ltd.
NATIONALITY: Indian
ADDRESS: Army & Navy Building, 148 M.G. Road, Opp. Kala Ghoda, Fort, Mumbai – 400001, India
3. PREAMBLE TO THE DESCRIPTION

THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE NATURE OF THIS INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED.

FIELD OF THE INVENTION
[0001] The present invention relates to search systems and methods, and more particularly to a method and a system for processing search queries.
BACKGROUND OF THE INVENTION
[0002] With the advent of the internet, search engines are widely used for retrieval of any kind of information. Search engines have traditionally been driven by catalog hierarchy, search, and filtering, where the search is primarily keyword-based and works on the reverse index of the product catalog. However, the search results obtained may still not be relevant to the user.
[0003] Further, with the introduction of conversational (chat) systems in search engines, a set of well-defined conversation flows gets initiated based on certain triggers, and subsequently guides a user step-by-step towards a desired outcome or query resolution. While these systems have proved to be very useful for certain use cases, they are not very conducive towards a free-flowing conversation for search systems/engines, thereby again not providing relevant search results to the user.
[0004] In view of the above, there is a dire requirement for systems and methods for processing search queries which are efficient and which solve the above-mentioned drawbacks.
SUMMARY OF THE INVENTION
[0005] One or more embodiments of the present invention provide a method and a system for processing search queries.
[0006] In one aspect of the present invention, a method for processing search queries is provided. The method includes the step of receiving a search query from a user via a User Equipment (UE). The method further includes the step of modifying, utilizing a Machine Learning (ML) unit, characters of the search query based on one or more pre-defined rules. The method further includes the step of extracting at least one of one or more categories, sub-categories and attributes from the modified characters of the search query. The method further includes the step of creating, utilizing an embedding model, a vector embedding of the extracted at least one of the one or more categories, sub-categories and attributes from the modified characters of the search query. The method further includes the step of generating one or more metadata filters for the extracted at least one of the one or more categories, sub-categories and attributes. The method further includes the step of storing the generated one or more metadata filters along with the created vector embedding in a cache data store and a vector database, and, in response to receiving one or more subsequent search queries from the user, displaying via the UE one or more relevant search results pertaining to one or more items.
[0007] In an embodiment, the search query is inputted by the user via the UE by input means including at least one of, touchscreen, keypad and voice instructions.
[0008] In an embodiment, the one or more pre-defined rules for modifying the characters of the search query include at least one of, spelling correction and synonym substitution.
[0009] In an embodiment, the step of extracting at least one of, one or more categories, sub-categories and attributes from the modified characters of the search query, includes the steps of comparing keywords from the search query with a catalogue of categories, sub-categories and attributes to extract at least one of, the categories, sub-categories and attributes.
[0010] In an embodiment, the step of, in response to receiving one or more subsequent search queries from the user, displaying via the UE one or more relevant search results includes the steps of, identifying one or more keywords from the one or more subsequent search queries, checking at the cache data store if the one or more metadata filters and the vector embedding are present based on the identified one or more keywords from the one or more subsequent search queries, retrieving from the cache data store, the one or more metadata filters and the vector embedding if determined present at the cache data store, retrieving from the vector database, the one or more metadata filters and the vector embedding if determined not present at the cache data store, identifying the one or more relevant search results from at least one of the vector database and one or more external databases based on retrieving the one or more metadata filters and the vector embedding.
[0011] In an embodiment, the method further comprises the steps of re-ranking the one or more search results in an order of relevance based on at least one of, frequency of occurrence of keywords from the one or more subsequent search queries. The one or more search results are placed prior in the order which have a corresponding frequency of occurrence of keywords higher than the rest of the keywords for the given user.
[0012] In an embodiment, the one or more search results are re-ranked by applying a weightage value for each of the one or more search results.
[0013] In an embodiment, the method further comprises the step of providing a conversational chat medium for the user via the UE to communicate with respect to search queries, feedback and complaints.
[0014] In an embodiment, the step of receiving a search query from a user via a User Equipment (UE) includes the steps of, receiving at least a portion of the search query from the user via the UE, prompting the user via the UE to receive one or more portions of the search query from the user, and receiving the one or more portions of the search query based on prompting, thereby receiving the complete search query from the user.
[0015] In an embodiment, the prompting includes at least one of, requesting the user to answer questions based on the received at least the portion of the search query.
[0016] In an embodiment, the method further includes the steps of periodically monitoring one or more external databases to locate the one or more items which are not present in the vector database, retrieving the one or more items located from the one or more external databases, converting the format of the retrieved one or more items into an acceptable format, and periodically updating the vector database with the located one or more items.
[0017] In another aspect of the present invention, a system for processing search queries is provided. The system includes a transceiver unit configured to receive a search query from a user via a User Equipment (UE). The system further includes a modifying unit configured to modify, utilizing a Machine Learning (ML) unit, characters of the search query based on one or more pre-defined rules. The system further includes an extraction unit configured to extract at least one of one or more categories, sub-categories and attributes from the modified characters of the search query. The system further includes a creating unit configured to create, utilizing an embedding model, a vector embedding of the extracted at least one of the one or more categories, sub-categories and attributes. The system further includes a generating unit configured to generate one or more metadata filters for the extracted at least one of the one or more categories, sub-categories and attributes. The system further includes a storing unit configured to store the generated one or more metadata filters along with the created vector embedding in a cache data store and a vector database. In response to receiving one or more subsequent search queries from the user, a display module of the system is configured to display via the UE one or more relevant search results pertaining to one or more items.
[0018] In another aspect of the present invention, a User Equipment (UE) is disclosed. The UE includes one or more primary processors communicatively coupled to one or more processors. The one or more primary processors are coupled with a memory. The memory stores instructions which, when executed by the one or more primary processors, cause the UE to transmit a search query and one or more subsequent search queries to the one or more processors.
[0019] Other features and aspects of this invention will be apparent from the following description and the accompanying drawings. The features and advantages described in this summary and in the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the relevant art, in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the inventive subject matter, resort to the claims being necessary to determine such inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
[0021] FIG. 1 is an exemplary system diagram for processing search queries, according to one or more embodiments of the present disclosure;
[0022] FIG. 2 is a schematic representation of a workflow of the system of FIG. 1 communicably coupled with a User Equipment (UE), according to one or more embodiments of the present disclosure;
[0023] FIG. 3 is an exemplary block diagram of an architecture of the system of the FIG. 1, according to one or more embodiments of the present disclosure; and
[0024] FIG. 4 is a flow chart illustrating a method for processing search queries, according to one or more embodiments of the present disclosure.
[0025] The foregoing shall be more apparent from the following detailed description of the invention.

DETAILED DESCRIPTION OF THE INVENTION
[0026] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
[0027] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure including the definitions listed here below are not intended to be limited to the embodiments illustrated but is to be accorded the widest scope consistent with the principles and features described herein.
[0028] A person of ordinary skill in the art will readily ascertain that the illustrated steps detailed in the figures and here below are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0029] FIG. 1 illustrates an exemplary block diagram of a system 100 for processing search queries, according to one or more embodiments of the present disclosure.
[0030] As per the illustrated embodiment, the system 100 includes one or more processors 105, a memory 110, a user interface 115, a vector database 120, and a cache data store 125. For the purpose of description and explanation, the description will be explained with respect to one processor 105 and should nowhere be construed as limiting the scope of the present disclosure. In alternate embodiments, the system 100 may include more than one processor 105 as per the requirement of the network 200 (as shown in FIG. 2). The one or more processors 105, hereinafter referred to as the processor 105, may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, single board computers, and/or any devices that manipulate signals based on operational instructions.
[0031] As per the illustrated embodiment, the processor 105 is configured to fetch and execute computer-readable instructions stored in the memory 110. The memory 110 may be configured to store one or more computer-readable instructions or routines in a non-transitory computer-readable storage medium, which may be fetched and executed to perform the operations of the system 100 described herein. The memory 110 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as disk memory, EPROMs, FLASH memory, unalterable memory, and the like.
[0032] In an embodiment, the user interface 115 includes a variety of interfaces, for example, interfaces for data input and output devices, referred to as Input/Output (I/O) devices, storage devices, and the like. The user interface 115 facilitates communication of the system 100. In one embodiment, the user interface 115 provides a communication pathway for one or more components of the system 100.
[0033] In an embodiment, the vector database 120 is one of, but not limited to, a centralized database, a cloud-based database, a commercial database, an open-source database, a distributed database, an end-user database, a graphical database, a Not-only Structured Query Language (NoSQL) database, an object-oriented database, a personal database, an in-memory database, a document-based database, a time series database, a wide column database, a key value database, a search database, a cache database, and so forth. The foregoing examples of vector database 120 types are non-limiting and are not mutually exclusive, e.g., the vector database 120 can be both commercial and cloud-based, or both distributed and open-source, etc.
[0034] In order for the system 100 to process search queries, the processor 105 includes one or more modules/units. In one embodiment, the one or more modules include, but are not limited to, a transceiver unit 130, a prompting unit 135, an integration unit 140, a modifying unit 145, an extraction unit 150, a creating unit 155, a generating unit 160, a storing unit 165, an identification unit 170, a checking unit 175, a retrieving unit 180, a re-ranking unit 185, a chat creation unit 190, a monitoring unit 195 and a display module 198.
[0035] The transceiver unit 130, the integration unit 140, the modifying unit 145, the extraction unit 150, the creating unit 155, the generating unit 160, the storing unit 165, the identification unit 170, the checking unit 175, the retrieving unit 180, the re-ranking unit 185, the chat creation unit 190, the monitoring unit 195 and the display module 198, in an embodiment, may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor 105. In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor 105 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processor 105 may comprise a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the memory 110 may store instructions that, when executed by the processing resource, implement the processor 105. In such examples, the system 100 may comprise the memory 110 storing the instructions and the processing resource to execute the instructions, or the memory 110 may be separate but accessible to the system 100 and the processing resource. In other examples, the processor 105 may be implemented by electronic circuitry.
[0036] The transceiver unit 130 is configured to receive a search query from the user via a User Equipment (UE) 205 (as shown in FIG. 2). In an embodiment, the search query is inputted by the user via the UE 205 by input means including at least one of, touchscreen, keypad and voice instructions. In an embodiment, the touchscreen is a user interface device that detects and responds to touch inputs on its display screen and employs capacitive and/or resistive technology to register the location and pressure of touch points accurately. Further, the users interact with the touchscreen by physically contacting its surface, enabling the input of, but not limited to, commands, text entry, and/or selection of options through intuitive gestures such as, but not limited to, tapping, swiping, and pinching.
[0037] In an embodiment, the UE 205 includes a physical keypad and an on-screen virtual keypad. The users can input search queries by pressing keys corresponding to letters or numbers. The physical keypad method is common in devices like feature phones and/or industrial equipment. Further, the users can input the search query by speaking to the UE 205, whereby the search query is recognized using speech recognition capabilities of the UE 205. The voice instruction method is convenient for hands-free operation and accessibility purposes.
[0038] In an embodiment, upon receiving at least a portion of the search query by the transceiver unit 130 from the user via the UE 205, the integration unit 140 of the system 100 is configured to prompt the user to receive one or more additional portions of the search query. In an embodiment, the integration unit 140 prompts the user to provide the one or more additional portions of the search query based on techniques such as, but not limited to, questions to which the user is required to provide answers. After receiving the one or more additional portions of the search query, the integration unit 140 is configured to integrate all the one or more portions of the search query to create a single complete search query.
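As an illustrative sketch only (the specification contains no code), the integration performed by the integration unit (140) might look like the following; the function name and the whitespace-joining strategy are assumptions:

```python
# Illustrative sketch: the integration unit (140) is described as combining
# the portions of a search query received across prompts into one complete
# query. Joining the portions with whitespace is an assumed strategy.

def integrate_query_portions(portions: list[str]) -> str:
    """Combine the non-empty query portions into a single complete query."""
    return " ".join(p.strip() for p in portions if p.strip())
```

For example, a first portion "blue watch" followed by a prompted answer "price range INR 5000-10000" would yield the complete query "blue watch price range INR 5000-10000".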
[0039] Upon receiving the complete search query from the user, the modifying unit 145 is configured to modify characters of the search query based on one or more pre-defined rules utilizing a Machine Learning (ML) unit. In an embodiment, the one or more pre-defined rules include rules for modifying the characters of the search query. For example, the one or more pre-defined rules include spelling correction rules and synonym substitution rules. The modifying unit 145 modifies the characters of the search query by comparing the search query with one or more synonyms. Further, if any spelling corrections are required, the modifying unit 145 will compare the characters of the search query with a catalogue of words with correct spellings. Based on the comparison, the modifying unit 145 will correct spelling mistakes and also replace words of the search query with the one or more synonyms in order to increase the efficiency of generating search results using the search query.
[0040] In an embodiment, the spelling correction refers to the process of identifying and rectifying errors in the spelling of words within the search query and/or any textual input. The spelling correction aims to improve the accuracy and relevance of the search query by ensuring that the words are correctly spelt according to standard language rules, and addresses, but is not limited to, typographical errors, common misspellings, and contextual spelling corrections. The spelling correction processes typically leverage dictionaries, language models, and sometimes machine learning techniques to identify and suggest corrections for misspelled words.
[0041] In an embodiment, the synonym substitution refers to replacing words in the search query with other words that have similar meanings to improve the relevance of search results.
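The spelling correction and synonym substitution described above can be sketched as a simple token-by-token rewrite. The specification attributes this to an ML unit (145); the hard-coded dictionaries below are assumptions for illustration only:

```python
# Illustrative sketch of the pre-defined rules: a small spelling catalogue
# and a synonym map, applied per token. A real ML unit would learn these
# mappings; the entries here are invented examples.
SPELLING_CATALOGUE = {"wach": "watch", "blu": "blue"}
SYNONYMS = {"timepiece": "watch", "navy": "blue"}

def modify_query(query: str) -> str:
    """Apply spelling correction, then synonym substitution, to each token."""
    corrected = []
    for word in query.lower().split():
        word = SPELLING_CATALOGUE.get(word, word)  # spelling correction
        word = SYNONYMS.get(word, word)            # synonym substitution
        corrected.append(word)
    return " ".join(corrected)
```

For instance, "blu timepiece" would be normalised to "blue watch" before extraction.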
[0042] Upon modifying the search query based on the one or more pre-defined rules, the extraction unit 150 is configured to extract at least one of one or more categories, sub-categories and attributes from the modified characters of the search query. In an embodiment, the extraction unit 150 extracts the one or more categories, sub-categories and attributes from the modified characters of the search query by comparing keywords from the search query with a catalogue of categories, sub-categories and attributes. Thereafter, the extraction unit 150 extracts at least one of the one or more categories, sub-categories and attributes based on the comparison. Let us consider that the search query input by the user is “blue watch of price range INR 5000-10,000”. In this regard, the extraction unit 150 may extract categories/sub-categories/attributes such as “blue”, “watch” and “price range of INR 5000-10,000”.
[0043] In one embodiment, the categories and sub-categories help to organize and classify search results into meaningful groups. The categories refer to broad classifications and/or groupings to which items and/or information may belong. The sub-categories are subdivisions within broader categories and provide more specific classifications of items and/or information. The attributes refer to specific characteristics and properties that describe items and/or information and provide details about at least some of, but not limited to, features, qualities, and specifications of the product and/or topic.
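A minimal sketch of the catalogue-comparison step performed by the extraction unit (150) follows; the catalogue contents and the numeric price-range pattern are assumptions for illustration:

```python
import re

# Illustrative sketch: keywords from the modified query are compared
# against a (tiny, invented) catalogue of categories and attributes.
CATALOGUE = {
    "categories": {"watch", "phone"},
    "attributes": {"blue", "red", "leather"},
}

def extract(query: str) -> dict:
    """Return catalogue matches plus any detected price range."""
    found = {kind: [] for kind in CATALOGUE}
    for word in query.lower().split():
        for kind, entries in CATALOGUE.items():
            if word in entries:
                found[kind].append(word)
    # A price range is matched as a numeric pattern rather than by
    # catalogue lookup.
    match = re.search(r"\d[\d,]*\s*-\s*\d[\d,]*", query)
    if match:
        found["price_range"] = [match.group(0)]
    return found
```

For the query "blue watch of price range INR 5000-10000", this yields categories ["watch"], attributes ["blue"] and price_range ["5000-10000"], mirroring the example above.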
[0044] Upon extracting the one or more categories, sub-categories and attributes of the search query, the creating unit 155 is configured to create utilizing an embedding model, vector embedding of the extracted at least one of, the one or more categories, sub-categories and attributes from the modified characters of the search query.
[0045] In an embodiment, the embedding model is a method to transform at least one of, but not limited to, words, phrases, and/or text into a numerical vector embedding. The vector embedding represents the semantic relationships and contextual meanings of the categories, sub-categories and attributes in a mathematical format and helps machines to better understand and analyze language.
[0046] In another embodiment, the vector embedding refers to a technique used in machine learning and natural language processing to represent words, phrases, or documents as numerical vectors in a continuous vector space. The vector embedding includes at least one of, but not limited to, word embeddings, sentence or document embeddings, and image embeddings. Additionally, the vector embeddings provide a powerful way to represent and operate on complex data types such as, but not limited to, text and images in a computationally efficient manner, also facilitating a wide range of Machine Learning (ML) and Artificial Intelligence (AI) applications.
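To make the shape of the operation concrete (text in, normalised vector out), here is a toy stand-in for the embedding model. A production system would use a trained embedding model; the bag-of-words hashing scheme below is purely an illustrative assumption:

```python
import hashlib
import math

# Toy embedding: hash each word into one of `dim` buckets, count
# occurrences, then L2-normalise. This only illustrates the interface of
# an embedding model, not its semantic quality.

def embed(text: str, dim: int = 8) -> list[float]:
    vec = [0.0] * dim
    for word in text.lower().split():
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    """Similarity between two normalised embeddings (dot product)."""
    return sum(x * y for x, y in zip(a, b))
```

Vector databases typically rank stored embeddings by such a similarity score against the query embedding.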
[0047] In an embodiment, the generating unit 160 is configured to generate one or more metadata filters for the extracted at least one of the one or more categories, sub-categories and attributes. In an embodiment, the metadata filters facilitate organizing and categorizing search results more effectively. Further, the metadata filters help the users to narrow down the search based on specific criteria such as, but not limited to, price and features. In the above-mentioned example, for the extracted categories/sub-categories/attributes “watch”, “blue” and “price range INR 5000-10,000”, the corresponding metadata filters are generated.
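The metadata-filter generation can be sketched as mapping the extracted fields to filter clauses. The Mongo-style operator syntax ("$in", "$gte", "$lte") below mirrors common vector-store filter dialects and is an assumption, not the specification's format:

```python
# Illustrative sketch of the generating unit (160): build filter clauses
# from the extracted categories/attributes/price range. Field names and
# operator syntax are assumptions for illustration.

def build_metadata_filters(extracted: dict) -> dict:
    filters = {}
    if extracted.get("categories"):
        filters["category"] = {"$in": extracted["categories"]}
    if extracted.get("attributes"):
        filters["attribute"] = {"$in": extracted["attributes"]}
    if extracted.get("price_range"):
        # "5000-10,000" -> lower/upper numeric bounds
        low, high = extracted["price_range"][0].replace(",", "").split("-")
        filters["price"] = {"$gte": int(low), "$lte": int(high)}
    return filters
```

For the running example, this would produce a category filter on "watch", an attribute filter on "blue", and a price filter bounded by 5000 and 10000.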
[0048] Upon generating the one or more metadata filters, the storing unit 165 is configured to store the generated one or more metadata filters along with the created vector embedding in the cache data store and the vector database 120.
[0049] In an embodiment, the vector database 120 refers to a storage system specifically designed to efficiently store and manage the vector embeddings along with the generated metadata filters. Further, the vector database 120 is frequently integrated with the cache data store 125. The cache data store 125 helps in fast retrieval and temporary storage of frequently accessed vector embeddings.
[0050] Further, in response to receiving one or more subsequent search queries from the user, the display module 198 is configured to display via the UE 205, one or more relevant search results pertaining to one or more items. In order to display the relevant search results, the identification unit 170 is configured to first identify one or more keywords from the one or more subsequent search queries. Thereafter, the checking unit 175 checks at the cache data store 125 if the one or more metadata filters and the vector embedding are present based on the identified one or more keywords. Thereafter, the retrieving unit 180 is configured to retrieve from the cache data store 125 the one or more metadata filters and the vector embedding if determined present at the cache data store 125. In the event the one or more metadata filters and the vector embedding are determined not to be present at the cache data store 125, the retrieving unit 180 retrieves the one or more metadata filters and the vector embedding from the vector database 120. Thereafter, the identification unit 170 is configured to identify the one or more relevant search results from at least one of the vector database 120 and one or more external databases based on the retrieved one or more metadata filters and the vector embedding.
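The cache-first lookup described in this paragraph can be sketched as follows; plain dictionaries stand in for the cache data store (125) and the vector database (120), and the cache-warming step is an assumption:

```python
# Illustrative sketch of the checking unit (175) / retrieving unit (180)
# flow: consult the cache data store first, fall back to the vector
# database, and (assumed) warm the cache on a miss that the vector
# database can serve.

class Retriever:
    def __init__(self, cache: dict, vector_db: dict):
        self.cache = cache
        self.vector_db = vector_db

    def lookup(self, keyword: str):
        entry = self.cache.get(keyword)          # fast path: cache data store
        if entry is None:
            entry = self.vector_db.get(keyword)  # fallback: vector database
            if entry is not None:
                self.cache[keyword] = entry      # warm the cache for next time
        return entry
```

A subsequent query with the same keyword is then served from the cache without touching the vector database.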
[0051] In an embodiment, the re-ranking unit 185 re-ranks the one or more search results in an order of relevance. The order of relevance is based on at least one of, but not limited to, the frequency of occurrence of keywords from the one or more subsequent search queries. The one or more search results which have a corresponding frequency of occurrence of keywords higher than the rest are placed earlier in the order for the given user. In an embodiment, the frequency of keyword occurrence refers to how regularly particular keywords appear across multiple search queries made by the same user, indicating topics of heightened interest or relevance to that user.
[0052] In an embodiment, the one or more search results are re-ranked by applying a weightage value to each of the one or more search results. The criteria for assigning the weightage value to each search result may include at least one of, but not limited to, relevance to the search query, user engagement metrics, freshness of content, and other contextual factors. Applying the weightage values to re-rank search results enhances the precision and effectiveness of search engine techniques, delivering more tailored and valuable content to users based on their respective search queries.
[0053] As an example, let us consider the same user who has input the search query “blue watch with price range of INR 5000-10000” for one or more subsequent search queries. In this regard, the re-ranking unit re-ranks the one or more search results based on the order of relevance. The order of relevance as indicated above is based on the frequency of occurrence of the keywords. In the present example, since the subsequent search queries have the keywords “watch, blue and price range of INR 5,000-10,000”, these keywords “watch, blue and price range of INR 5,000-10,000” would be given higher importance by assigning the weightage value and re-ranked higher compared to the rest of the keywords. To this effect, whenever the user in the future just provides the search query as “watch”, during this situation, the display module 198 is configured to display the one or more relevant search results, herein “blue watch in the price range of INR 5000-10000”.
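The frequency-based weightage described in the example above can be sketched as a simple additive score; the specific weighting scheme (sum of past-query keyword counts) is an assumption for illustration:

```python
from collections import Counter

# Illustrative sketch of the re-ranking unit (185): results whose words
# occur most frequently in the user's past queries receive a higher
# weightage value and are placed earlier in the order.

def re_rank(results: list[str], past_queries: list[str]) -> list[str]:
    freq = Counter(w for q in past_queries for w in q.lower().split())

    def weightage(result: str) -> int:
        # Sum the historical frequency of each word in the result.
        return sum(freq[w] for w in result.lower().split())

    return sorted(results, key=weightage, reverse=True)
```

With a history dominated by "blue watch" queries, a "blue watch" result outranks an unrelated "red phone" result, matching the behaviour described for the subsequent query "watch".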
[0054] Further, the chat creation unit 190 is configured to provide the conversational chat medium for the user via the UE 205 to communicate with respect to at least one of, but not limited to, search queries, feedback and complaints. The chat medium created advantageously ensures a more interactive user experience for the user to input search queries and receive relevant search results.
[0055] Further, the monitoring unit 195 is configured to periodically monitor one or more external databases to locate the one or more items which are not present in the vector database 120. Based on monitoring, the monitoring unit 195 is configured to retrieve the one or more items located from the one or more external databases. Thereafter, the formats of the retrieved one or more items are converted into an acceptable format. Further, the monitoring unit 195 is configured to periodically update the vector database 120 with the located one or more items.
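A minimal sketch of one pass of the monitoring unit (195) follows; dictionaries stand in for both databases, and the caller-supplied converter represents the format conversion step:

```python
# Illustrative sketch of the monitoring unit (195): locate items present
# in an external database but absent from the vector database, convert
# each to an acceptable format, and add it. The dict-based stores and the
# `convert` callable are assumptions for illustration.

def sync_external_items(vector_db: dict, external_db: dict, convert) -> int:
    """Add missing external items to the vector DB; return how many were added."""
    added = 0
    for item_id, item in external_db.items():
        if item_id not in vector_db:
            vector_db[item_id] = convert(item)
            added += 1
    return added
```

Scheduling this function periodically (e.g. from a cron-like job) reproduces the periodic update behaviour described above.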
[0056] FIG. 2 describes an embodiment of the system 100 of FIG. 1, according to various embodiments of the present invention. It is to be noted that the embodiment with respect to FIG. 2 is explained with respect to the UE 205 and the system 100 communicating via a network 200 for the purpose of description and illustration, and should nowhere be construed as limiting the scope of the present disclosure.
[0057] In an embodiment, the UE 205 is one of, but not limited to, any electrical, electronic or electro-mechanical equipment, or a combination of one or more such devices, such as virtual reality (VR) devices, augmented reality (AR) devices, a laptop, a general-purpose computer, a desktop, a personal digital assistant, a tablet computer, a mainframe computer, or any other computing device.
[0058] For the purpose of description and explanation, the description will be explained with respect to the UE 205 and should nowhere be construed as limiting the scope of the present disclosure. The UE 205 aids a user in interacting with the system 100. In alternate embodiments, the UE 205 may include a plurality of UEs as per the requirement. The UE 205 may include an external storage device, a bus, a main memory, a read-only memory, a mass storage device, communication port(s), and a processor. The exemplary embodiment as illustrated in FIG. 2 will be explained with respect to the UE 205 without limiting the scope of the present disclosure. The UE 205 includes one or more primary processors 210 communicably coupled to the one or more processors 105 of the system 100.
[0059] The one or more primary processors 210 are coupled with the memory 215 storing instructions which are executed by the one or more primary processors 210. Execution of the stored instructions by the one or more primary processors 210 enables the UE 205 to transmit the search query and one or more subsequent search queries to the one or more processors 105 for processing the search queries.
[0060] The network 200 may include, by way of example but not limitation, at least a portion of one or more networks having one or more nodes that transmit, receive, forward, generate, buffer, store, route, switch, or process one or more messages, packets, signals, waves, voltage or current levels, or some combination thereof.
[0061] The network 200 includes, by way of example but not limitation, one or more of a wireless network, a wired network, an internet, an intranet, a public network, a private network, a packet-switched network, a circuit-switched network, an ad hoc network, an infrastructure network, a Public-Switched Telephone Network (PSTN), a cable network, a cellular network, a satellite network, a fiber optic network, or some combination thereof. The network 200 may include, but is not limited to, a Third Generation (3G), a Fourth Generation (4G), a Fifth Generation (5G), a Sixth Generation (6G), a New Radio (NR), a Narrow Band Internet of Things (NB-IoT), an Open Radio Access Network (O-RAN), and the like.
[0062] The network 200 may further include, by way of example but not limitation, a Voice over Internet Protocol (VoIP) network.
[0063] As per the illustrated embodiment, the system 100 includes the one or more processors 105, the memory 110, the user interface 115, the vector database 120, and the cache data store 125. The operations and functions of the one or more processors 105, the memory 110, the user interface 115, and the vector database 120 are already explained in FIG. 1. For the sake of brevity, a similar description related to the working and operation of the system 100 as illustrated in FIG. 2 has been omitted to avoid repetition.
[0064] FIG. 3 is an exemplary architecture 300 which can be deployed in the system 100 for processing search queries, according to one or more embodiments of the present invention. The exemplary architecture as illustrated in FIG. 3 includes a user interface 115, an LLM (Large Language Model)-based bot framework 305, an LLM agent 310, an item search service 315, a vector database 120, backend data Application Programming Interfaces (APIs) 320 and a batch processor 325.
[0065] In one embodiment, upon receiving the search query from the user via the UE 205, the user interface 115 serves as the primary medium for the users to interact with the LLM-based bot framework 305. The user interface 115 collects user inputs such as, but not limited to, text messages, clicks on buttons, and/or selection of options presented by the bot. Further, the input is processed by the LLM-based bot framework 305 and transmitted to the LLM agent 310.
[0066] Upon receiving the input from the LLM-based bot framework 305, the LLM agent 310 facilitates initiating chat conversations and managing the conversation between the users and the LLM-based bot framework 305. The LLM agent 310, using ReAct prompting, generates prompts and responses that dynamically adjust based on various factors such as, but not limited to, user input, context of the conversation, and user behavior. Further, the LLM agent 310 processes user messages to understand intents and extract relevant information, and once the LLM agent 310 generates the response based on ReAct prompting principles, the LLM agent 310 sends the response back to the LLM-based bot framework 305.
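The ReAct (Reason + Act) prompting loop referred to above can be sketched minimally in Python. The `scripted_llm` stand-in, the `Action:`/`Final Answer:` markers and the `search` tool are illustrative assumptions; a real deployment would invoke a hosted language model rather than a scripted function:

```python
def react_agent(question, tools, llm):
    """Minimal ReAct-style loop: the model alternates reasoning/tool
    calls ("Action") with tool outputs ("Observation") until it emits
    a final answer."""
    transcript = f"Question: {question}"
    for _ in range(3):  # bounded number of reasoning steps
        step = llm(transcript)
        transcript += "\n" + step
        if step.startswith("Final Answer:"):
            return step.removeprefix("Final Answer: ")
        if step.startswith("Action:"):
            name, arg = step.removeprefix("Action: ").split("|", 1)
            observation = tools[name](arg)       # call the chosen tool
            transcript += f"\nObservation: {observation}"
    return None

def scripted_llm(transcript):
    # Stand-in for an LLM: first call a tool, then answer.
    if "Observation" not in transcript:
        return "Action: search|blue watch"
    return "Final Answer: found 3 blue watches"

tools = {"search": lambda q: f"3 results for '{q}'"}
answer = react_agent("find blue watches", tools, scripted_llm)
```

The step cap on the loop is one way to keep agent behaviour bounded; the disclosure does not specify such a limit.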
[0067] In an embodiment, the LLM-based bot framework 305 receives the responses generated by the LLM agent 310, formats them appropriately for the specific communication channel, at least one of, but not limited to, text, cards and buttons, and then delivers them to the user interface 115. Additionally, the LLM agent 310 decides which tools to call based on the descriptions of the components in the architecture 300 and transmits to the item search service 315.
[0068] In an embodiment, the LLM agent 310 identifies relevant items and uses the item search tool to refine the user search queries. The tool ensures the search queries are clear, free of spelling errors, and include only supported item attributes such as categories, sub-categories and attributes, and then refines the input. The tool thereafter internally calls the item search service 315 with the updated queries.
[0069] Upon refining the input in the item search service 315 with the updated search queries, the item search service 315 returns a well-formatted JavaScript Object Notation (JSON) response containing details such as, but not limited to, item titles, images, price range, brand information, etc. The JSON data is presented to the user in the form of adaptive cards within the LLM-based bot framework 305, thereby facilitating the structured display of the search results.
[0070] In an embodiment, the users interact with the LLM-based bot framework 305 to view, explore, or select items. Upon selecting the items, the users are directed to the corresponding item page on the website to finalize the processing. Further, the LLM agent 310 retains selected item details in its memory, registers every interaction between the user and the LLM-based bot framework 305, and allows users to return and inquire further about the selected items. The LLM agent 310 stores only the most recent chat texts/messages in the memory and efficiently manages token consumption while retaining recent interactions for contextually relevant user assistance.
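Retaining only the most recent messages within a token budget, as described above, can be sketched with a simple bounded buffer. The word-count token proxy and the budget value are assumptions for illustration; production systems would use the model's actual tokenizer:

```python
from collections import deque

class ChatMemory:
    """Keep only the most recent chat messages within a token budget,
    dropping the oldest messages first."""
    def __init__(self, max_tokens=100):
        self.max_tokens = max_tokens
        self.messages = deque()
        self.tokens = 0

    @staticmethod
    def count(msg):
        return len(msg.split())  # crude stand-in for a real tokenizer

    def add(self, msg):
        self.messages.append(msg)
        self.tokens += self.count(msg)
        # Evict from the front until we are back under budget.
        while self.tokens > self.max_tokens:
            dropped = self.messages.popleft()
            self.tokens -= self.count(dropped)

mem = ChatMemory(max_tokens=4)
mem.add("hello there")
mem.add("blue watch please")  # pushes total to 5 tokens, evicts oldest
```

Evicting whole messages from the oldest end (rather than truncating mid-message) keeps each retained interaction intact for contextually relevant assistance.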
[0071] In an embodiment, the backend data APIs 320 fetch information from the vector database 120 and retrieve specific data based on search queries provided by client applications. The APIs allow client applications to create new data entries and/or update existing ones in the backend data stores. The backend data APIs 320 serve as the bridge between client applications and backend data stores, enabling continuous data access, manipulation, and management in modern software architectures.
[0072] In an embodiment, the batch processor 325 collects a group and/or batch of data items and processes them together as a single unit. Batch processing contrasts with real-time processing, in which tasks are executed immediately upon arrival. Batch processing is frequently employed whenever efficiency is crucial, enabling the system to handle substantial data volumes and/or execute demanding tasks in a regulated manner.
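The grouping step at the heart of such a batch processor can be sketched in a few lines; the batch size is an arbitrary illustrative choice:

```python
def make_batches(items, batch_size):
    """Group incoming items into fixed-size batches, each processed
    together as a single unit rather than one by one on arrival."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

batches = make_batches(list(range(5)), 2)
```

Five items with a batch size of two yield three batches, the last one partial; a real batch processor would then hand each batch to a worker in a regulated (e.g. scheduled or rate-limited) manner.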

[0073] FIG. 4 is a flow chart illustrating a method 400 for processing search queries, according to various embodiments of the present invention. The method 400 is adapted for processing search queries. For the purpose of description, the method 400 is described with the embodiments as illustrated in FIG. 1 and should nowhere be construed as limiting the scope of the present disclosure.
[0074] At step 405, the method 400 includes the step of receiving the search query from the user via the UE 205. In an embodiment, the search query is inputted by the user via the UE 205 by input means including at least one of, touchscreen, keypad and voice instructions.
[0075] At step 410, the method 400 includes the step of modifying, utilizing a Machine Learning (ML) unit, characters of the search query based on one or more pre-defined rules. In an embodiment, the one or more pre-defined rules for modifying the characters of the search query include at least one of, spelling correction and synonym substitution.
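A minimal sketch of the two pre-defined rules named above — spelling correction followed by synonym substitution — might look as follows. The correction and synonym tables are invented for illustration; the disclosure's ML unit would learn or configure these mappings rather than hard-code them:

```python
# Hypothetical rule tables (illustrative only).
CORRECTIONS = {"wach": "watch", "blu": "blue"}
SYNONYMS = {"timepiece": "watch", "navy": "blue"}

def modify_query(query):
    """Apply the pre-defined rules token by token: spelling
    correction first, then synonym substitution."""
    out = []
    for token in query.lower().split():
        token = CORRECTIONS.get(token, token)  # rule 1: spelling
        token = SYNONYMS.get(token, token)     # rule 2: synonyms
        out.append(token)
    return " ".join(out)

modified = modify_query("blu timepiece")
```

Ordering matters here: correcting spelling before substituting synonyms ensures a misspelled synonym (e.g. a typo of "timepiece") could still be normalized in a fuller implementation.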
[0076] At step 415, the method 400 includes the step of extracting at least one of, one or more categories, sub-categories and attributes, from the modified characters of the search query. In an embodiment, the step of extracting at least one of, one or more categories, sub-categories and attributes from the modified characters of the search query includes the steps of comparing keywords from the search query with a catalogue of categories, sub-categories and attributes to extract at least one of, the one or more categories, sub-categories and attributes.
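The catalogue comparison of this step can be sketched as a keyword lookup. The catalogue contents below are hypothetical examples, not the disclosure's actual catalogue:

```python
# Hypothetical catalogue of supported categories and attributes.
CATALOGUE = {
    "category": {"watch", "shirt"},
    "attribute": {"blue", "red", "leather"},
}

def extract(modified_query):
    """Compare each keyword of the modified query against the
    catalogue and collect the matching categories/attributes."""
    found = {kind: [] for kind in CATALOGUE}
    for token in modified_query.split():
        for kind, vocab in CATALOGUE.items():
            if token in vocab:
                found[kind].append(token)
    return found

extracted = extract("blue watch")
```

Keywords that match no catalogue entry (e.g. stop words) are simply dropped, which is what confines the downstream search to supported categories, sub-categories and attributes.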
[0077] At step 420, the method 400 includes the step of creating, utilizing an embedding model, vector embedding of the extracted at least one of, the one or more categories, sub-categories and attributes, from the modified characters of the search query.
[0078] At step 425, the method 400 includes the step of, generating one or more metadata filters for the extracted at least one of, the one or more categories, sub-categories and attributes.
[0079] At step 430, the method 400 includes the step of, storing, the generated one or more metadata filters along with the created vector embedding in a cache data store and a vector database 120.
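Steps 420 through 430 can be sketched together as a small pipeline. The hash-based `embed` function is a deterministic toy standing in for a real embedding model, and plain dictionaries stand in for the cache data store and the vector database 120:

```python
import hashlib

def embed(text, dim=8):
    """Toy deterministic embedding: bytes of a SHA-256 digest scaled
    to [0, 1]. A real system would call an embedding model here."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255 for b in digest[:dim]]

def store(query, extracted, cache, vector_db):
    """Generate metadata filters from the non-empty extracted fields,
    create the embedding, and store both in the cache data store and
    the vector database."""
    filters = {k: v for k, v in extracted.items() if v}  # metadata filters
    entry = {"filters": filters, "embedding": embed(query)}
    cache[query] = entry       # fast path for subsequent queries
    vector_db[query] = entry   # durable store
    return entry

cache, vector_db = {}, {}
entry = store(
    "blue watch",
    {"category": ["watch"], "sub_category": [], "attribute": ["blue"]},
    cache, vector_db,
)
```

Writing to both stores at once means a subsequent identical query can be served from the cache without touching the vector database.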
[0080] At step 435, the method 400 includes the step of, in response to receiving one or more subsequent search queries from the user, displaying, by the one or more processors (105), via the UE (205), one or more relevant search results pertaining to one or more items.
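The cache-first retrieval that serves these subsequent queries (detailed in claim 5) can be sketched as follows, again with dictionaries standing in for the cache data store and the vector database 120:

```python
def lookup(keyword_key, cache, vector_db):
    """Check the cache data store first for the metadata filters and
    vector embedding; fall back to the vector database if absent."""
    if keyword_key in cache:
        return cache[keyword_key], "cache"
    return vector_db.get(keyword_key), "vector_db"

cache = {"watch": {"filters": {"attribute": ["blue"]}, "embedding": [0.1]}}
vector_db = dict(cache, shirt={"filters": {}, "embedding": [0.2]})

hit, source = lookup("watch", cache, vector_db)     # served from cache
miss, source2 = lookup("shirt", cache, vector_db)   # falls back to DB
```

The returned filters and embedding would then drive the identification of relevant results from the vector database and/or external databases before display on the UE.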
[0081] A person of ordinary skill in the art will readily ascertain that the illustrated embodiments and steps in description and drawings (FIG.1-4) are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[0082] The present disclosure incorporates a technical advancement that facilitates an AI/ML based approach extending beyond a pre-defined set of conversation flows for query resolution. The system understands natural language more accurately and in context, develops personalized user experiences, and moves away from inflexible conversation flows to allow for more dynamic and natural interactions resembling real conversations. The implementation of mechanisms to periodically update and refresh the items in the vector database ensures that users have access to the latest item offerings. These features contribute to creating a more dynamic, user-friendly, and effective conversational commerce system that meets the evolving expectations of modern online retail users.
[0083] The present invention provides various advantages, including enhanced user experience, personalization, flexibility and adaptability, improved search accuracy, and higher customer satisfaction. The conversational commerce system allows users to interact in a more natural and intuitive manner. By interpreting natural language inputs and understanding context, the system can provide personalized recommendations and suggestions tailored to each user's preferences and constraints. Its flexibility and adaptability allow it to handle complex requests, varying preferences, and evolving user needs in real time, offering a more flexible shopping experience compared to rigid navigation structures. Leveraging semantic search and vector embeddings, the system can better understand the meaning behind user queries. This allows for more accurate product matching based on user intent and context, moving away from purely keyword-driven results to more relevant product recommendations. By combining intuitive interaction, personalized recommendations, and accurate search capabilities, the conversational commerce system aims to enhance customer satisfaction.
[0084] The present invention offers multiple advantages over the prior art, and the above listed are a few examples to emphasize some of the advantageous features. The listed advantages are to be read in a non-limiting manner.

CLAIMS
We Claim:

1. A method (400) for processing search queries, the method comprises the steps of:
receiving, by one or more processors, a search query from a user via a User Equipment (UE) (205);
modifying, by the one or more processors, utilizing a Machine Learning (ML) unit, characters of the search query based on one or more pre-defined rules;
extracting, by the one or more processors (105), at least one of, one or more categories, sub-categories and attributes, from the modified characters of the search query;
creating, by the one or more processors (105), utilizing an embedding model, vector embedding of the extracted at least one of, the one or more categories, sub-categories and attributes, from the modified characters of the search query;
generating, by the one or more processors (105), one or more metadata filters for the extracted at least one of, the one or more categories, sub-categories and attributes;
storing, by the one or more processors (105), the generated one or more metadata filters along with the created vector embedding in a cache data store and a vector database (120); and
in response to receiving one or more subsequent search queries from the user, displaying, by the one or more processors (105), via the UE (205), one or more relevant search results pertaining to one or more items.

2. The method (400) as claimed in claim 1, wherein the search query is inputted by the user via the UE (205) by input means including at least one of, touchscreen, keypad and voice instructions.

3. The method (400) as claimed in claim 1, wherein the one or more pre-defined rules for modifying the characters of the search query include at least one of, spelling correction and synonym substitution.

4. The method (400) as claimed in claim 1, wherein the step of, extracting, at least one of, one or more categories, sub-categories and attributes, from the modified characters of the search query, includes the steps of:
comparing, keywords from the search query with a catalogue of categories, sub-categories and attributes to extract at least one of, the one or more categories, sub-categories and attributes.

5. The method (400) as claimed in claim 1, wherein the step of, in response to receiving one or more subsequent search queries from the user, displaying, via the UE, one or more relevant search results, includes the steps of:
identifying, by the one or more processors (105), one or more keywords from the one or more subsequent search queries;
checking, by the one or more processors (105), at the cache data store, if the one or more metadata filters and the vector embedding are present, based on the identified one or more keywords from the one or more subsequent search queries;
retrieving, by the one or more processors (105), from the cache data store, the one or more metadata filters and the vector embedding if determined present at the cache data store;
retrieving, by the one or more processors (105), from the vector database (120), the one or more metadata filters and the vector embedding, if determined not present at the cache data store; and
identifying, by the one or more processors (105), the one or more relevant search results from at least one of, the vector database (120) and one or more external databases based on retrieving the one or more metadata filters and the vector embedding.

6. The method (400) as claimed in claim 1, wherein the method further comprises the steps of:
re-ranking, by the one or more processors (105), the one or more search results in an order of relevance based on at least one of, frequency of occurrence of keywords from the one or more subsequent search queries, wherein the one or more search results are placed prior in the order which have a corresponding frequency of occurrence of keywords higher than the rest of the keywords for the given user.

7. The method (400) as claimed in claim 6, wherein the one or more search results are re-ranked by applying a weightage value for each of the one or more search results.

8. The method (400) as claimed in claim 1, wherein the method further comprises the step of:
providing, by the one or more processors (105), a conversational chat medium for the user via the UE to communicate with respect to search queries, feedback and complaints.

9. The method (400) as claimed in claim 1, wherein the step of, receiving, a search query from a user via a User Equipment (UE) (205), includes the steps of:
receiving, by the one or more processors (105), at least a portion of the search query from the user via the UE (205);
prompting, by the one or more processors (105), the user via the UE (205) to receive one or more portions of the search query from the user; and
receiving, by the one or more processors (105), the one or more portions of the search query based on prompting, thereby receiving the complete search query from the user.

10. The method (400) as claimed in claim 9, wherein prompting includes at least one of, requesting the user to answer questions based on the received at least the portion of the search query.

11. The method (400) as claimed in claim 1, wherein the method further includes the steps of:
periodically monitoring, by the one or more processors (105), one or more external databases to locate the one or more items which are not present in the vector database (120);
retrieving, by the one or more processors (105), the one or more items located from the one or more external databases;
converting, by the one or more processors (105), the format of the retrieved one or more items into an acceptable format; and
periodically updating, by the one or more processors (105), the vector database (120) with the located one or more items.

12. A User Equipment (UE) (205) comprising:
one or more primary processors coupled with one or more memory units, wherein said one or more memory units store instructions which when executed by the one or more primary processors causes the UE (205) to:
transmit, a search query and one or more subsequent search queries to the one or more processors, wherein the one or more processors is further configured to perform the method as claimed in claim 1.

13. A system (100) for processing search queries, the system comprising:
a transceiver (130), configured to, receive, a search query from a user via a User Equipment (UE) (205);
a modifying unit (145), configured to, modify, utilizing a Machine Learning (ML) unit, characters of the search query based on one or more pre-defined rules;
an extraction unit (150), configured to, extract, at least one of, one or more categories, sub-categories and attributes, from the modified characters of the search query;
a creating unit (155), configured to, create, utilizing an embedding model, vector embedding of the extracted at least one of, the one or more categories, sub-categories and attributes, from the modified characters of the search query;
a generating unit (160), configured to, generate, one or more metadata filters for the extracted at least one of, the one or more categories, sub-categories and attributes;
a storing unit (165), configured to, store, the generated one or more metadata filters along with the created vector embedding in a cache data store and a vector database (120); and
in response to receiving one or more subsequent search queries from the user, a display module (198), configured to, display, via the UE (205), one or more relevant search results pertaining to one or more items.

Documents

Application Documents

# Name Date
1 202421058704-STATEMENT OF UNDERTAKING (FORM 3) [02-08-2024(online)].pdf 2024-08-02
2 202421058704-FORM 1 [02-08-2024(online)].pdf 2024-08-02
3 202421058704-FIGURE OF ABSTRACT [02-08-2024(online)].pdf 2024-08-02
4 202421058704-DRAWINGS [02-08-2024(online)].pdf 2024-08-02
5 202421058704-DECLARATION OF INVENTORSHIP (FORM 5) [02-08-2024(online)].pdf 2024-08-02
6 202421058704-COMPLETE SPECIFICATION [02-08-2024(online)].pdf 2024-08-02
7 Abstract.1.jpg 2024-08-16
8 202421058704-FORM-26 [29-08-2024(online)].pdf 2024-08-29
9 202421058704-Proof of Right [01-10-2024(online)].pdf 2024-10-01
10 202421058704-FORM 18 [21-10-2024(online)].pdf 2024-10-21
11 202421058704-FORM-9 [09-04-2025(online)].pdf 2025-04-09