
Query Response System

Abstract: A query response system (100). Further, the query response system (100) comprises: a user interface (102) configured to enable a user to provide an input query; one or more databases (104) containing a plurality of container modules, wherein each of the container modules is configured to store resources and data associated with different queries; and at least one processor (108) communicatively coupled with the user interface (102) and the one or more databases (104). Further, the at least one processor (108) is configured to receive the input query, perform a vector similarity search by comparing the input query with the data and resources stored within at least one container module of the plurality of container modules, generate a response associated with the input query provided by the user, based at least on the comparison, and direct the user interface (102) to provide the generated response.


Patent Information

Application # 202411092659
Filing Date
27 November 2024
Publication Number
1/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

UPES
ENERGY ACRES, UPES, BIDHOLI, VIA, PREM NAGAR, UTTARAKHAND 248007

Inventors

1. SANJEEV KUMAR
UPES, ENERGY ACRES, BIDHOLI, VIA, PREM NAGAR, UTTARAKHAND 248007
2. KONAL PURI
UPES, ENERGY ACRES, BIDHOLI, VIA, PREM NAGAR, UTTARAKHAND 248007
3. ABHISHEK CHAMOLI
UPES, ENERGY ACRES, BIDHOLI, VIA, PREM NAGAR, UTTARAKHAND 248007

Specification

Description: QUERY RESPONSE SYSTEM
FIELD OF THE DISCLOSURE
[0001] This invention generally relates to a query response system, and in particular, to a system that facilitates efficient information retrieval and response generation. Moreover, the proposed invention pertains to an improved method for utilizing vector similarity search techniques to enhance the accuracy and relevance of responses to user queries.
BACKGROUND
[0002] The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also correspond to implementations of the claimed technology.
[0003] The increasing complexity of chatbot interactions has necessitated the development of systems that may handle high volumes of concurrent user requests efficiently and at a low cost. Accurate and timely response generation is critical for maintaining user engagement and satisfaction in various applications, including customer service, information retrieval, and virtual assistance.
[0004] Traditional chatbot architectures often face issues such as high response times, inefficient caching, and limited scalability. These limitations become more pronounced when chatbots rely on external AI models and large-scale similarity search mechanisms, which introduce additional latency and bottlenecks. For instance, conventional methods involve querying databases directly or using basic keyword matching, which often result in slow response times and irrelevant answers.
[0005] Existing solutions typically fail to manage high-dimensional vector data in caching mechanisms effectively and do not provide a reliable method for balancing loads across multiple containers. This often leads to service degradation and delayed responses, particularly when AI-based responses need to be generated dynamically. Additionally, the inefficiency in handling large volumes of data and the inability to scale seamlessly further exacerbate the problem.
[0006] Patent “US7849053B2,” titled “Method and system for providing enhanced responses to user queries,” discloses a method for providing responses to user queries by retrieving information from a knowledge base. However, the method described does not address the efficient management of high-dimensional vector data or the integration of caching systems with vector similarity searches, leading to potential performance bottlenecks.
[0007] Another patent application, “WO2018146392A1,” titled “Query response system using AI techniques,” discusses the use of artificial intelligence to generate responses to user queries. While it suggests using AI for improved relevance and accuracy, it does not solve the issues of load balancing and efficient real-time processing, resulting in potential delays and inefficiencies.
[0008] Furthermore, traditional systems lack effective mechanisms for integrating caching systems and vector similarity search, leading to significant overhead during real-time processing. This is particularly problematic when dealing with dynamic and high-dimensional data, which requires sophisticated handling to ensure prompt and accurate responses.
[0009] Therefore, there is a need for an improved method and device for query response systems that integrates semantic vector caching, load balancing, and real-time AI response generation. Such a system should minimize latency, enhance scalability, and provide reliable and accurate responses to user queries efficiently. This invention addresses these issues by introducing a scalable and distributed chatbot system that offers superior performance and seamless integration of advanced technologies.
OBJECTIVES OF THE INVENTION
[0010] The objective of the invention is to provide a query response system capable of handling large volumes of requests efficiently, ensuring consistent performance even under high demand.
[0011] Furthermore, the objective of the invention is to minimize latency in query responses by implementing semantic caching, enabling frequently asked queries to be processed quickly.
[0012] Furthermore, the objective of the present invention is to leverage distributed processing to optimize resource usage and ensure effective load distribution, enhancing overall system performance.
[0013] Furthermore, the objective of the present invention is to enable dynamic response generation to provide personalized and engaging interactions for users.
[0014] Furthermore, the objective of the present invention is to manage cache efficiently using an eviction policy that optimizes memory usage by automatically removing old or unused data.
[0015] Furthermore, the objective of the present invention is to provide context-aware responses by leveraging vector search capabilities, ensuring that responses are relevant and accurately informed.
[0016] Furthermore, the objective of the present invention is to offer a cost-effective solution that significantly reduces operational costs compared to existing methods, providing substantial savings for handling high volumes of user queries.
SUMMARY
[0017] The present invention relates to a query response system.
[0018] According to an aspect of the present embodiments, a query response system is disclosed. The query response system comprises: a user interface configured to enable a user to provide an input query; one or more databases containing a plurality of container modules, wherein each of the container modules is configured to store resources and data associated with different queries; and at least one processor communicatively coupled with the user interface and the one or more databases. Further, the at least one processor is configured to receive the input query, perform a vector similarity search by comparing the input query with the data and resources stored within at least one container module of the plurality of container modules, generate a response associated with the input query provided by the user, based at least on the comparison, and direct the user interface to provide the generated response.
[0019] According to another aspect, the present embodiments further disclose that the user interface corresponds to at least one messaging platform. Furthermore, a communication medium is configured to transfer the input query from the user interface to the at least one processor. Further, each of the container modules facilitates the at least one processor to convert the input query into an embedding to perform comparison of the input query with the resources and data. Moreover, in one instance, when the input query matches the resources and data, the at least one processor is configured to generate the response associated with the input query provided by the user, and direct the user interface to provide the generated response. Further, in another instance, when the input query does not match the resources and data, the at least one processor is configured to perform another vector similarity search by comparing the input query with the data and resources stored within another container module of the plurality of container modules.
[0020] According to another aspect, the present embodiments further disclose at least one artificial intelligence (AI) module that facilitates the at least one processor to retrieve data associated with the input query and generate the response. Further, the at least one processor is configured to log the generated response in a database of the one or more databases for future reference. Further, the input query comprises at least one of informational queries, data retrieval requests, media searches, and task execution commands.
[0021] According to another main aspect, the present embodiments further disclose a method for operating the query response system. The method comprises: providing, via a user interface, an input query; containing, via one or more databases, a plurality of container modules, wherein each of the container modules is configured to store resources and data associated with different queries; receiving, via at least one processor communicatively coupled with the user interface and the one or more databases, the input query; performing, via the at least one processor, a vector similarity search by comparing the input query with the data and resources stored within at least one container module of the plurality of container modules; generating, via the at least one processor, a response associated with the input query provided by the user, based at least on the comparison; and directing, via the at least one processor, the user interface to provide the generated response.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The accompanying drawings illustrate various embodiments of systems, methods, and embodiments of various other aspects of the disclosure. Any person with ordinary skill in the art will appreciate that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one example of the boundaries. It may be that in some examples one element may be designed as multiple elements or that multiple elements may be designed as one element. In some examples, an element shown as an internal component of one element may be implemented as an external component in another, and vice versa. Furthermore, elements may not be drawn to scale. Non-limiting and non-exhaustive descriptions are described with reference to the following drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating principles.
[0023] FIG. 1 illustrates a block diagram of a query response system, according to an embodiment of the present invention; and
[0024] FIGS. 2-4 illustrate flow charts of a method for operating the query response system, according to an embodiment of the present invention.
DETAILED DESCRIPTION
[0025] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words “comprising,” “having,” “containing,” and “including,” and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[0026] Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, the preferred systems and methods are now described. Embodiments of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings, in which like numerals represent like elements throughout the several figures, and in which example embodiments are shown. Embodiments of the claims may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. The examples set forth herein are non-limiting examples and are merely examples among other possible examples.
[0027] The present invention discloses a query response system, and in particular, a system that facilitates efficient information retrieval and response generation. Moreover, the proposed invention is capable of handling large volumes of requests efficiently, ensuring consistent performance even under high demand.
[0028] FIG. 1 illustrates a block diagram of a query response system (100), according to an embodiment of the present invention.
[0029] In some embodiments, the query response system (100) comprises a user interface (102), one or more databases (104), a communication medium (106), and at least one processor (108). The user interface (102) is installed within a computing unit and corresponds to a mobile application. The user interface (102) comprises a graphical interface having a plurality of dynamic components, such as virtual buttons, lists, images, icons, and interactive menus. Further, the computing unit corresponds to a mobile phone, a touch-enabled display, a tablet, or any other portable computing device capable of running the user interface (102). In some embodiments, the user interface (102) is specifically designed to adapt to various screen sizes and orientations, such as those of smartphones, tablets, and touch-enabled displays.
[0030] In an example embodiment, the user interface (102) is capable of displaying text in various languages based on user preferences and permissions. Additionally, the user interface (102) incorporates various secure login mechanisms, such as biometric authentication (fingerprint and facial recognition) and two-factor authentication, to protect user data and privacy. In some embodiments, the user interface (102) corresponds to at least one messaging platform.
[0031] In some embodiments, the user interface (102) is configured to enable the user to provide an input query through various methods. Further, the user interface (102) enables the user to provide the input query by offering multiple input modalities, including textual input via a keyboard, vocal input through speech recognition, and touch gestures. Further, the input query is entered in natural language, allowing the user to communicate with the system (100) in a conversational manner.
[0032] In some embodiments, the input query corresponds to various types of requests, including informational queries, data retrieval requests, media searches, and task execution commands. Further, the user interface (102) supports context-aware input, whereby the user is capable of refining the input query based at least on previous interactions.
[0033] Moreover, the system (100) may comprise the one or more databases (104). Further, the one or more databases (104) correspond to a FastAPI-based architecture. In some embodiments, each of the one or more databases (104) contains a plurality of container modules. Further, each container module is configured to store resources and data associated with different queries, facilitating efficient data retrieval and management. In some embodiments, each container module is designed to store a specific type of data, such as text, images, or structured data, optimizing the storage and retrieval processes based on the type of input query.
[0034] In some embodiments, each of the one or more databases (104) is designed to support high-availability and fault-tolerance features, employing replication and sharding mechanisms to distribute data across multiple nodes. In some embodiments, the input queries are distributed across the plurality of container modules through at least one orchestration engine (e.g., Kubernetes or the like).
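By way of a non-limiting illustration, the distribution of input queries across the plurality of container modules may be sketched as a deterministic hash-based routing step. The function name, the use of SHA-256, and the container counts below are illustrative assumptions, not details of the specification:

```python
import hashlib

def route_query(query: str, num_containers: int) -> int:
    """Pick a container module index for a query by hashing its text.

    Deterministic routing: the same query always lands on the same
    container module, which keeps per-container caches warm. SHA-256
    is an illustrative choice of hash function.
    """
    digest = hashlib.sha256(query.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_containers

# The same query is always routed to the same container module.
assert route_query("what is the filing date?", 4) == route_query("what is the filing date?", 4)
```

In a deployed system, an orchestration engine such as Kubernetes would typically perform this distribution via its own load-balancing primitives; the sketch only conveys the deterministic mapping.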
[0035] Furthermore, the system (100) comprises a communication medium (106). Further, the communication medium (106) corresponds to a Webhook mechanism. Further, the communication medium (106) is configured to transfer the input query provided by the user through the user interface (102) to the at least one processor (108). Further, the communication medium (106) comprises a lightweight, event-driven API that facilitates real-time communication between different components of the system (100). Further, the communication medium (106) is configured to operate by sending an HTTP POST request from the user interface (102) to the at least one processor (108) or backend system whenever a new input query is provided by the user.
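As a minimal sketch of the Webhook handoff described above, the backend may parse the HTTP POST body carrying a new input query. The payload shape (`{"user_id": ..., "query": ...}`) and the function name are assumptions for illustration only; the specification says only that an HTTP POST request is sent whenever a new input query is provided:

```python
import json

def handle_webhook(raw_body: bytes) -> dict:
    """Parse an incoming webhook POST body carrying a user query.

    Validates that a non-empty 'query' field is present and returns a
    normalized event dict for the downstream processor.
    """
    payload = json.loads(raw_body.decode("utf-8"))
    if "query" not in payload or not payload["query"].strip():
        raise ValueError("webhook payload must contain a non-empty 'query'")
    return {"user_id": payload.get("user_id"), "query": payload["query"].strip()}

event = handle_webhook(b'{"user_id": "u1", "query": " filing date? "}')
# event == {"user_id": "u1", "query": "filing date?"}
```

In a FastAPI-based backend, this logic would sit inside a POST route handler; the plain function form is used here so the sketch stays self-contained.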
[0036] Moreover, the system (100) comprises the at least one processor (108). In one embodiment, the at least one processor (108) may be communicatively coupled to the user interface (102) through the communication medium (106). The at least one processor (108) may include suitable logic, input/output circuitry, and communication circuitry that are operable to execute one or more instructions stored in a memory to perform predetermined operations. In one embodiment, the at least one processor (108) may be configured to decode and execute any instructions received from one or more other electronic devices or server(s). The at least one processor (108) may be configured to execute one or more computer-readable program instructions, such as program instructions to carry out any of the functions described in this description. Further, the at least one processor (108) may be implemented using one or more processor technologies known in the art. Examples of the at least one processor (108) include, but are not limited to, one or more general purpose processors and/or one or more special purpose processors, such as an NGINX Ingress controller.
[0037] In one embodiment, the memory may be configured to store a set of instructions and data executed by the at least one processor (108). Further, the memory may include the one or more instructions that are executable by the at least one processor (108) to perform specific operations.
[0038] In some embodiments, the at least one processor (108) is configured to receive the input query through the communication medium (106). In some embodiments, the at least one processor (108) is configured to receive and parse the incoming input query data. Further, upon receiving the input query, the at least one processor (108) is configured to perform a vector similarity search by comparing the input query with the data and resources stored within at least one container module of the plurality of container modules. The vector similarity search is implemented using an embedding-based approach, where the input query is first converted into a vector representation using a pre-trained language model or specialized embedding approach.
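The embedding-based comparison above may be sketched as follows, with a toy bag-of-words stand-in for the pre-trained embedding model (a deployed system would use learned dense vectors). All names and the sample data are illustrative assumptions:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a pre-trained embedding model: bag-of-words
    # term counts. A real system would produce a dense learned vector.
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    # Standard cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def most_similar(query: str, stored: dict) -> str:
    # Compare the query embedding against each stored resource and
    # return the key of the closest match.
    q = embed(query)
    return max(stored, key=lambda k: cosine_similarity(q, embed(stored[k])))

stored = {"filing": "what is the filing date", "status": "current application status"}
assert most_similar("filing date of the application", stored) == "filing"
```

The linear scan over `stored` conveys the comparison step only; an indexed container module would replace it with an approximate-nearest-neighbour lookup.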
[0039] Each container module of the plurality of container modules contains resources, such as datasets, pre-processed query vectors, or domain-specific knowledge. Further, the plurality of container modules are indexed, such that the at least one processor (108) determines similar resources and data corresponding to the input query.
[0040] The system (100) also incorporates a querying cache approach. Further, the querying cache approach corresponds to a high-speed in-memory data store. The querying cache approach enables the at least one processor (108) to quickly access frequently accessed data and pre-processed query vectors. Further, the querying cache approach ensures reduced latency and improved response times.
[0041] For example, when an input query is received, the at least one processor (108) first checks the cache from a cache source to determine whether any historical records are similar to the input query. Further, if a match is found in the cache source, the at least one processor (108) is configured to retrieve the corresponding response directly from the cache source, thereby reducing the time required to generate a response.
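A cache-first flow with a least-recently-used eviction policy (as contemplated by objective [0014]) may be sketched as follows. The capacity value, class name, and LRU policy are illustrative assumptions; the specification asks only for an eviction policy that removes old or unused data:

```python
from collections import OrderedDict

class QueryCache:
    """In-memory response cache with least-recently-used eviction."""

    def __init__(self, capacity: int = 128):
        self.capacity = capacity
        self._store = OrderedDict()  # query -> response, ordered by recency

    def get(self, query):
        if query not in self._store:
            return None
        self._store.move_to_end(query)  # mark as recently used
        return self._store[query]

    def put(self, query, response):
        self._store[query] = response
        self._store.move_to_end(query)
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict least recently used

def answer(query, cache, slow_search):
    # Cache-first flow: return a cached response when one exists,
    # otherwise fall back to the (slower) vector similarity search
    # and remember its result for next time.
    cached = cache.get(query)
    if cached is not None:
        return cached
    response = slow_search(query)
    cache.put(query, response)
    return response
```

A production deployment would more likely use an external in-memory store (e.g., Redis with an LRU eviction policy) so that the cache survives processor restarts and is shared across containers.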
[0042] Moreover, in one instance, when the input query matches the resources and data, either from the cache source or the indexed container modules, the at least one processor (108) is configured to generate the response associated with the input query provided by the user and direct the user interface (102) to provide the generated response. Further, the system (100) comprises at least one artificial intelligence (AI) module (110). Further, the at least one AI module (110) facilitates the at least one processor (108) to retrieve data associated with the input query and generate the response. Further, the at least one AI module (110) comprises advanced algorithms and models to interpret the query contextually, enhance the accuracy of the retrieved information, and refine the response to ensure the input query is resolved. Further, the at least one processor (108) is configured to direct the user interface (102) to provide the generated response.
[0043] Moreover, in another instance, when the input query does not match the resources and data, the at least one processor (108) is configured to perform another vector similarity search by comparing the input query with the data and resources stored within another container module of the plurality of container modules. Further, if the input query matches the resources and data of the other container module, the at least one processor (108) is configured to generate the response. Further, if the input query does not match the resources and data of the other container module, the at least one AI module (110) is configured to consult an auxiliary database (e.g., Pinecone) to retrieve a similar response associated with the input query. Thereafter, the at least one processor (108) directs the user interface (102) to present the generated response.
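The match/mismatch cascade across container modules, with a final fallback to an auxiliary store, may be sketched as follows. The similarity threshold of 0.8, the dict-of-dicts stand-in for container modules, and the callable standing in for a Pinecone lookup are all illustrative assumptions, not values from the specification:

```python
def resolve(query, containers, similarity, threshold=0.8, auxiliary=None):
    """Search container modules in turn; fall back to an auxiliary store.

    `containers` is a list of {stored_text: response} dicts standing in
    for the container modules, and `similarity(a, b)` scores two texts.
    """
    for container in containers:
        # Best-scoring stored text in this container module, if any.
        best = max(container, key=lambda t: similarity(query, t), default=None)
        if best is not None and similarity(query, best) >= threshold:
            return container[best]      # match found in this container
    if auxiliary is not None:
        return auxiliary(query)         # e.g., an auxiliary Pinecone lookup
    return None
```

Each container module is tried in order; only when every one falls below the threshold is the auxiliary store consulted, mirroring the two-stage mismatch handling described above.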
[0044] FIGS. 2-4 illustrate flow charts of a method (200) for operating the query response system (100), according to an embodiment of the present invention.
[0045] In some embodiments, the method (200) comprises several steps for operating the query response system (100). At a step, the user interface (102) is configured to enable the user to provide the input query. In some embodiments, the input query corresponds to various types of requests, including informational queries, data retrieval requests, media searches, and task execution commands. In some embodiments, the user interface (102) is installed within the computing unit. Further, the computing unit corresponds to a mobile phone, a touch-enabled display, a tablet, or any other portable computing device capable of running the user interface (102).
[0046] At another step, the one or more databases (104) are configured to contain the plurality of container modules. Further, each container module is configured to store resources and data associated with different queries. In some embodiments, each container module is designed to store a specific type of data, such as text, images, or structured data, optimizing the storage and retrieval processes based on the type of input query. In some embodiments, the input queries are distributed across the plurality of container modules through at least one orchestration engine (e.g., Kubernetes or the like).
[0047] At another step, the at least one processor (108) is communicatively coupled with the user interface (102) and the one or more databases (104). Further, the at least one processor (108) is configured to receive the input query. In some embodiments, the at least one processor (108) is configured to receive and parse the incoming input query data.
[0048] At another step, the at least one processor (108) is configured to perform the vector similarity search by comparing the input query with the data and resources stored within at least one container module of the plurality of container modules. The vector similarity search is implemented using an embedding-based approach, where the input query is first converted into a vector representation using a pre-trained language model or specialized embedding approach. Further, the plurality of container modules are indexed, such that the at least one processor (108) determines similar resources and data corresponding to the input query.
[0049] At another step, the at least one processor (108) is configured to generate the response associated with the input query provided by the user, based at least on the comparison. In one instance, when the input query matches the resources and data, either from the cache source or the indexed container modules, the at least one processor (108) is configured to generate the response associated with the input query provided by the user. At another step, the at least one processor (108) is configured to direct the user interface (102) to provide the generated response.
[0050] It should be noted that the query response system (100) could in any case undergo numerous modifications and variants, all of which are covered by the same innovative concept; moreover, all of the details can be replaced by technically equivalent elements. In practice, the components used, as well as the numbers, shapes, and sizes of the components, may vary according to the technical requirements. The scope of protection of the invention is therefore defined by the attached claims.
Dated this 27th day of November, 2024
Ishita Rustagi (IN-PA/4097)
Agent for Applicant

Claims:

CLAIMS
We Claim:
1. A query response system (100) comprising:
a user interface (102) configured to enable a user to provide an input query;
one or more databases (104) containing a plurality of container modules, wherein each of the container modules is configured to store resources and data associated with different queries; and
at least one processor (108) communicatively coupled with the user interface (102) and the one or more databases (104), wherein the at least one processor (108) is configured to:
receive the input query,
perform a vector similarity search by comparing the input query with the data and resources stored within at least one container module of the plurality of container modules,
generate a response associated with the input query provided by the user, based at least on the comparison, and
direct the user interface (102) to provide the generated response.
2. The system (100) as claimed in claim 1, wherein the user interface (102) corresponds to at least one messaging platform.
3. The system (100) as claimed in claim 1 further comprising a communication medium (106) configured to transfer the input query from the user interface (102) to the at least one processor (108).
4. The system (100) as claimed in claim 1, wherein each of the container modules facilitates the at least one processor (108) to convert the input query into an embedding to perform comparison of the input query with the resources and data.
5. The system (100) as claimed in claim 4, wherein in one instance, when the input query matches the resources and data, then the at least one processor (108) is configured to generate the response associated with the input query provided by the user, and direct the user interface (102) to provide the generated response.
6. The system (100) as claimed in claim 4, wherein in another instance, when the input query does not match the resources and data, then the at least one processor (108) is configured to perform another vector similarity search by comparing the input query with the data and resources stored within another container module of the plurality of container modules.
7. The system (100) as claimed in claim 1, further comprising at least one artificial intelligence (AI) module (110) that facilitates the at least one processor (108) to retrieve data associated with the input query and generate the response.
8. The system (100) as claimed in claim 1, wherein the at least one processor (108) is configured to log the generated response in a database of the one or more databases (104) for future reference.
9. The system (100) as claimed in claim 1, wherein the input query comprises at least one of informational queries, data retrieval requests, media searches, and task execution commands.
10. A method (200) for operating the query response system (100), the method (200) comprising:
providing, via a user interface (102), an input query;
containing, via one or more databases (104), a plurality of container modules, wherein each of the container modules is configured to store resources and data associated with different queries;
receiving, via at least one processor (108) communicatively coupled with the user interface (102) and the one or more databases (104), the input query;
performing, via the at least one processor (108), a vector similarity search by comparing the input query with the data and resources stored within at least one container module of the plurality of container modules;
generating, via the at least one processor (108), a response associated with the input query provided by the user, based at least on the comparison; and
directing, via the at least one processor (108), the user interface (102) to provide the generated response.
Dated this 27th day of November, 2024
Ishita Rustagi (IN-PA/4097)
Agent for Applicant

Documents

Application Documents

# Name Date
1 202411092659-STATEMENT OF UNDERTAKING (FORM 3) [27-11-2024(online)].pdf 2024-11-27
2 202411092659-REQUEST FOR EXAMINATION (FORM-18) [27-11-2024(online)].pdf 2024-11-27
3 202411092659-REQUEST FOR EARLY PUBLICATION(FORM-9) [27-11-2024(online)].pdf 2024-11-27
4 202411092659-PROOF OF RIGHT [27-11-2024(online)].pdf 2024-11-27
5 202411092659-POWER OF AUTHORITY [27-11-2024(online)].pdf 2024-11-27
6 202411092659-FORM-9 [27-11-2024(online)].pdf 2024-11-27
7 202411092659-FORM-8 [27-11-2024(online)].pdf 2024-11-27
8 202411092659-FORM 18 [27-11-2024(online)].pdf 2024-11-27
9 202411092659-FORM 1 [27-11-2024(online)].pdf 2024-11-27
10 202411092659-FIGURE OF ABSTRACT [27-11-2024(online)].pdf 2024-11-27
11 202411092659-DRAWINGS [27-11-2024(online)].pdf 2024-11-27
12 202411092659-DECLARATION OF INVENTORSHIP (FORM 5) [27-11-2024(online)].pdf 2024-11-27
13 202411092659-COMPLETE SPECIFICATION [27-11-2024(online)].pdf 2024-11-27