
System And Method For Translating One Or More Queries

Abstract: System and method for translating one or more queries are described. In one example, the method may include receiving a query from a user associated with a computing device. The query may include a plurality of phrases in a primary language and a secondary language. Further, the query may be directed towards a purchase of a product by the user on a digital platform. The method may further include training a query translation model with a corpus data and a brand name list based on the primary language and the secondary language, and analyzing, using the trained query translation model, the plurality of phrases in the primary language and the secondary language. The method may furthermore include, in response to the analyzing, generating a selectively translated query in the primary language while retaining a brand name and a semantic representation of the query in the primary language.


Patent Information

Application #
Filing Date
30 March 2023
Publication Number
40/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

Flipkart Internet Private Limited
Building Alyssa Begonia & Clover, Embassy Tech Village, Outer Ring Road, Devarabeesanahalli Village, Bengaluru - 560103, Karnataka, India.

Inventors

1. KULKARNI, Mandar
Flipkart Internet Private Limited, Building Alyssa Begonia & Clover, Embassy Tech Village, Outer Ring Road, Devarabeesanahalli Village, Bengaluru – 560103, Karnataka, India.
2. KUMAR, Shubham
Flipkart Internet Private Limited, Building Alyssa Begonia & Clover, Embassy Tech Village, Outer Ring Road, Devarabeesanahalli Village, Bengaluru – 560103, Karnataka, India.
3. GARERA, Nikesh
Flipkart Internet Private Limited, Building Alyssa Begonia & Clover, Embassy Tech Village, Outer Ring Road, Devarabeesanahalli Village, Bengaluru – 560103, Karnataka, India.

Specification

Description:

TECHNICAL FIELD
[0001] The present disclosure, in general, relates to query translation models, and in particular, relates to approaches for translating one or more queries with phrases in multiple languages while retaining a certain portion of the query.

BACKGROUND
[0002] The following description of the related art is intended to provide background information pertaining to the field of the disclosure. This section may include certain aspects of the art that may be related to various features of the present disclosure. However, it should be appreciated that this section should be used only to enhance the understanding of the reader with respect to the present disclosure, and not as an admission of prior art.
[0003] With technological advancement and the increase in urbanization, various individuals may increasingly use electronic marketplaces to shop for various items. Further, the paucity of time to visit physical department stores, the convenience and ease of accessing electronic marketplaces, the availability of a large range of products across electronic marketplaces, etc., may also cause customers to transition towards electronic marketplaces. Electronic marketplaces, for example, may refer to e-commerce websites, e-commerce mobile applications, electronic marketplaces of departmental stores, etc.
[0004] Generally, an electronic marketplace may be designed based on a specific language. In most cases, this language may be English. The marketplace may operate on and support such specific language only. For example, to access any product on the electronic marketplace, the user may need to search for the product using terms in the marketplace-specific language, ‘English’ in most cases.
[0005] However, owing to the diversity of individuals using the electronic marketplace, it might be the case that the individual searching for any product on the electronic marketplace may use a word in another language. Generally, such other language may be a local or regional language used by the individual. In such cases, the individual may search for the product on the marketplace using a combination of words from the marketplace-specific language and the local language. Generally, the search word may belong to the local language but be written in the script of the marketplace-specific language.
[0006] For example, in the case of the local language being ‘Hindi,’ the user may intend to search for ‘cheap tea’ on the marketplace. In such cases, the user may enter ‘sasti tea’ on the electronic marketplace. The term ‘sasti’ is the Hindi word for ‘cheap,’ written in the English (Latin) script.
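The selective behaviour at issue can be illustrated with a toy, dictionary-based sketch. The word lists and brand names below are invented for illustration only and are unrelated to the trained model described later in this disclosure:

```python
# Toy sketch of selective translation: transliterated Hindi terms
# are mapped to English while brand names are passed through.
# Both word lists are hypothetical examples, not disclosed data.
HINDI_TO_ENGLISH = {"sasti": "cheap", "laal": "red"}
BRAND_NAMES = {"lakme", "bata"}  # hypothetical brand-name list

def selective_translate(query: str) -> str:
    out = []
    for token in query.lower().split():
        if token in BRAND_NAMES:
            out.append(token)  # preserve brand names verbatim
        else:
            out.append(HINDI_TO_ENGLISH.get(token, token))
    return " ".join(out)

print(selective_translate("sasti tea"))            # cheap tea
print(selective_translate("lakme laal lipstick"))  # lakme red lipstick
```

A trained model generalizes far beyond such a lookup; the sketch only shows the intended input/output contract: translate transliterated local-language terms while passing brand names and already-English terms through unchanged.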
[0007] Conventionally, to address such issues and to search for products on the marketplace based on a multi-language query, translation may be used. When the user uses a combination of words from multiple languages as an input to the search query, a language detection module may detect the language of each word of the query. The words are then translated to the language supported by the electronic marketplace, and the search for products on the marketplace is then performed.
[0008] However, such existing approaches to addressing the multi-language query may be inefficient. For example, it may be the case that the products which a user may be searching for on the marketplace include brand names as well. Such brand names may be in any one of, or a combination of, multiple languages. Therefore, the user, while searching for such a product, may use the brand name as well as multi-language keywords. As a result, while translating and processing such queries, it may be required to translate the product names while preserving the brand names in order to implement the product search on the marketplace efficiently.
[0009] The existing translation models and language detection models may be unable to reliably distinguish between product words and brand names. As a result, wrong products may be displayed and the user may be unable to find suitable products on the electronic marketplace.
[0010] Therefore, there exists a need for approaches whereby queries may be efficiently translated while preserving certain terms, such as brand names, that need to be excluded from translation.
SUMMARY
[0011] This section is provided to introduce certain objects and aspects of the present invention in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter.
[0012] Aspects of the present disclosure relate to query translation models. In particular, the present disclosure provides approaches for translating one or more queries with phrases in multiple languages while retaining a certain portion of the query.
[0013] An embodiment of the present disclosure pertains to a system for translating one or more queries. The system may include one or more processors operatively coupled with a digital platform. The one or more processors may be coupled with a memory, and the memory may store instructions. The instructions when executed by the one or more processors may cause the system to receive a query from a user. The user may be associated with a computing device, which in turn may be operably coupled to the digital platform via a network. Further, the query may include a plurality of phrases in a primary language and a secondary language. Furthermore, the query may be directed towards a purchase of a product by the user on the digital platform. Thereafter, a query translation model may be trained with a corpus data and a brand name list based on the primary language and the secondary language. Using the trained query translation model, the plurality of phrases in the primary language and the secondary language may then be analysed. Thereafter, in response to the analysis, a selectively translated query may be generated in the primary language while retaining a brand name and a semantic representation of the query in the primary language.
[0014] In one aspect, the one or more processors may be configured to utilize the brand name list to compute a brand loss and a scaling factor associated with the query translation model.
[0015] In another aspect, the one or more processors may be configured to utilize any or a combination of a supervised loss and a data augmentation loss with the computed brand loss and the scaling factor to train the query translation model.
[0016] In yet another aspect, the brand loss may be a cross entropy loss based on a plurality of brand names associated with the digital platform.
[0017] In yet another aspect, the semantic representation of the query may include description of the query using the plurality of phrases in the primary language.
[0018] In yet another aspect, the one or more processors may be configured to utilize a plurality of transliterations and translations associated with the secondary language to generate the corpus data required to train the query translation model.
[0019] Another embodiment of the present disclosure pertains to a method for translating one or more queries. The method may include receiving, by a system, a query from a user. The user may be associated with a computing device. Further, the query may include a plurality of phrases in a primary language and a secondary language. Furthermore, the query may be directed towards a purchase of a product by the user on the digital platform. The method may further include training, by the system, a query translation model with a corpus data and a brand name list based on the primary language and the secondary language. Thereafter, the method may include analyzing, by the system, using the trained query translation model, the plurality of phrases in the primary language and the secondary language. Thereafter, in response to the analysis, the method may include generating, by the system, a selectively translated query in the primary language while retaining a brand name and a semantic representation of the query in the primary language.
[0020] In one aspect, the method may include utilizing, by the system, the brand name list to compute a brand loss and a scaling factor associated with the query translation model.
[0021] In another aspect, the method may include utilizing, by the system, any or a combination of a supervised loss and a data augmentation loss with the computed brand loss and the scaling factor to train the query translation model.
[0022] In yet another aspect, the semantic representation of the query may include description of the query using the plurality of phrases in the primary language.
[0023] In yet another aspect, the brand loss may be a cross entropy loss based on a plurality of brand names associated with the digital platform.
[0024] In yet another aspect, the method may include utilizing, by the system, a plurality of transliterations and translations associated with the secondary language to generate the corpus data required to train the query translation model.
[0025] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWINGS
[0026] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0027] FIG. 1 illustrates an exemplary network environment 100 with a system for translating one or more queries, in accordance with an embodiment of the present disclosure;
[0028] FIG. 2 illustrates an exemplary block diagram 200 representing functional units of the system, in accordance with an embodiment of the present disclosure;
[0029] FIG. 3 illustrates an exemplary flow diagram representing steps of a method 300 for translating one or more queries, in accordance with an embodiment of the present disclosure; and
[0030] FIG. 4 illustrates an exemplary computer system 400 in which or with which embodiments of the present disclosure may be utilized, in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[0031] In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address all of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.
[0032] The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth.
[0033] Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
[0034] Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
[0035] The word “exemplary” and/or “demonstrative” is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” and/or “demonstrative” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive—in a manner similar to the term “comprising” as an open transition word—without precluding any additional or other elements.
[0036] As used herein, “connect”, “configure”, “couple” and their cognate terms, such as “connects”, “connected”, “configured”, and “coupled” may include a physical connection (such as a wired/wireless connection), a logical connection (such as through logical gates of a semiconducting device), other suitable connections, or a combination of such connections, as may be obvious to a skilled person.
[0037] As used herein, “send”, “transfer”, “transmit”, and their cognate terms like “sending”, “sent”, “transferring”, “transmitting”, “transferred”, “transmitted”, etc. include sending or transporting data or information from one unit or component to another unit or component, wherein the content may or may not be modified before or after sending, transferring, transmitting.
[0038] Reference throughout this specification to “one embodiment” or “an embodiment” or “an instance” or “one instance” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0039] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0040] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such details as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosures as defined by the appended claims.
[0041] The approaches of the present subject matter provide an efficient, robust, and convenient way for translating one or more queries with phrases in multiple languages while retaining a certain portion of the query. As would be appreciated, a user may be able to search for an item in a database, such as an electronic marketplace, using phrases that include terms from multiple languages. The approaches of the present subject matter may only translate those terms that need to be translated, while preserving those terms that do not require translation.
[0042] Such approaches may be advantageous when the user may be searching for a product on an electronic marketplace using a query that includes a brand name as well. As would be understood, the brand name of the product may be in any language. Irrespective of its language, the brand name may need to be excluded from translation while the query is searched.
[0043] As would be further appreciated, in the proposed approach, a query translation model may be trained in a customized manner to translate the multi-language terms and preserve certain terms in the translation.
[0044] The manner in which the proposed system is used for translating one or more queries with phrases in multiple languages while retaining a certain portion of the query is further explained in detail with respect to FIGs. 1-4. It is to be noted that drawings of the present subject matter shown here are for illustrative purposes only and are not to be construed as limiting the scope of the subject matter claimed. Further, FIGs. 1-2 have been explained together, and same reference numerals have been used to refer to identical components and entities.
[0045] FIG. 1 illustrates an exemplary network environment 100 including a system 102. The system 102 may be used for translating one or more queries with phrases in multiple languages while retaining a certain portion of the query, in accordance with an embodiment of the present disclosure. In an embodiment, the system 102 may be implemented as any hardware-based, software-based, or network-based computing device known to a person skilled in the art. Such explanation has not been provided here for the sake of brevity. It may be further noted that the system 102 may be implemented as any system capable of receiving an input, processing it, and generating an output.
[0046] Continuing further, as depicted in FIG. 1, the network environment 100 may include a centralized server 104 in communication with the system 102 over a network 106. In an embodiment, the centralized server 104 may be implemented using any or a combination of hardware-based, software-based, network-based computing device, or a cloud-based computing device.
[0047] In an embodiment, the network 106 may be a wireless network, a wired network, or a combination thereof that can be implemented as one of the different types of networks, such as Intranet, Local Area Network (LAN), Wide Area Network (WAN), Internet, and the like. Further, the network 106 may either be a dedicated network or a shared network. The shared network may represent an association of different types of networks that can use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like.
[0048] Referring to FIG. 1, the centralized server 104 may include at least one of a plurality of processor(s) 108, referred to as processor 108 and a query translation model 110. Although, as depicted in FIG. 1, the processor 108 and the query translation model 110 may be present within the centralized server 104 and may be in communication with the system 102 over the network 106, however, the same should not be construed to limit the scope of the present subject matter in any manner. The processor 108 and/or the query translation model 110 may be present within the system 102 as well, as would be explained with reference to FIG. 2. Such an example would also lie within the scope of the present subject matter.
[0049] The processor 108 may be implemented as any processing resource, and the query translation model 110 may be implemented as a Machine Learning model. Such implementations may be done in a manner known to a person skilled in the art. Such explanations have not been provided here for the sake of brevity.
[0050] As further depicted in FIG. 1, the system 102 may be in communication with a plurality of computing devices 112-1, 112-2…112-N (collectively and individually referred to as computing device 112). Examples of such computing device 112 may include, but are not limited to, a Personal Computer (PC), a portable PC, a mobile phone, a handheld computer, a laptop, a wearable device, or any other end-user computing device. Each computing device 112 may be associated with, and operated by a respective user 114. It may be noted that aforementioned examples of the computing device 112 are only illustrative, and should not be construed to limit the scope of the present subject matter in any manner. Any other type of end-user device used by the user 114 would also lie within the scope of the computing device 112 and the present subject matter.
[0051] Continuing further, each of the computing devices 112 may be operably coupled to a digital platform (not depicted in FIG. 1) via the network 106. The digital platform, in one example, may be implemented by a centralized server, similar to that of the centralized server 104. Such explanation has not been provided again for the sake of brevity.
[0052] In another example, the digital platform may be understood as an electronic marketplace. In yet another example, the digital platform may be understood as and implemented as an e-commerce platform. However, all such examples are only illustrative, and the digital platform may be implemented as any database-based search portal, which the user 114 may be able to access and find a product or a result.
[0053] Continuing further, in one example, the computing device 112 may include a display device and an input device (not shown in FIG. 1) coupled to the computing device 112. In another example, in the cases of the computing device 112 being a portable computing device such as a portable computer and a mobile phone, the display device and the input device may be present within the computing device 112 itself. The display device may be able to render the digital platform, and the input device may allow the user 114 to provide an input. However, it may be noted that all such examples are only illustrative.
[0054] In yet another example, the computing device 112 may include a user interface for enabling the user 114 to operate the digital platform and the computing device 112. In yet another example, the user 114 may have a set of credentials to login into the system 102. It may be again noted that all such examples are only illustrative, and the proposed subject matter may be implemented in any manner known to a person skilled in the art.
[0055] A person of ordinary skill in the art will appreciate that the network environment 100 may be modular and flexible to accommodate any kind of changes in the network environment 100.
[0056] The working of the system 102 is explained in conjunction with FIG. 2. FIG. 2 illustrates a block diagram 200 representing functional units of the proposed system for translating one or more queries with phrases in multiple languages while retaining a certain portion of the query, such as system 102.
[0057] As depicted in FIG. 2, the exemplary functional units of the system 102 may include one or more processor(s) 108. The one or more processor(s) 108 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 108 may be configured to fetch and execute computer-readable instructions stored in a memory 202 of the system 102. The memory 202 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 202 may include any non-transitory storage device including, for example, volatile memory such as Random-access Memory (RAM), or non-volatile memory such as Erasable Programmable Read-only Memory (EPROM), flash memory, and the like.
[0058] In an embodiment, the system 102 may also include an interface(s) 204. The interface(s) 204 may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 204 may facilitate communication of the system 102 with various devices coupled to the system 102. The interface(s) 204 may also provide a communication pathway for one or more components of the system 102. Examples of such components include, but are not limited to, processing engine(s) 206 and database 208.
[0059] In an embodiment, the processing engine(s) 206 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the system 102. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 206 may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 206 may include a processing resource (for example, one or more processors, such as processor 108), to execute such instructions.
[0060] In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 206. In such examples, the system 102 may include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the system 102 and the processing resource. In other examples, the processing engine(s) 206 may be implemented by electronic circuitry. In an embodiment, the database 208 may include data that is either stored or generated as a result of functionalities implemented by the processing engine(s) 206. In an embodiment, the processing engine(s) 206 may include a query translation model 110. In another embodiment, the query translation model 110 may be implemented as a Machine Learning model. In yet another embodiment, the processor 108 may be implemented within the processing engine(s) 206.
[0061] Continuing with the working of the present subject matter, although the foregoing description is explained with respect to a single computing device 112 operated by a single user 114, it may be noted that the same is done only for the sake of clarity. The proposed approach for translating one or more queries with phrases in multiple languages while retaining a certain portion of the query may be implemented for any number of users 114 operating any number of computing devices 112. All such examples would be covered within the scope of the present subject matter. Further, it may be the case that a single user 114 may be operating multiple accounts, where each account may be in communication with the digital platform. All such examples would lie within the scope of the present subject matter.
[0062] Continuing further, in one example, a user 114 may be associated with and operating the computing device and the digital platform. The user 114 may generate a query. As would be understood, in the context of the present example, the query may refer to keywords generated by the user 114 for searching a particular product on the digital platform.
[0063] In operation, the processor 108 may receive the query from the user 114. The query may include a plurality of phrases in a primary language and a secondary language. For example, the primary language may be the language on which the digital platform may be built, and which the digital platform may support. On the other hand, the secondary language may be a local language or a regional language that may be used by the user 114. Further, the query may be directed towards a purchase of a product by the user 114 on the digital platform.
[0064] Thereafter, the processor 108 may train the query translation model 110 with a corpus data and a brand name list based on the primary language and the secondary language. In one example, the query translation model 110 may be implemented as a Machine Learning model, known to a person skilled in the art. In another example, a public pre-trained encoder-decoder model, such as text-to-text transfer transformer (T5), bidirectional encoder representations from transformers (BERT), etc., may be used as the query translation model 110. However, it may be noted that all such implementations of the query translation model 110 are only exemplary, and any other type of technique or Machine Learning model known to a person skilled in the art may also be used for implementing the query translation model 110. All such examples would lie within the scope of the present subject matter.
[0065] Continuing further, in another example, the processor 108 may utilize a plurality of transliterations and translations associated with the secondary language to generate the corpus data required to train the query translation model 110. As described previously, the secondary language may refer to the local or regional language that may be used by the user 114 while generating the query. In yet another example, the processor 108 may then fine-tune the trained query translation model 110 on a small set of manually tagged product search queries specific to the secondary language.
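The corpus generation described above may be illustrated with the following non-limiting Python sketch. All names, dictionaries, and query formats below are illustrative assumptions rather than the actual pipeline: given a primary-language query, a translation map, and a transliteration map for the secondary language, the sketch emits synthetic code-mixed source queries paired with their primary-language targets.

```python
# Illustrative, non-limiting sketch of synthetic corpus generation for a
# query translation model. The data and function names are assumptions.

def build_corpus(queries, translation_map, transliteration_map):
    """For each primary-language query, emit (code-mixed source, target) pairs
    by swapping individual words into the secondary language, either as a
    native-script translation or as a romanized transliteration."""
    pairs = []
    for query in queries:
        words = query.split()
        for i, word in enumerate(words):
            for variant_map in (translation_map, transliteration_map):
                if word in variant_map:
                    mixed = words.copy()
                    mixed[i] = variant_map[word]
                    # Source is the synthetic code-mixed query;
                    # target is the original primary-language query.
                    pairs.append((" ".join(mixed), query))
    return pairs

# Toy Hindi example (hypothetical entries, not the actual corpus).
translations = {"shoes": "जूते", "red": "लाल"}
transliterations = {"shoes": "joote", "red": "laal"}
corpus = build_corpus(["red shoes"], translations, transliterations)
for src, tgt in corpus:
    print(src, "->", tgt)
```

Each synthetic pair teaches the model to map a partially secondary-language query back to its primary-language form, which is the behaviour the fine-tuning step then refines on manually tagged queries.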
[0066] In yet another example, the processor 108 may utilize the brand name list to compute a brand loss and a scaling factor associated with the query translation model 110. The brand loss may be a cross entropy loss based on a plurality of brand names associated with the digital platform. As would be appreciated, such utilization of the brand name list to compute the brand loss may enable the trained query translation model 110 to efficiently identify brand names that may need to be preserved while translating the queries.
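The brand loss described in paragraph [0066] may be sketched as a cross-entropy term restricted to output positions whose target token belongs to a brand name. The following is an illustrative, non-limiting example; the dictionary-of-log-probabilities representation stands in for decoder logits, and the brand name "acme" is hypothetical.

```python
import math

def brand_loss(token_log_probs, target_tokens, brand_token_set, scale=1.0):
    """Cross-entropy computed only over output positions whose target
    token is part of a brand name, weighted by a scaling factor.

    token_log_probs: one dict per output position mapping candidate
                     token -> log-probability (a stand-in for decoder logits).
    """
    losses = []
    for log_probs, target in zip(token_log_probs, target_tokens):
        if target in brand_token_set:
            losses.append(-log_probs[target])
    if not losses:
        return 0.0
    return scale * sum(losses) / len(losses)

# Toy example: the decoder assigns probability 0.5 to the brand token "acme".
log_probs = [{"acme": math.log(0.5), "red": math.log(0.5)},
             {"shoes": math.log(0.9)}]
loss = brand_loss(log_probs, ["acme", "shoes"], {"acme"}, scale=2.0)
print(round(loss, 4))  # 2.0 * -ln(0.5) ≈ 1.3863
```

Because the loss is accumulated only at brand-name positions, the model is penalized specifically for mistranslating or dropping brand tokens, which encourages their preservation.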
[0067] Continuing further, in such cases, any or a combination of a supervised loss and a data augmentation loss may be utilized with the computed brand loss and the scaling factor to train the query translation model 110. As would be further appreciated, the brand loss, weighted by the scaling factor, may be added to the supervised and data augmentation losses described earlier, and the query translation model 110 may be trained to reduce the combined loss function during fine-tuning. In yet another example, data augmentations such as character drops, masking, auto-encoding, etc. may be used. As would be appreciated, such techniques may improve the query translation. In yet another example, to reduce the latency of the query translation model 110, knowledge distillation and weight quantization may be used.
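The data augmentations mentioned above (character drops and masking) may be sketched as simple string transforms over training queries, with the combined objective then being the supervised loss plus the augmentation loss plus the scaled brand loss. The following is an illustrative, non-limiting sketch; probabilities and token choices are assumptions.

```python
import random

def char_drop(query, drop_prob=0.1, rng=None):
    """Randomly drop characters to simulate typos in user queries."""
    rng = rng or random.Random()
    return "".join(c for c in query if rng.random() >= drop_prob)

def mask_tokens(query, mask_prob=0.2, mask_token="<mask>", rng=None):
    """Randomly replace whole words with a mask token, as in denoising
    objectives; the model learns to reconstruct the clean query."""
    rng = rng or random.Random()
    return " ".join(mask_token if rng.random() < mask_prob else w
                    for w in query.split())

rng = random.Random(0)
print(char_drop("laal joote brand", drop_prob=0.2, rng=rng))
print(mask_tokens("laal joote brand", mask_prob=0.5, rng=rng))
```

Training on such perturbed copies of each query, with the clean query as the target, makes the model robust to the noisy, typo-prone inputs typical of product search.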
[0068] Continuing further, after the query translation model 110 has been trained, the processor 108, using the trained query translation model 110, may analyze the plurality of phrases in the primary language and the secondary language. As would be understood, such phrases may refer to the phrases included in the user query in the form of the primary language and the secondary language.
[0069] Thereafter, in response to the analysis, the processor 108 may generate a selectively translated query in the primary language while retaining a brand name and a semantic representation of the query in the primary language. In one example, the semantic representation of the query may include a description of the query using the plurality of phrases in the primary language.
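The selective-translation behaviour described above may be illustrated with a deliberately simplified, non-limiting sketch. The trained query translation model 110 is an encoder-decoder model, not a lookup table; the dictionary-based stand-in below, and the brand and word entries in it, are assumptions made purely for illustration.

```python
def selectively_translate(query, word_map, brand_names):
    """Translate each word into the primary language, but leave any word
    that matches a known brand name untouched."""
    brands = {b.lower() for b in brand_names}
    out = []
    for word in query.split():
        if word.lower() in brands:
            out.append(word)                       # retain brand name verbatim
        else:
            out.append(word_map.get(word, word))   # translate if known
    return " ".join(out)

# Toy code-mixed query: "laal joote" is romanized Hindi for "red shoes";
# "Sparx" is treated as a brand name to be preserved.
word_map = {"laal": "red", "joote": "shoes"}
print(selectively_translate("Sparx laal joote", word_map, {"Sparx"}))
# -> Sparx red shoes
```

The output preserves the brand token exactly while the remaining phrases are rendered in the primary language, which is what allows the downstream search to match the exact product the user intends.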
[0070] As would be appreciated, retaining the brand name may allow the system 102 to efficiently search the digital platform for the exact product that the user 114 intends to search for.
[0071] FIG. 3 illustrates an exemplary flow diagram representing steps of a method 300 for translating one or more queries with phrases in multiple languages while retaining a certain portion of the query, in accordance with an embodiment of the present disclosure. The method 300 may be implemented within the system 102, as described in conjunction with FIGs. 1-2. In another embodiment, the method 300 may be implemented within the centralized server 104.
[0072] At block 302, the method 300 may include receiving a query from a user 114. The user 114 may be associated with a computing device 112. Further, the query may include a plurality of phrases in a primary language and a secondary language. Furthermore, the query may be directed towards a purchase of a product by the user 114 on a digital platform.
[0073] At block 304, the method 300 may include training a query translation model 110 with a corpus data and a brand name list based on the primary language and the secondary language.
[0074] At block 306, the method 300 may include analyzing the plurality of phrases in the primary language and the secondary language using the trained query translation model 110.
[0075] At block 308, the method 300 may include, in response to the analyzing, generating a selectively translated query in the primary language while retaining a brand name and a semantic representation of the query in the primary language.
[0076] It may be appreciated that the steps shown in FIG. 3 are merely illustrative. Other suitable steps may be used for the same, if desired. Moreover, the steps of the method 300 may be performed in any order and may include additional steps.
[0077] FIG. 4 illustrates an exemplary computer system 400 in which or with which embodiments of the present disclosure may be utilized. The computer system 400 may be implemented as or within the system 102 described in conjunction with FIGs. 1-2. The computer system 400 may be implemented as or within the centralized server 104. As depicted in FIG. 4, the computer system 400 may include an external storage device 410, a bus 420, a main memory 430, a read-only memory 440, a mass storage device 450, communication port(s) 460, and a processor 470. A person skilled in the art will appreciate that the computer system 400 may include more than one processor 470 and communication port(s) 460. The processor 470 may include various modules associated with embodiments of the present disclosure. The communication port(s) 460 may be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. The communication port(s) 460 may be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computer system 400 connects.
[0078] In an embodiment, the main memory 430 may be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. The read-only memory 440 may be any static storage device(s), e.g., but not limited to, Programmable Read Only Memory (PROM) chips for storing static information, e.g., start-up or Basic Input/Output System (BIOS) instructions for the processor 470. The mass storage device 450 may be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces).
[0079] In an embodiment, the bus 420 communicatively couples the processor 470 with the other memory, storage, and communication blocks. The bus 420 may be, e.g., a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), Universal Serial Bus (USB), or the like, for connecting expansion cards, drives, and other subsystems, as well as other buses, such as a front side bus (FSB), which connects the processor 470 to the computer system 400.
[0080] In another embodiment, operator and administrative interfaces, e.g. a display, keyboard, and a cursor control device, may also be coupled to the bus 420 to support direct operator interaction with the computer system 400. Other operator and administrative interfaces may be provided through network connections connected through the communication port(s) 460. Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system 400 limit the scope of the present disclosure.
[0081] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named.
[0082] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
Claims:
1. A system (102) for translating one or more queries, the system (102) comprising:
one or more processors (108) operatively coupled with a digital platform; and
a memory (202) coupled with the one or more processors (108), wherein said memory (202) stores instructions which when executed by the one or more processors (108) cause the system (102) to:
receive a query from a user (114) associated with a computing device (112), wherein the query comprises a plurality of phrases in a primary language and a secondary language, the query being directed towards a purchase of a product by the user (114) on the digital platform, and wherein the computing device (112) is operably coupled to the digital platform via a network (106);
train a query translation model (110) with a corpus data and a brand name list based on the primary language and the secondary language;
analyze, using the trained query translation model (110), the plurality of phrases in the primary language and the secondary language; and
in response to the analysis, generate a selectively translated query in the primary language while retaining a brand name and a semantic representation of the query in the primary language.

2. The system (102) as claimed in claim 1, wherein the one or more processors (108) are configured to utilize the brand name list to compute a brand loss and a scaling factor associated with the query translation model (110).

3. The system (102) as claimed in claim 2, wherein the one or more processors (108) are configured to utilize any or a combination of a supervised loss and a data augmentation loss with the computed brand loss and the scaling factor to train the query translation model (110).

4. The system (102) as claimed in claim 2, wherein the brand loss is a cross entropy loss based on a plurality of brand names associated with the digital platform.

5. The system (102) as claimed in claim 1, wherein the semantic representation of the query comprises description of the query using the plurality of phrases in the primary language.

6. The system (102) as claimed in claim 1, wherein the one or more processors (108) are configured to utilize a plurality of transliterations and translations associated with the secondary language to generate the corpus data required to train the query translation model (110).

7. A method (300) for translating one or more queries, the method (300) comprising:
receiving (302), by a system (102), a query from a user (114) associated with a computing device (112), wherein the query comprises a plurality of phrases in a primary language and a secondary language directed towards a purchase of a product by the user (114) on a digital platform;
training (304), by the system (102), a query translation model (110) with a corpus data and a brand name list based on the primary language and the secondary language;
analyzing (306), by the system (102), using the trained query translation model (110), the plurality of phrases in the primary language and the secondary language; and
in response to the analyzing, generating (308), by the system (102), a selectively translated query in the primary language while retaining a brand name and a semantic representation of the query in the primary language.

8. The method (300) as claimed in claim 7, comprising utilizing, by the system (102), the brand name list to compute a brand loss and a scaling factor associated with the query translation model (110).

9. The method (300) as claimed in claim 8, comprising utilizing, by the system (102), any or a combination of a supervised loss and a data augmentation loss with the computed brand loss and the scaling factor to train the query translation model (110).

10. The method (300) as claimed in claim 7, wherein the semantic representation of the query comprises description of the query using the plurality of phrases in the primary language.

11. The method (300) as claimed in claim 8, wherein the brand loss is a cross entropy loss based on a plurality of brand names associated with the digital platform.

12. The method (300) as claimed in claim 7, comprising utilizing, by the system (102), a plurality of transliterations and translations associated with the secondary language to generate the corpus data required to train the query translation model (110).

Documents

Application Documents

# Name Date
1 202341023709-STATEMENT OF UNDERTAKING (FORM 3) [30-03-2023(online)].pdf 2023-03-30
2 202341023709-POWER OF AUTHORITY [30-03-2023(online)].pdf 2023-03-30
3 202341023709-FORM 1 [30-03-2023(online)].pdf 2023-03-30
4 202341023709-DRAWINGS [30-03-2023(online)].pdf 2023-03-30
5 202341023709-DECLARATION OF INVENTORSHIP (FORM 5) [30-03-2023(online)].pdf 2023-03-30
6 202341023709-COMPLETE SPECIFICATION [30-03-2023(online)].pdf 2023-03-30
7 202341023709-ENDORSEMENT BY INVENTORS [08-04-2023(online)].pdf 2023-04-08
8 202341023709-FORM 18 [30-11-2024(online)].pdf 2024-11-30