
System And Method For Mapping Reviews To Product Images

Abstract: A system and method for mapping one or more reviews to product images in an e-commerce retail environment is presented. The system includes a data module configured to access a product image and an extraction module configured to extract one or more attributes and/or one or more features from the product image. The system further includes a localization module configured to assign a corresponding location to the one or more attributes and/or one or more features on the product image. The system furthermore includes a review selection module configured to select one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features. The system moreover includes a review mapping module configured to display the selected one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image.


Patent Information

Filing Date
21 September 2022
Publication Number
12/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE

Applicants

Myntra Designs Private Limited
3rd floor, AKR TECH Park, Krishna Reddy Industrial Area, Muneshwara Nagar, Bangalore – 560068 INDIA

Inventors

1. Lakshya Kumar
Flat No: H-903, Fortune Residency, Raj Nagar Extension, Ghaziabad, Uttar Pradesh 201017 INDIA
2. Sangeet Jaiswal
S-237,2nd Floor, Uppal Southend, Sector- 49, Gurgaon, Haryana 122018 INDIA
3. Sreekanth Vempati
I-G6, Aparna Cyberlife, Besides Citizens Hospital, Nallagandla, Serilingampally, Hyderabad, Telangana - 500019, INDIA
4. Konduru Saiswaroop
42-251/1/a , behind Sai Ratna Hospital, New Town colony, Wanaparthy, Telangana - 509103, INDIA
5. Hrishikesh Vidyadhar Ganu
T3B301, Godrej Woodsman Estate, Hebbal, Bangalore, Karnataka 560024 INDIA

Specification

Description: FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003

COMPLETE SPECIFICATION
(Section 10, rule 13)

“SYSTEM AND METHOD FOR MAPPING REVIEWS TO PRODUCT IMAGES”

Myntra Designs Private Limited
3rd Floor, AKR Tech Park, Krishna Reddy Industrial Area, Muneshwara Nagar, Bangalore 560068

The following specification particularly describes the invention and the manner in which it is to be performed


SYSTEM AND METHOD FOR MAPPING REVIEWS TO PRODUCT IMAGES

BACKGROUND
[0001] Embodiments of the present invention generally relate to systems and methods for mapping one or more reviews to product images in an e-commerce retail environment, and more particularly to systems and methods for mapping one or more reviews to product images in an e-commerce retail environment based on one or more attributes and/or features of the product.
[0002] Online shopping (e-commerce) platforms for fashion items, supported in a contemporary internet environment, are well known. Shopping for clothing items online via the internet is growing in popularity because it potentially offers shoppers a broader range of choices of clothing in comparison to earlier off-line boutiques and superstores. In current online shopping environments, shoppers typically refer to reviews of the products while making a purchase. On most online shopping platforms, the reviews are provided in a consolidated format, typically at the bottom of a shopping interface. Thus, there is a need for systems and methods that allow for easy accessibility of reviews for shoppers on online shopping platforms.
SUMMARY
[0003] The following summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, example embodiments, and features described, further aspects, example embodiments, and features will become apparent by reference to the drawings and the following detailed description.
[0004] Briefly, according to an example embodiment, a system for mapping one or more reviews to product images in an e-commerce retail environment is presented. The system includes a data module configured to access a product image and an extraction module configured to extract one or more attributes and/or one or more features from the product image. The system further includes a localization module configured to assign a corresponding location to the one or more attributes and/or one or more features on the product image. The system furthermore includes a review selection module configured to select one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features. The system moreover includes a review mapping module configured to display the selected one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image.
[0005] According to another example embodiment, a system for mapping one or more reviews to product images in an e-commerce retail environment is presented. The system includes a data module configured to access a product image. The system further includes an attribute extraction module configured to extract one or more attributes from the product image and an attribute localization module configured to assign a corresponding location to the one or more attributes on the product image. The system further includes a feature extraction module configured to extract one or more features from the product image and a feature localization module configured to assign a corresponding location to the one or more features on the product image. The system furthermore includes a review selection module configured to select one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features. The system moreover includes a review mapping module configured to display the one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image.
[0006] According to another example embodiment, a method for mapping one or more reviews to product images in an e-commerce retail environment is presented. The method includes accessing a product image, extracting one or more attributes and/or one or more features from the product image, assigning a corresponding location to the one or more attributes and/or the one or more features on the product image, selecting one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features, and displaying the one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image.
BRIEF DESCRIPTION OF THE FIGURES
[0007] These and other features, aspects, and advantages of the example embodiments will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0008] FIG. 1 is a block diagram illustrating an example system for mapping reviews to product images, according to some aspects of the present description,
[0009] FIG. 2 is a block diagram illustrating an example system for mapping reviews to product images, according to some aspects of the present description,
[0010] FIG. 3 illustrates a flow chart for mapping reviews to product images, according to some aspects of the present description,
[0011] FIG. 4 illustrates an example of one or more reviews mapped to visible attributes on a product image, according to some aspects of the present description,
[0012] FIG. 5 illustrates an example of one or more reviews mapped to visible attributes on a product image, according to some aspects of the present description,
[0013] FIG. 6 illustrates an example of one or more reviews mapped to invisible features on a product image, according to some aspects of the present description,
[0014] FIG. 7 illustrates an example of one or more reviews mapped to visible attributes and invisible features on a product image, according to some aspects of the present description,
[0015] FIG. 8 illustrates an example of one or more reviews mapped to visible attributes and invisible features on a product image, according to some aspects of the present description, and
[0016] FIG. 9 is a block diagram illustrating an example computer system, according to some aspects of the present description.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0017] Various example embodiments will now be described more fully with reference to the accompanying drawings in which only some example embodiments are shown. Specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments. Example embodiments, however, may be embodied in many alternate forms and should not be construed as limited to only the example embodiments set forth herein. On the contrary, example embodiments are to cover all modifications, equivalents, and alternatives thereof.
[0018] The drawings are to be regarded as being schematic representations and elements illustrated in the drawings are not necessarily shown to scale. Rather, the various elements are represented such that their function and general purpose become apparent to a person skilled in the art. Any connection or coupling between functional blocks, devices, components, or other physical or functional units shown in the drawings or described herein may also be implemented by an indirect connection or coupling. A coupling between components may also be established over a wireless connection. Functional blocks may be implemented in hardware, firmware, software, or a combination thereof.
[0019] Before discussing example embodiments in more detail, it is noted that some example embodiments are described as processes or methods depicted as flowcharts. Although the flowcharts describe the operations as sequential processes, many of the operations may be performed in parallel, concurrently, or simultaneously. In addition, the order of operations may be re-arranged. The processes may be terminated when their operations are completed, but may also have additional steps not included in the figures. It should also be noted that in some alternative implementations, the functions/acts/steps noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed substantially concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0020] Further, although the terms first, second, etc. may be used herein to describe various elements, components, regions, layers and/or sections, it should be understood that these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are used only to distinguish one element, component, region, layer, or section from another region, layer, or a section. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the scope of example embodiments.
[0021] Spatial and functional relationships between elements (for example, between modules) are described using various terms, including “connected,” “engaged,” “interfaced,” and “coupled.” Unless explicitly described as being “direct,” when a relationship between first and second elements is described in the description below, that relationship encompasses a direct relationship where no other intervening elements are present between the first and second elements, and also an indirect relationship where one or more intervening elements are present (either spatially or functionally) between the first and second elements. In contrast, when an element is referred to as being “directly” connected, engaged, interfaced, or coupled to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” etc.).
[0022] The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0023] As used herein, the singular forms “a,” “an,” and “the,” are intended to include the plural forms as well, unless the context clearly indicates otherwise. As used herein, the terms “and/or” and “at least one of” include any and all combinations of one or more of the associated listed items. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0024] Unless specifically stated otherwise, or as is apparent from the description, terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device/hardware, that manipulates and transforms data represented as physical, electronic quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0025] Example embodiments of the present description provide systems and methods for mapping one or more reviews to product images in an e-commerce retail environment. Some embodiments of the present description provide systems and methods for mapping one or more reviews to product images in an e-commerce retail environment based on one or more attributes and/or features of the product.
[0026] FIG. 1 illustrates an example system 100 for mapping one or more reviews to product images in an e-commerce retail environment. The product may be selected from fashion products, electronic products, household items, furniture items, decorative items, linen, furnishing (carpets, cushions, curtains), lamps, tableware, and the like. In one embodiment, the product is a fashion product. Non-limiting examples of fashion products include garments (such as top wear, bottom wear, and the like), accessories (such as scarves, belts, socks, sunglasses, bags, watches), jewelry, footwear, and the like. For the purpose of this description, the following embodiments are described with respect to an online fashion retail platform. However, it must be understood that embodiments described herein can be implemented on any e-commerce platform having a portfolio of retail items/products.
[0027] The system 100 includes a data module 102, an extraction module 104, a localization module 106, a review selection module 108, and a review mapping module 110. Each of these components is described in detail below.
[0028] The data module 102 is configured to access a product image. The data module 102 may be configured to access the product image from an image database or a product catalog in the e-commerce retail environment. The product image may be a standalone image of the product in one embodiment. The term “standalone image” as used herein refers to the image of the product by itself and does not include a model or a mannequin. In certain embodiments, the product image may be a flat shot image of the product. The flat shot images may be taken from any suitable angle and include top-views, side views, front-views, back-views, and the like. In another embodiment, the product image may be an image of a human model or a mannequin wearing the product taken from any suitable angle. The data module 102 is further communicatively coupled to an extraction module 104 as shown in FIG. 1.
[0029] The extraction module 104 is configured to extract one or more attributes and/or one or more features corresponding to the product from the product image. The term “attributes” as used herein refers to one or more visible characteristics of the product in the product image. By way of example, for a garment, attributes may include color, neck type, sleeve length, garment shape, and the like. Similarly, for an electronic product, attributes may include, for example, color, number of cameras, display type, product shape, and the like. The term “features” as used herein refers to one or more invisible characteristics of the product in the product image. By way of example, for a garment, features may include material, size, fit, style, and the like. Similarly, for an electronic product, features may include, for example, processing performance, battery life, water durability, and the like.
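The attribute/feature distinction above can be sketched with two simple record types. This is a hypothetical illustration only; the type and field names (`Attribute`, `Feature`, `bbox`) are assumptions for exposition, not taken from the specification:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Attribute:
    """A visible characteristic, localizable on the product image."""
    name: str                         # e.g. "sleeve length"
    value: str                        # e.g. "full sleeve"
    bbox: Tuple[int, int, int, int]   # (x_min, y_min, x_max, y_max) in pixels

@dataclass
class Feature:
    """An invisible characteristic; it has no natural bounding box."""
    name: str                         # e.g. "fit"
    value: str                        # e.g. "slim fit"

attr = Attribute("sleeve length", "full sleeve", (120, 80, 200, 320))
feat = Feature("fit", "slim fit")
```

The key design point is that only attributes carry image coordinates; features must be placed on the image by some other convention, as described below.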
[0030] In some embodiments, the extraction module 104 may be configured to extract the one or more attributes from the product image based on a trained extraction model. The extraction module 104 may be configured to classify the product image into different attributes. The extraction module 104 may be further configured to define one or more boundary values to the one or more attributes indicating a position or a location of the one or more attributes on the product image.
[0031] In some embodiments, the extraction module 104 may be configured to extract the one or more features from the product image based on a trained extraction model. The extraction module 104 may be configured to extract the one or more features based on pre-defined features for each product.
[0032] Non-limiting examples of extraction models include convolutional neural networks. In some embodiments, the extraction module 104 is configured to extract the one or more attributes from the product image based on a trained faster regional-convolutional neural network (Faster R-CNN). In some embodiments, the extraction module 104 is configured to extract the one or more features from the product image based on a trained regional-convolutional neural network (R-CNN). For extracting the one or more attributes, the extraction model may be trained based on a training dataset comprised of a plurality of product images and one or more corresponding attributes labeled on the plurality of product images. For extracting the one or more features, the extraction model may be trained based on a training dataset comprised of a plurality of product images and one or more features pre-defined for products corresponding to the plurality of product images.
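A detector such as Faster R-CNN emits labeled boxes with confidence scores; turning that output into attributes with boundary values is then a simple filtering step. The sketch below assumes a generic detector output format (a list of `label`/`box`/`score` dicts); the specification does not fix this format, so it is illustrative only:

```python
def to_attributes(detections, score_threshold=0.5):
    """Convert raw detector output (as produced by e.g. a Faster R-CNN head)
    into (label, bounding_box) pairs, keeping only confident detections."""
    return [
        (d["label"], tuple(d["box"]))
        for d in detections
        if d["score"] >= score_threshold
    ]

# Mocked detector output for one garment image
raw = [
    {"label": "round neck",  "box": [140, 60, 220, 110],  "score": 0.93},
    {"label": "full sleeve", "box": [40, 90, 130, 330],   "score": 0.88},
    {"label": "pocket",      "box": [150, 250, 190, 300], "score": 0.21},
]
attributes = to_attributes(raw)  # the low-score "pocket" detection is dropped
```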
[0033] Referring again to FIG. 1, the localization module 106 is communicatively coupled with the extraction module 104 and is configured to assign a corresponding location to the one or more attributes and/or one or more features on the product image. In some embodiments, the localization module 106 is configured to assign a corresponding location to the one or more attributes based on one or more boundary values defined by the trained extraction model. In some embodiments, the localization module 106 is configured to assign a corresponding location to the one or more features based on pre-determined coordinates on the product image.
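The two localization strategies in the preceding paragraph can be sketched as follows: an attribute is anchored at its detected bounding box (here, its center, which is one plausible choice), while an invisible feature is placed at pre-determined coordinates. The coordinate table and function names are assumptions for illustration:

```python
def attribute_location(bbox):
    """Anchor an attribute at the center of its detected bounding box."""
    x_min, y_min, x_max, y_max = bbox
    return ((x_min + x_max) // 2, (y_min + y_max) // 2)

# Invisible features have no box; assign fixed per-feature coordinates,
# e.g. positions along the bottom of the product image (hypothetical values).
FEATURE_COORDS = {"material": (50, 400), "fit": (250, 400)}

def feature_location(feature_name):
    return FEATURE_COORDS[feature_name]
```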
[0034] The review selection module 108 is communicatively coupled with the extraction module 104 and is configured to select one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features. The review selection module 108 may be communicatively coupled with a review database and configured to select one or more reviews from the review database.
[0035] The review mapping module 110 is configured to display the selected one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image. In some embodiments, the review mapping module 110 is further configured to mark the location of the one or more reviews on the product image using a visual marker (e.g., a visible dot) on the product image. The review mapping module 110 may be configured to display the selected one or more reviews as a popup that can be scrolled down to read the one or more reviews.
[0036] In some embodiments, the review mapping system 100 further includes a review classification module 112 configured to classify a review based on the one or more attributes and/or the one or more features, and the review selection module 108 is configured to select a review as one of the one or more reviews based on one or more classes assigned to the review. In some embodiments, the review classification module 112 is configured to classify a review based on a transformer-based model.
[0037] In some embodiments, the review selection module 108 is configured to first select a review as one of the one or more reviews if the review explicitly mentions an attribute and/or a feature. The review selection module 108 may be further configured to select a review as one of the one or more reviews if the review is semantically related to an attribute and/or a feature.
[0038] In some embodiments, the review selection module 108 is further configured to rank the one or more reviews based on a language score assigned to each review of the one or more reviews. The review selection module 108 may be further configured to assign a language score to a review based on a trained language model, for example, a generative pre-trained transformer (GPT) model. The review mapping module 110 is configured to display the one or more reviews based on their ranking.
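The selection-then-ranking behavior of paragraphs [0037] and [0038] can be sketched as a two-tier filter: explicit mentions first, then semantically related reviews, with each tier ordered by a language score (in the description, assigned by a trained language model such as GPT; here, supplied as plain numbers). The synonym table is a crude stand-in for semantic relatedness and is an assumption of this sketch:

```python
# Stand-in for semantic relatedness (a transformer would supply this in practice)
SYNONYMS = {"material": {"fabric", "cloth", "cotton"}}

def select_reviews(reviews, target, scores):
    """Pick reviews for `target`: explicit mentions first, then semantically
    related ones, each group ranked by language score (higher is better)."""
    explicit = [r for r in reviews if target in r.lower()]
    related = [
        r for r in reviews
        if r not in explicit
        and any(s in r.lower() for s in SYNONYMS.get(target, ()))
    ]
    by_score = lambda r: -scores[r]
    return sorted(explicit, key=by_score) + sorted(related, key=by_score)

reviews = [
    "Great material, very soft.",
    "The fabric feels premium.",
    "Fast delivery!",
]
scores = {reviews[0]: 0.9, reviews[1]: 0.95, reviews[2]: 0.8}
selected = select_reviews(reviews, "material", scores)
```

Here the explicit mention outranks the merely related review even though the latter has a higher language score, reflecting the "first select ... explicitly mentions" ordering of paragraph [0037].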
[0039] FIG. 2 illustrates an example system 100 for mapping one or more reviews to product images in an e-commerce retail environment based on one or more attributes and one or more features. The system includes a data module 102, an attribute extraction module 103, an attribute localization module 105, a feature extraction module 107, a feature localization module 109, a review selection module 108, and a review mapping module 110.
[0040] The data module 102 is configured to access a product image. The data module 102 may be configured to access the product image from an image database or a product catalog in the e-commerce retail environment.
[0041] The attribute extraction module 103 is configured to extract one or more attributes from the product image. The term “attribute” is as defined herein earlier. The attribute extraction module 103 may be configured to classify the product image into different attributes. The attribute extraction module 103 may be further configured to define one or more boundary values to the one or more attributes indicating a position or a location of the one or more attributes on the product image.
[0042] In some embodiments, the attribute extraction module 103 is configured to extract the one or more attributes from the product image based on a trained attribute extraction model, for example, a trained faster regional-convolutional neural network (Faster R-CNN). In some embodiments, the attribute extraction module 103 is configured to extract the one or more attributes from the product image based on a trained regional-convolutional neural network (R-CNN). For extracting the one or more attributes, the extraction model may be trained based on a training dataset comprised of a plurality of product images and one or more corresponding attributes labeled on the plurality of product images.
[0043] The attribute localization module 105 is configured to assign a corresponding location to the one or more attributes on the product image. In some embodiments, the attribute localization module 105 is configured to assign a corresponding location to the one or more attributes based on one or more boundary values defined by the trained attribute extraction model.
[0044] The feature extraction module 107 is configured to extract one or more features from the product image. The term “feature” is as defined herein earlier. In some embodiments, the feature extraction module 107 is configured to extract the one or more features from the product image based on a trained feature extraction model. In some embodiments, the feature extraction module 107 is configured to extract the one or more features from the product image based on a trained regional-convolutional neural network (R-CNN). For extracting the one or more features, the extraction model may be trained based on a training dataset comprised of a plurality of product images and one or more features pre-defined for the plurality of product images.
[0045] The feature localization module 109 is configured to assign a corresponding location to the one or more features on the product image. In some embodiments, the feature localization module 109 is configured to assign a corresponding location to the one or more features based on pre-determined coordinates on the product image.
[0046] The review selection module 108 is communicatively coupled with the attribute extraction module 103 and the feature extraction module 107 and is configured to select one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features. The review selection module 108 may be communicatively coupled with a review database and configured to select one or more reviews from the review database.
[0047] The review mapping module 110 is configured to display the selected one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image. In some embodiments, the review mapping module 110 is further configured to mark a location of the one or more reviews on the product image using a visual marker (e.g., a visible dot) on the product image. The review mapping module 110 may be configured to display the selected one or more reviews as a popup that can be scrolled down to read the one or more reviews.
[0048] In some embodiments, the review selection module 108 is configured to first select a review as one of the one or more reviews if the review explicitly mentions an attribute and/or a feature. The review selection module 108 may be further configured to select a review as one of the one or more reviews if the review is semantically related to an attribute and/or a feature.
[0049] In some embodiments, the review selection module 108 is further configured to rank the one or more reviews based on a language score assigned to each review of the one or more reviews. The review selection module 108 may be further configured to assign a language score to a review based on a trained language model, for example, a generative pre-trained transformer (GPT) model. The review mapping module 110 is configured to display the one or more reviews based on their ranking.
[0050] The manner of implementation of the review mapping system 100 is described below with reference to FIG. 3. FIG. 3 is a flowchart illustrating a method 200 for mapping one or more reviews to product images in an e-commerce retail environment. The method 200 may be implemented using the systems of FIGs. 1 and 2, according to some aspects of the present description. Each step of the method 200 is described in detail below.
[0051] The method 200 includes, at step 202, accessing a product image. The method 200 may include accessing the product image from an image database or a product catalog in the e-commerce retail environment.
[0052] At step 204, the method 200 includes extracting one or more attributes and/or one or more features from the product image. In some embodiments, the method 200 includes extracting the one or more attributes and/or one or more features from the product image based on a trained extraction model. The method 200 may further include, at step 204, classifying the product image into different attributes. The method 200 may further include, at step 204, defining one or more boundary values to the one or more attributes indicating a position or a location of the one or more attributes on the product image. The method 200 may include, at step 204, extracting the one or more features based on pre-defined features for each product.
[0053] Non-limiting examples of extraction models include convolutional neural networks. In some embodiments, the method 200 includes extracting the one or more attributes from the product image based on a trained faster regional-convolutional neural network (Faster R-CNN). In some embodiments, the method 200 includes extracting the one or more features from the product image based on a trained regional-convolutional neural network (R-CNN). For extracting the one or more attributes, the extraction model may be trained based on a training dataset comprised of a plurality of product images and one or more corresponding attributes labeled on the plurality of product images. For extracting the one or more features, the extraction model may be trained based on a training dataset comprised of a plurality of product images and one or more features pre-defined for the plurality of product images.
[0054] The method 200 further includes, at step 206, assigning a corresponding location to the one or more attributes and/or the one or more features on the product image. In some embodiments, the method 200 includes assigning a corresponding location to the one or more attributes based on one or more boundary values defined by the trained extraction model and/or assigning a location to the one or more features based on pre-determined coordinates on the product image.
[0055] At step 208, the method 200 includes selecting one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features. The method 200 further includes, at step 210, displaying the one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image. The method 200 further includes, at step 210, marking a location of the one or more reviews on the product image using a visual marker on the product image.
[0056] In some embodiments, the method 200 further includes classifying a review based on the one or more attributes and/or the one or more features, and selecting a review as one of the one or more reviews based on one or more classes assigned to the review. The method 200 further includes, at step 208, selecting a review as one of the one or more reviews if the review explicitly mentions an attribute and/or a feature and/or if the review is semantically related to an attribute and/or a feature. The method 200 further includes, at step 208, ranking the one or more reviews based on a language score assigned to each review of the one or more reviews, and displaying the one or more reviews based on their ranking.
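Steps 202 through 210 of method 200 can be tied together in a single end-to-end sketch. The input and output formats below are assumptions for illustration (the description does not specify them): the accessed image is represented by its already-extracted attribute boxes and pre-defined feature coordinates, and the result is a list of marker placements with their selected reviews:

```python
def map_reviews_to_image(image_attrs, predefined_features, reviews_by_topic):
    """End-to-end sketch of method 200: localize (step 206), then attach
    selected reviews with a visual marker position (steps 208-210)."""
    placements = []
    # Step 206: attributes are located via their bounding boxes...
    for name, (x_min, y_min, x_max, y_max) in image_attrs.items():
        center = ((x_min + x_max) // 2, (y_min + y_max) // 2)
        placements.append((name, center))
    # ...while invisible features get pre-determined coordinates.
    for name, coords in predefined_features.items():
        placements.append((name, coords))
    # Steps 208-210: attach the selected reviews at each marker location.
    return [
        {"topic": name, "marker": loc, "reviews": reviews_by_topic.get(name, [])}
        for name, loc in placements
    ]

result = map_reviews_to_image(
    image_attrs={"round neck": (140, 60, 220, 110)},
    predefined_features={"fit": (250, 400)},
    reviews_by_topic={"round neck": ["Lovely neckline."], "fit": ["True to size."]},
)
```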
[0057] FIGs. 4 and 5 illustrate examples of reviews mapped onto a product corresponding to visible attributes on the product image, according to embodiments of the present description. FIG. 6 illustrates an example of reviews mapped onto a product corresponding to invisible features on the product image, according to embodiments of the present description. FIGs. 7 and 8 illustrate examples of reviews mapped onto a product corresponding to both visible attributes and invisible features on the product image, according to embodiments of the present description.
[0058] The systems and methods described herein may be partially or fully implemented by a special purpose computer system created by configuring a general-purpose computer to execute one or more particular functions embodied in computer programs. The functional blocks and flowchart elements described above serve as software specifications, which may be translated into the computer programs by the routine work of a skilled technician or programmer.
[0059] The computer programs include processor-executable instructions stored on at least one non-transitory computer-readable medium that, when run on a computing device, cause the computing device to perform any one of the aforementioned methods. The medium also includes, alone or in combination with the program instructions, data files, data structures, and the like. Non-limiting examples of the non-transitory computer-readable medium include rewriteable non-volatile memory devices (for example, flash memory devices, erasable programmable read-only memory devices, or mask read-only memory devices), volatile memory devices (for example, static random access memory devices or dynamic random access memory devices), magnetic storage media (for example, analog or digital magnetic tape or a hard disk drive), and optical storage media (for example, a CD, a DVD, or a Blu-ray Disc). Examples of media with a built-in rewriteable non-volatile memory include, but are not limited to, memory cards; examples of media with a built-in ROM include, but are not limited to, ROM cassettes. Program instructions include both machine code, such as that produced by a compiler, and higher-level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to execute one or more software modules to perform the operations of the above-described example embodiments of the description, or vice versa.
[0060] Non-limiting examples of computing devices include a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable array (FPA), a programmable logic unit (PLU), a microprocessor or any device which may execute instructions and respond. A central processing unit may implement an operating system (OS) or one or more software applications running on the OS. Further, the processing unit may access, store, manipulate, process and generate data in response to the execution of software. It will be understood by those skilled in the art that although a single processing unit may be illustrated for convenience of understanding, the processing unit may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the central processing unit may include a plurality of processors or one processor and one controller. Also, the processing unit may have a different processing configuration, such as a parallel processor.
[0061] The computer programs may also include or rely on stored data. The computer programs may encompass a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services, background applications, etc.
[0062] The computer programs may include: (i) descriptive text to be parsed, such as HTML (hypertext markup language) or XML (extensible markup language), (ii) assembly code, (iii) object code generated from source code by a compiler, (iv) source code for execution by an interpreter, (v) source code for compilation and execution by a just-in-time compiler, etc. As examples only, source code may be written using syntax from languages including C, C++, C#, Objective C, Haskell, Go, SQL, R, Lisp, Java®, Fortran, Perl, Pascal, Curl, OCaml, Javascript®, HTML5, Ada, ASP (active server pages), PHP, Scala, Eiffel, Smalltalk, Erlang, Ruby, Flash®, Visual Basic®, Lua, and Python®.
[0063] One example of a computing system 300 is described below in FIG. 9. The computing system 300 includes one or more processors 302, one or more computer-readable RAMs 304 and one or more computer-readable ROMs 306 on one or more buses 308. Further, the computing system 300 includes a tangible storage device 310 that may be used to store the operating system 320 and the review mapping system 100. Both the operating system 320 and the review mapping system 100 are executed by the processor 302 via one or more of the respective RAMs 304 (which typically include cache memory). The execution of the operating system 320 and/or the review mapping system 100 by the processor 302 configures the processor 302 as a special-purpose processor configured to carry out the functionalities of the operating system 320 and/or the review mapping system 100, as described above.
[0064] Examples of storage devices 310 include semiconductor storage devices such as ROM 306, EPROM, flash memory or any other computer-readable tangible storage device that may store a computer program and digital information.
[0065] Computing system 300 also includes a R/W drive or interface 312 to read from and write to one or more portable computer-readable tangible storage devices 326 such as a CD-ROM, DVD, memory stick or semiconductor storage device. Further, network adapters or interfaces 314, such as TCP/IP adapter cards, wireless Wi-Fi interface cards, 3G or 4G wireless interface cards, or other wired or wireless communication links, are also included in the computing system 300.
[0066] In one example embodiment, the review mapping system 100 may be stored in tangible storage device 310 and may be downloaded from an external computer via a network (for example, the Internet, a local area network, or another wide area network) and network adapter or interface 314.
[0067] Computing system 300 further includes device drivers 316 to interface with input and output devices. The input and output devices may include a computer display monitor 318, a keyboard 322, a keypad, a touch screen, a computer mouse 324, and/or some other suitable input device.
[0068] In this description, including the definitions mentioned earlier, the term ‘module’ may be replaced with the term ‘circuit.’ The term ‘module’ may refer to, be part of, or include processor hardware (shared, dedicated, or group) that executes code and memory hardware (shared, dedicated, or group) that stores code executed by the processor hardware. The term code, as used above, may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects.
[0069] Shared processor hardware encompasses a single microprocessor that executes some or all code from multiple modules. Group processor hardware encompasses a microprocessor that, in combination with additional microprocessors, executes some or all code from one or more modules. References to multiple microprocessors encompass multiple microprocessors on discrete dies, multiple microprocessors on a single die, multiple cores of a single microprocessor, multiple threads of a single microprocessor, or a combination of the above. Shared memory hardware encompasses a single memory device that stores some or all code from multiple modules. Group memory hardware encompasses a memory device that, in combination with other memory devices, stores some or all code from one or more modules.
[0070] In some embodiments, the module may include one or more interface circuits. In some examples, the interface circuits may include wired or wireless interfaces that are connected to a local area network (LAN), the Internet, a wide area network (WAN), or combinations thereof. The functionality of any given module of the present description may be distributed among multiple modules that are connected via interface circuits. For example, multiple modules may allow load balancing. In a further example, a server (also known as remote, or cloud) module may accomplish some functionality on behalf of a client module.
[0071] While only certain features of several embodiments have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the invention and the appended claims.
Claims:
We CLAIM:

1. A system for mapping one or more reviews to product images in an e-commerce retail environment, the system comprising:
a data module configured to access a product image;
an extraction module configured to extract one or more attributes and/or one or more features from the product image;
a localization module configured to assign a corresponding location to the one or more attributes and/or one or more features on the product image;
a review selection module configured to select one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features; and
a review mapping module configured to display the selected one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image.

2. The system of claim 1, wherein the extraction module is configured to extract the one or more attributes from the product image based on a trained extraction model.

3. The system of claim 2, wherein the localization module is configured to assign a corresponding location to the one or more attributes based on one or more boundary values defined by the trained extraction model and to assign a corresponding location to the one or more features based on pre-determined coordinates on the product image.

4. The system of claim 1, further comprising a review classification module configured to classify a review based on the one or more attributes and/or the one or more features, wherein the review selection module is configured to select a review as one of the one or more reviews based on one or more classes assigned to the review.

5. The system of claim 1, wherein the review selection module is configured to select a review as one of the one or more reviews if the review explicitly mentions an attribute and/or a feature and/or if the review is semantically related to an attribute and/or a feature.

6. The system of claim 1, wherein the review selection module is further configured to rank the one or more reviews based on a language score assigned to each review of the one or more reviews, and the review mapping module is configured to display the one or more reviews based on their ranking.

7. The system of claim 1, wherein the review mapping module is further configured to mark a location of the one or more reviews on the product image using a visual marker on the product image.

8. A system for mapping one or more reviews to product images in an e-commerce retail environment, the system comprising:
a data module configured to access a product image;
an attribute extraction module configured to extract one or more attributes from the product image;
an attribute localization module configured to assign a corresponding location to the one or more attributes on the product image;
a feature extraction module configured to extract one or more features from the product image;
a feature localization module configured to assign a corresponding location to the one or more features on the product image;
a review selection module configured to select one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features; and
a review mapping module configured to display the one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image.

9. The system of claim 8, wherein the attribute extraction module is configured to extract the one or more attributes from the product image based on a trained attribute extraction model, and the attribute localization module is configured to assign a corresponding location to the one or more attributes based on one or more boundary values defined by the trained attribute extraction model.

10. The system of claim 8, wherein the feature extraction module is configured to extract the one or more features from the product image based on a trained feature extraction model, and the feature localization module is configured to assign a corresponding location to the one or more features based on pre-determined coordinates on the product image.

11. The system of claim 8, wherein the review selection module is configured to select a review as one of the one or more reviews if the review explicitly mentions an attribute and/or a feature and/or if the review is semantically related to an attribute and/or a feature.

12. The system of claim 8, wherein the review selection module is further configured to rank the one or more reviews based on a language score assigned to each review of the one or more reviews, and the review mapping module is configured to display the one or more reviews based on their ranking.

13. The system of claim 8, wherein the review mapping module is further configured to mark a location of the one or more reviews on the product image using a visual marker on the product image.

14. A method for mapping one or more reviews to product images in an e-commerce retail environment, the method comprising:
accessing a product image;
extracting one or more attributes and/or one or more features from the product image;
assigning a corresponding location to the one or more attributes and/or the one or more features on the product image;
selecting one or more reviews corresponding to at least one attribute of the one or more attributes and/or at least one feature of the one or more features; and
displaying the one or more reviews proximate to a location of a corresponding attribute and/or a corresponding feature on the product image.

15. The method of claim 14, comprising extracting the one or more attributes and/or one or more features from the product image based on a trained extraction model.

16. The method of claim 15, comprising assigning a corresponding location to the one or more attributes based on one or more boundary values defined by the trained extraction model and/or assigning a location to the one or more features based on pre-determined coordinates on the product image.

17. The method of claim 14, further comprising classifying a review based on the one or more attributes and/or the one or more features, and selecting a review as one of the one or more reviews based on one or more classes assigned to the review.

18. The method of claim 14, further comprising selecting a review as one of the one or more reviews if the review explicitly mentions an attribute and/or a feature and/or if the review is semantically related to an attribute and/or a feature.

19. The method of claim 14, further comprising ranking the one or more reviews based on a language score assigned to each review of the one or more reviews, and displaying the one or more reviews based on their ranking.

20. The method of claim 14, further comprising marking a location of the one or more reviews on the product image using a visual marker on the product image.

Documents

Application Documents

# Name Date
1 202241054130-STATEMENT OF UNDERTAKING (FORM 3) [21-09-2022(online)].pdf 2022-09-21
2 202241054130-REQUEST FOR EXAMINATION (FORM-18) [21-09-2022(online)].pdf 2022-09-21
3 202241054130-POWER OF AUTHORITY [21-09-2022(online)].pdf 2022-09-21
4 202241054130-FORM 18 [21-09-2022(online)].pdf 2022-09-21
5 202241054130-FORM 1 [21-09-2022(online)].pdf 2022-09-21
6 202241054130-DRAWINGS [21-09-2022(online)].pdf 2022-09-21
7 202241054130-DECLARATION OF INVENTORSHIP (FORM 5) [21-09-2022(online)].pdf 2022-09-21
8 202241054130-COMPLETE SPECIFICATION [21-09-2022(online)].pdf 2022-09-21
9 202241054130-FER.pdf 2025-06-06
10 202241054130-RELEVANT DOCUMENTS [07-10-2025(online)].pdf 2025-10-07
11 202241054130-RELEVANT DOCUMENTS [07-10-2025(online)]-1.pdf 2025-10-07
12 202241054130-POA [07-10-2025(online)].pdf 2025-10-07
13 202241054130-POA [07-10-2025(online)]-1.pdf 2025-10-07
14 202241054130-FORM 13 [07-10-2025(online)].pdf 2025-10-07
15 202241054130-FORM 13 [07-10-2025(online)]-1.pdf 2025-10-07

Search Strategy

1 Search_HistoryE_05-12-2024.pdf