
System And Method To Facilitate Users To Try On Garments Virtually

Abstract: A system and method to facilitate a user to try on, virtually, a garment is disclosed, comprising a computing device to capture an image and/or video of a user body and a processing unit to identify body points associated with the user body based on the captured image and/or video. The processing unit is configured to select an image of the garment from garment images stored in a database on receipt of a selection input from the user, match garment points of the garment with the corresponding identified body points to combine the garment image onto the image of the user body and further generate a combined user-garment image of the selected garment image combined with the image of the user body. Further, the combined user-garment image is displayed on the computing device to enable the user to visualize how the selected garment will look when worn by the user.


Patent Information

Application #
202121028761
Filing Date
26 June 2021
Publication Number
52/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
info@khuranaandkhurana.com
Parent Application

Applicants

ARORA, Sanjeev
C 1602, Golden Willows, Vasant Garden, Mulund West, Mumbai - 400080, Maharashtra, India.

Inventors

1. ARORA, Sanjeev
C 1602, Golden Willows, Vasant Garden, Mulund West, Mumbai - 400080, Maharashtra, India.
2. SATAM, Kapil
C504 / M6, Pariwar Co Op Hos Soc Ltd, Pratiksha Nagar, Sion, Mumbai - 400022, Maharashtra, India.
3. LAMBHATE, Kishore
KL5/42/6, Sector No.3E, Kalamboli, Navi Mumbai - 410218, Maharashtra, India.

Specification

Claims:

1. A method to facilitate a user to try on a garment virtually, the method comprising:
capturing, through a computing device associated with a camera, at least one of an image and video of at least a portion of a body of a user;
identifying, through one or more processors, a plurality of body points associated with the body of the user based on the captured at least one of the image and video;
selecting, through the one or more processors, an image of the garment from a database comprising a plurality of garment images on receipt of a selection input from the user through the computing device;
matching, through the one or more processors, one or more garment points of the garment with corresponding one or more of the identified plurality of body points;
combining, through the one or more processors, the selected image of the garment onto the image of the at least a portion of the body of the user or onto a virtual three-dimensional model of the at least a portion of the body of the user;
generating, through the one or more processors, a combined user-garment image of the selected image of the garment combined with the image of the at least a portion of the body; and
displaying, through the computing device, the combined user-garment image to enable the user to visualize how the garment will look when worn by the user.

2. The method as claimed in claim 1, wherein the method comprises:
generating, through the one or more processors, a combined user-garment video of the selected image of the garment combined with the virtual three-dimensional model of the at least a portion of the body; and
displaying, through the computing device, the combined user-garment video to enable the user to visualize how the garment will look when worn by the user.

3. The method as claimed in claim 1, wherein the method comprises resizing, based on the identified plurality of body points, the selected image of the garment for defining a shape of the garment such that the garment fits onto the body of the user.

4. The method as claimed in claim 1, wherein the three-dimensional model of the at least a portion of the body is created based on the identified plurality of body points.

5. The method as claimed in claim 1, wherein the method comprises computing a height and a width of the at least a portion of the body of the user based on the identified plurality of body points, and wherein the identified plurality of body points are associated with two or more of a center hip, a spine, a center shoulder, head, a left shoulder, a left elbow, a left wrist, a left hand, a right shoulder, a right elbow, a right wrist, a right hand, a left hip, a left knee, a left ankle, a left foot, a right hip, a right knee, a right ankle, and a right foot.

6. The method as claimed in claim 1, wherein the computing device is any of a computer system, a smartphone, a tablet and a laptop.

7. The method as claimed in claim 1, wherein the method comprises:
selecting and displaying, through the computing device, another image of another garment from the database comprising the plurality of garment images on detection of movement of a right hand of the user; and
selecting and displaying, through the computing device, an image of a previously displayed garment from the database comprising the plurality of garment images on detection of movement of a left hand of the user.

8. A system comprising:
a computing device associated with a camera configured to capture at least one of an image and video of at least a portion of a body of a user; and
a processing unit operatively coupled to the computing device, the processing unit comprising one or more processors and a memory coupled to the one or more processors, the memory comprising instructions which, when executed by the one or more processors, cause the one or more processors to:
identify a plurality of body points associated with the body of the user based on the captured at least one of the image and video;
select an image of a garment from a database comprising a plurality of garment images on receipt of a selection input from the user through the computing device;
match one or more garment points of the garment with corresponding one or more of the identified plurality of body points;
combine the selected image of the garment onto the image of the at least a portion of the body of the user or onto a virtual three-dimensional model of the at least a portion of the body of the user;
generate a combined user-garment image of the selected image of the garment combined with the image of the at least a portion of the body; and
display, through the computing device, the combined user-garment image to enable the user to visualize how the garment will look when worn by the user.

9. The system as claimed in claim 8, wherein the processing unit is configured to generate a combined user-garment video of the selected image of the garment combined with the virtual three-dimensional model of the at least a portion of the body; and wherein the processing unit is configured to display, through the computing device, the combined user-garment video to enable the user to visualize how the garment will look when worn by the user.

10. The system as claimed in claim 8, wherein the processing unit is configured to resize, based on the identified plurality of body points, the selected image of the garment for defining a shape of the garment such that the garment fits onto the body of the user.
Description:

TECHNICAL FIELD
[0001] The present disclosure relates to garments. In particular, the present disclosure relates to a system and method to facilitate a user to try on, virtually, garments before purchasing to determine how the garments will look when actually applied to the body of the user.

BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Use of computer systems and computing devices, such as smartphones, tablets, etc., continues to increase at a rapid pace for shopping for articles online. Online shopping has enabled customers to browse huge inventories of products without leaving the comfort of their own home. While the online shopping experience has allowed customers to seamlessly browse products, analyzing articles such as clothing presents additional challenges. For example, during online purchasing of garments, users can specify a size, but they are unable to try the clothes on without first taking delivery of the clothes. A determination of whether a garment fits and/or suits the user/purchaser often cannot be made with regard to just the displayed or stated size of the garment.
[0004] Therefore, people often wish to be able to try on garments before purchasing them to determine the suitability of fit and how they appear. At present, to try garments on, the user must either go to clothing shops, as typical clothing shops provide dressing rooms, mirrors, and other services to help the user/customer select apparel items to purchase, or must wait for the garments to be delivered in the case of online shopping; both options take time and entail travel or delivery costs. In the case of online shopping, if the garments do not provide the desired look, they are returned to the online merchant. Thus, it would be helpful if the user could try the clothes on in some way without having to travel to a shop or wait to take delivery of the clothes.
[0005] There is, therefore, a need to provide an easy to use, efficient, and cost-effective system and method to facilitate a user to try on, virtually, a garment to determine how the garment will look when actually applied to the body of the user.

OBJECTS OF THE PRESENT DISCLOSURE
[0006] A general object of the present disclosure is to provide an efficient and cost-effective solution to the above-mentioned problems.
[0007] An object of the present disclosure is to provide a system and method for visually representing to a user the appearance of one or more garments if worn by that user, based on physical attributes of the user.
[0008] Another object of the present disclosure is to provide a simple and cost-effective system and method to facilitate a user to try on, virtually, a garment before purchasing to determine how the garment will look when actually applied to the body of the user.
[0009] Another object of the present disclosure is to provide an easy-to-use and efficient system and method which allow an individual to digitally try out different apparel, based on body parameters of the individual, to estimate how well the apparel suits that individual before the individual purchases it.

SUMMARY
[00010] Aspects of the present disclosure relate to garments. In particular, the present disclosure relates to a system and method to facilitate a user to try on, virtually, garments before purchasing to determine how the garments will look when actually applied to the body of the user.
[00011] In an aspect, the proposed method can include steps of capturing, through a computing device associated with a camera, at least one of an image and video of at least a portion of a body of a user and identifying, through one or more processors, a plurality of body points associated with the body of the user based on the captured at least one of the image and video. The proposed method can further include steps of selecting, through the one or more processors, an image of a garment from a database comprising a plurality of garment images on receipt of a selection input from the user through the computing device and matching, through the one or more processors, one or more garment points of the selected garment with corresponding one or more of the identified plurality of body points to combine the selected image of the garment onto the image of the at least a portion of the body of the user or onto a virtual three-dimensional model of the at least a portion of the body of the user.
[00012] In an aspect, the proposed method can further include a step of generating, through the one or more processors, a combined user-garment image of the selected image of the garment combined with the image of the at least a portion of the body for displaying, through the computing device, the combined user-garment image to enable the user to visualize how the selected garment will look when worn by the user.
[00013] In an embodiment, the method can include a step of generating, through the one or more processors, a combined user-garment video of the selected image of the garment combined with the virtual three-dimensional model of the at least a portion of the body for displaying, through the computing device, the combined user-garment video to enable the user to visualize how the selected garment will look when worn by the user.
[00014] In an embodiment, the method can include a step of resizing, based on the identified plurality of body points, the selected image of the garment for defining a shape of the selected garment such that the selected garment fits onto the body of the user.
[00015] In an embodiment, the three-dimensional model of the at least a portion of the body can be created based on the identified plurality of body points.
[00016] In an embodiment, the method can include a step of computing a height and a width of at least a portion of the body of a user based on the identified plurality of body points. The identified plurality of body points can be associated with two or more of a center hip, a spine, a center shoulder, head, a left shoulder, a left elbow, a left wrist, a left hand, a right shoulder, a right elbow, a right wrist, a right hand, a left hip, a left knee, a left ankle, a left foot, a right hip, a right knee, a right ankle, a right foot and the like.
[00017] In an embodiment, the method can include a step of selecting and displaying, through the computing device, another image of another garment from the database comprising the plurality of garment images on detection of movement of a right hand of the user, and selecting and displaying, through the computing device, an image of a previously displayed garment from the database comprising the plurality of garment images on detection of movement of a left hand of the user.
[00018] In an embodiment, the computing device can be any of a computer system, a smartphone, a tablet, a laptop and the like.
[00019] In another aspect of the present disclosure, the proposed system can include a computing device associated with a camera configured to capture at least one of an image and video of at least a portion of a body of a user, and a processing unit operatively coupled to the computing device. The processing unit can include one or more processors and a memory coupled to the one or more processors, the memory comprising instructions which, when executed by the one or more processors, cause the one or more processors to identify a plurality of body points associated with the body of the user based on the captured at least one of the image and video. The processing unit can be further configured to select an image of a garment from a database comprising a plurality of garment images on receipt of a selection input from the computing device, and match one or more garment points of the selected garment with corresponding one or more of the identified plurality of body points to combine the selected image of the garment onto the image of the at least a portion of the body of the user or onto a virtual three-dimensional model of the at least a portion of the body of the user.
[00020] In an embodiment, the processing unit can be further configured to generate a combined user-garment image of the selected image of the garment combined with the image of the at least a portion of the body for displaying, on the computing device, the combined user-garment image to enable the user to visualize how the selected garment will look when worn by the user.
[00021] In another embodiment, the processing unit can be configured to generate a combined user-garment video of the selected image of the garment combined with the virtual three-dimensional model of the at least a portion of the body for displaying the combined user-garment video to enable the user to visualize how the selected garment will look when worn by the user.
[00022] In another embodiment, the processing unit can be configured to resize, based on the identified plurality of body points, the selected image of the garment for defining a shape of the selected garment such that the selected garment fits onto the body of the user.
[00023] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF DRAWINGS
[00024] The accompanying drawings are included to provide a further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.
[00025] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[00026] FIG. 1 illustrates a network implementation of the proposed system that facilitates a user to try on, virtually, one or more garments before purchasing to determine how the garments will look when actually applied to the body of the user, in accordance with an embodiment of the present disclosure.
[00027] FIG. 2 illustrates exemplary functional components of the proposed system, in accordance with an embodiment of the present disclosure.
[00028] FIG. 3 illustrates exemplary garment extraction for storage in a database, in accordance with an embodiment of the present disclosure.
[00029] FIG. 4 illustrates a computing device showing a combined user-garment image to enable a user to visualize how a garment will look when worn by the user, in accordance with an embodiment of the present disclosure.
[00030] FIG. 5 illustrates an exemplary flow diagram of working of the proposed system, in accordance with an embodiment of the present disclosure.
[00031] FIG. 6 illustrates an exemplary block diagram of the proposed method, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[00032] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[00033] Embodiments of the present invention include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, firmware and/or by human operators.
[00034] Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, and semiconductor memories, such as ROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other types of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).
[00035] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
[00036] The use of any and all examples, or exemplary language (e.g., “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[00037] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).
[00038] While embodiments of the present invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the invention, as described in the claim.
[00039] Embodiments explained herein relate to a system and method to facilitate a user to try on, virtually, garments before purchasing to determine how the garments will look when actually applied to the body of the user.
[00040] FIG. 1 illustrates a network implementation, 100, of the proposed system that facilitates a user to try on, virtually, a garment before purchasing to determine how the garment will look when actually applied to the body of the user, in accordance with an embodiment of the present disclosure.
[00041] In an embodiment, the proposed system 102 is configured with one or more entities, for instance an entity/user 108 that can communicate with the system 102 using a computing device 106 operatively coupled to the system 102. The system 102 can facilitate the entity/user 108 to try on, virtually, a garment before purchasing to determine how the garment will look when actually applied to the body of the user. The system 102 can be implemented in any computing device operatively connected with a server 110. As illustrated, the system 102 can be communicatively coupled with the entity device 106 through a network 104. The entity device 106 is associated with the user/entity 108. The entity 108 can be, for example, a consumer.
[00042] In an embodiment, the system 102 can be implemented using any or a combination of hardware components and software components such as a cloud, a server, a computing system, a computing device, a network device and the like. Further, the system 102 can interact with the entity device 106 through a website or an application that can reside in the entity device 106. In an implementation, the system 102 can be accessed through a website or application that can be configured with any operating system, including but not limited to, Android™, iOS™, and the like. Examples of the computing device 106 can include, but are not limited to, a smart camera, a smart phone, a portable computer, a personal digital assistant, a handheld device and the like. In an exemplary embodiment, the computing device 106 is a smart phone having a camera/imaging device 112. In another embodiment, the camera 112 can be operatively coupled with the computing device 106. The camera 112 can be any of a digital camera, a standalone infrared camera, a thermal camera, a monochromatic camera and the like.
[00043] In an embodiment, the camera 112 can be used for capturing at least one of an image or a video of at least a portion, for example a top portion, a bottom portion or a full body portion, of a body of the user. For example, a length of the captured video may range from ten seconds to one minute. According to an embodiment, during pre-processing the system 102 can receive a set of data packets associated with the captured at least one of image or video from the computing device 106 and process the captured video and/or the captured image in order to determine/identify a plurality of body points associated with the body of the user based on the captured image and/or video. In an embodiment, the identified body points can be associated with two or more of a center hip, a spine, a center shoulder, head, a left shoulder, a left elbow, a left wrist, a left hand, a right shoulder, a right elbow, a right wrist, a right hand, a left hip, a left knee, a left ankle, a left foot, a right hip, a right knee, a right ankle, a right foot and the like.
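By way of illustration only, the following is a minimal sketch of this body-point identification step. It assumes MediaPipe Pose as the landmark detector and OpenCV for image handling; the disclosure does not name a specific pose-estimation library, and the point names follow the list above.

import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def identify_body_points(image_path):
    """Return a subset of the disclosed body points as (x, y) pixel coordinates."""
    image = cv2.imread(image_path)
    h, w = image.shape[:2]
    with mp_pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if not results.pose_landmarks:
        return {}  # no body detected in the captured frame
    lm = results.pose_landmarks.landmark
    names = {
        "left_shoulder": mp_pose.PoseLandmark.LEFT_SHOULDER,
        "right_shoulder": mp_pose.PoseLandmark.RIGHT_SHOULDER,
        "left_wrist": mp_pose.PoseLandmark.LEFT_WRIST,
        "right_wrist": mp_pose.PoseLandmark.RIGHT_WRIST,
        "left_hip": mp_pose.PoseLandmark.LEFT_HIP,
        "right_hip": mp_pose.PoseLandmark.RIGHT_HIP,
        "left_ankle": mp_pose.PoseLandmark.LEFT_ANKLE,
        "right_ankle": mp_pose.PoseLandmark.RIGHT_ANKLE,
    }
    # Landmarks are normalized; convert them to pixel coordinates.
    return {k: (int(lm[v].x * w), int(lm[v].y * h)) for k, v in names.items()}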
[00044] In an embodiment, the system 102 can be configured to, on receipt of selection input data from the user 108 through the computing device 106, select an image of the garment from the server 110. The server 110 may include a database which stores a plurality of garment images and any other relevant information thereof. The system 102 may be connected to the server 110 via a communication network such as the internet.
[00045] In an exemplary embodiment, the selection input data can be indicative of selecting the garment image. In another embodiment, the selection input data can include any or a combination of the gender of the user and one or more attributes of the garment including, but not limited to, garment material specification, material type, etc.
[00046] In an exemplary embodiment, the input from the user can be received in any form, such as acoustic input, speech input, tactile input, or any other input.
[00047] In an embodiment, the computing device 106 can be provided with a display that may present images and/or videos of garments and other images and videos, and/or a user interface. The user interface may be configured to receive input and/or a selection from the user. The user interface may detect gestures by the user, may detect eye movements by the user, may receive touch inputs from the user, may receive audio inputs from the user and the like.
[00048] In an embodiment, the computing device 106 can include an alphanumeric input device, e.g., a keyboard or a touch-sensitive display screen, a user interface navigation device, e.g., a mouse, a signal generation device, e.g., a speaker, a microphone etc.
[00049] In another embodiment, the display may be configured to present one or more garment items on the display. The display may retrieve images representing garment records received from the database.
[00050] In an embodiment, the system 102 can be configured for matching one or more garment points of the selected garment with corresponding one or more of the identified plurality of body points to combine the image of the selected garment onto the image of the portion of the body of the user or onto a virtual three-dimensional model of the portion of the body of the user. For example, if the selected garment is a shirt, the garment points can be associated with a shoulder portion of the garment, a hand portion of the garment, an elbow portion of the garment, a center shoulder portion of the garment, a spine portion of the garment, etc.
[00051] In an embodiment, the system 102 can be configured to generate a combined user-garment image of the image of the selected garment combined with the image of the portion of the body for displaying, at the computing device 106, the combined user-garment image to enable the user to visualize how the selected garment will look when worn by the user. Thus, the selected garment or clothing, for example a shirt, is rendered to appear as being worn, by rendering the garment accurately to the body points of the user.
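As an illustrative sketch of this matching-and-combining step (not the patented implementation itself), the garment can be warped so that three of its annotated anchor points, for example the two shoulder points and a centre-hip point, land on the corresponding identified body points, and then alpha-blended over the user image. The RGBA garment image and its anchor points are assumptions about the stored garment records.

import cv2
import numpy as np

def drape_garment(user_img, garment_rgba, garment_pts, body_pts):
    """garment_pts/body_pts: three corresponding (x, y) points each,
    e.g. left shoulder, right shoulder, centre hip."""
    # Affine transform that maps the garment anchors onto the body points.
    M = cv2.getAffineTransform(np.float32(garment_pts), np.float32(body_pts))
    h, w = user_img.shape[:2]
    warped = cv2.warpAffine(garment_rgba, M, (w, h))
    # Use the garment's alpha channel to composite it over the user image.
    alpha = warped[:, :, 3:4].astype(np.float32) / 255.0
    blended = warped[:, :, :3] * alpha + user_img * (1.0 - alpha)
    return blended.astype(np.uint8)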
[00052] In another embodiment, the system 102 can generate a combined user-garment video of the image of the selected garment combined with the virtual three-dimensional model of the portion of the body for displaying, at the computing device 106, the combined user-garment video to enable the user to visualize how the selected garment will look when worn by the user.
[00053] In an embodiment, the three-dimensional model of the at least a portion of the body can be created by the system 102 based on the identified plurality of body points of the user.
[00054] In an embodiment, the system 102 can resize the image of the selected garment for defining a shape of the garment such that the garment fits onto the portion of the body or the body of the user.
[00055] In an embodiment, the system 102 can be configured for computing a height and a width of the at least a portion of the body of a user based on the identified plurality of body points.
[00056] In an embodiment, the system 102 can facilitate selection and/or display, at the computing device 106, of another image of another garment from the database comprising the plurality of garment images on detection of movement of a right hand of the user. In an embodiment, the system 102 can facilitate selection and/or display, at the computing device 106, of an image of a previously displayed garment from the database comprising the plurality of garment images on detection of movement of a left hand of the user. The movement of the left hand and/or the right hand of the user can be detected by the user interface of the computing device 106.
[00057] Further, the network 104 can be a wireless network, a wired network or a combination thereof that can be implemented as one of the different types of networks, such as Intranet, Local Area Network (LAN), Wide Area Network (WAN), Internet, and the like. Further, the network 104 can either be a dedicated network or a shared network. The shared network can represent an association of the different types of networks that can use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like.
[00058] FIG. 2 illustrates exemplary functional components of the proposed system 102, in accordance with an embodiment of the present disclosure.
[00059] In an aspect, the system 102 may comprise one or more processor(s) 202. The one or more processor(s) 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any device that manipulates data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 204 of the system 102. The memory 204 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 204 may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[00060] The system 102 may also comprise an interface(s) 206. The interface(s) 206 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may facilitate communication of the system 102 with various devices coupled to the system 102. The interface(s) 206 may also provide a communication pathway for one or more components of the system 102. Examples of such components include, but are not limited to, a processing engine 208 and a database 210.
[00061] The processing engine(s) 208 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 208. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 208 may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 208 may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 208. In such examples, the system 102 may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the system 102 and the processing resource. In other examples, the processing engine(s) 208 may be implemented by electronic circuitry.
[00062] The database 210 may comprise data that can be either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 208, and/or other required predetermined parameters data /instructions/algorithms to be used by the processors/processing engine(s) 208.
[00063] In an exemplary embodiment, the processing engine(s) 208 may comprise an identification unit 212, a selection unit 214, a matching unit 216, a garment extraction unit 218 and other unit(s) 220. The other unit(s) 220 implement functionalities that supplement applications or functions performed by the processing engine(s) 208. The database 210 may serve, amongst other things, as a repository for storing data processed, received, and generated by one or more of the units.
[00064] It would be appreciated that the modules described are only exemplary modules, and any other module or submodule may be included as part of the system 102. These modules too may be merged or divided into super-modules or sub-modules as may be configured.
[00065] In an embodiment, the identification unit 212 can receive a data packet associated with an image and/or a video of a body of the user/entity from the computing device 106. For example, the received image and/or video can be sent/uploaded by the user, through the computing device 106, from a memory of the computing device 106. The identification unit 212 can be configured to perform pre-processing and/or image processing on the received image and/or video to identify a plurality of body points associated with the body of the user. In an embodiment, the identified body points can include two or more of a center hip, a spine, a center shoulder, a head, a left shoulder, a left elbow, a left wrist, a left hand, a right shoulder, a right elbow, a right wrist, a right hand, a left hip, a left knee, a left ankle, a left foot, a right hip, a right knee, a right ankle, a right foot and the like.
[00066] In an embodiment, the identified body points can be categorized into body types including full body portion, top body portion and bottom body portion by the identification unit 212. To categorize the identified body points into the body types, height and width of different body portions are calculated. For instance, for the full body portion, a right height can be calculated by measuring a distance from the right shoulder point to the right ankle point, a left height can be calculated by measuring a distance from the left shoulder point to the left ankle point, and a width can be calculated by measuring a distance from the right shoulder point to the left shoulder point.
[00067] In an embodiment, for instance, for the top body portion, a right height can be calculated by measuring a distance from the right shoulder point to the right hip point, a left height can be calculated by measuring a distance from the left shoulder point to the left hip point, and a width can be calculated by measuring a distance from the right shoulder point to the left shoulder point.
[00068] In an embodiment, for instance, for the bottom body portion, a right height can be calculated by measuring a distance from the right hip point to the right ankle point, a left height can be calculated by measuring a distance from the left hip point to the left ankle point, and a width can be calculated by measuring a distance from the right hip point to the left hip point.
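The three sets of measurements described in paragraphs [00066] to [00068] can be expressed directly as point-to-point distances. The sketch below assumes the body points are available as the (x, y) dictionary produced by the identification step; it is illustrative only.

import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def body_measurements(pts):
    """Heights and widths for the full, top and bottom body portions."""
    return {
        "full": {
            "right_height": dist(pts["right_shoulder"], pts["right_ankle"]),
            "left_height": dist(pts["left_shoulder"], pts["left_ankle"]),
            "width": dist(pts["right_shoulder"], pts["left_shoulder"]),
        },
        "top": {
            "right_height": dist(pts["right_shoulder"], pts["right_hip"]),
            "left_height": dist(pts["left_shoulder"], pts["left_hip"]),
            "width": dist(pts["right_shoulder"], pts["left_shoulder"]),
        },
        "bottom": {
            "right_height": dist(pts["right_hip"], pts["right_ankle"]),
            "left_height": dist(pts["left_hip"], pts["left_ankle"]),
            "width": dist(pts["right_hip"], pts["left_hip"]),
        },
    }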
[00069] In an embodiment, on receipt of a selection input data from the user through the computing device 106, the selection unit 214 can select an image of a garment, for example a shirt, from a plurality of garment images stored in the database 210. In an exemplary embodiment, the selection input data can be indicative of selecting a particular garment image from the plurality of garment images. In another embodiment, the selection input data can also include any or combination of gender of the user, one or more attributes of the garment including, but not limited to garment material specification, material type etc.
[00070] In an embodiment, the plurality of garment images can be associated with different garments categories including full garment, such as but not limited to suit, saree, sherwani, etc., top garment such as but not limited to shirt, t-shirt, jacket, etc., and bottom garment such as but not limited to trousers, jeans, etc. In an embodiment, the plurality of garment images can be of various garments with different patterns and colors.
[00071] In another embodiment, the selection unit 214 can generate one or more recommendations of garments from the plurality of garment images for the user based on the identified body points of the user. The recommended garment images can be displayed on the computing device 106.
[00072] In an exemplary embodiment, for example, as shown in FIG. 3, the garment, such as a shirt, draped on a mannequin with a single-color background can be photographed and digitized for storage in the database 210. The garment extraction unit 218 can extract the garment from the digitized picture. This involves marking out just the garment and removing all other areas such as the mannequin, the background, etc. The extracted garment image can be categorized and stored in the garment library database 210 for further use.
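One simple way to realize this extraction, sketched below for illustration, is chroma keying: pixels inside the single-color background range are made transparent, leaving only the garment. The HSV bounds shown assume a green backdrop; the disclosure does not specify the background color.

import cv2
import numpy as np

def extract_garment(photo_bgr, bg_lower=(35, 40, 40), bg_upper=(85, 255, 255)):
    """Remove a single-color (here: green) background; return an RGBA garment."""
    hsv = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2HSV)
    background = cv2.inRange(hsv, np.array(bg_lower), np.array(bg_upper))
    alpha = cv2.bitwise_not(background)  # garment pixels stay opaque
    b, g, r = cv2.split(photo_bgr)
    return cv2.merge((b, g, r, alpha))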
[00073] In an embodiment, the matching unit 216 can be configured for matching one or more garment points of the selected garment with corresponding one or more of the identified body points to map/combine the image of the selected garment onto the image of the user body or onto a virtual three-dimensional model of the body of the user. For example, if the selected garment is a shirt, the garment points can be associated with a shoulder point of the garment, a hand point of the garment, an elbow point of the garment, a center shoulder point of the garment, a spine point of the garment, etc.
[00074] In an embodiment, the matching unit 216 can generate the virtual three-dimensional body model of the user based on the identified body points and/or the calculated height and width. The matching unit 216 can enable display of the three-dimensional body model on the display of the computing device 106.
[00075] In an embodiment, the identified body points of the user image can be forwarded to a pose matching process. If the image matches an entry in a pose database, which can be associated with the database 210, the image is forwarded to a posture recognition process, from which the three-dimensional model of the user is created. If no match is found, the pose database is updated and the database matching process is repeated.
[00076] In an embodiment, the matching unit 216 can generate a combined user-garment image of the image of the selected garment combined with the image of the user body for displaying, on the computing device 106, the combined user-garment image to enable the user to visualize how the selected garment will look when worn by the user, as shown in FIG. 4. Thus, the selected garment or clothing, for example a shirt, is rendered to appear as being worn, by rendering the garment accurately to the body points of the user.
[00077] In another embodiment, the system 102 can generate a combined user-garment video of the image of the selected garment combined with the virtual three-dimensional model of the body of the user for displaying, on the computing device 106, the combined user-garment video to enable the user to visualize how the selected garment will look when worn by the user.
[00078] In an embodiment, the matching unit 216 can resize the image of the selected garment for defining a shape of the garment such that the garment fits onto the image of the body of the user. For instance, if the sleeve length of the garment in the garment image is shorter than the length of the user's arms in the image of the user body, the matching unit 216 can resize the image of the selected garment to enable accurate matching of the garment points with the corresponding user body points. Therefore, the selected garment or apparel is rendered to appear as being worn by the user, by rendering the garment accurately to the body points of the user.
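A minimal sketch of this resizing step follows; the scale factors are simple width and height ratios between the garment's and the user's measured dimensions, which is an assumption, since the disclosure does not prescribe a formula.

import cv2

def resize_garment(garment_rgba, garment_width, garment_height,
                   body_width, body_height):
    """Scale the garment so its width/height match the user's measurements."""
    sx = body_width / garment_width
    sy = body_height / garment_height
    return cv2.resize(garment_rgba, None, fx=sx, fy=sy,
                      interpolation=cv2.INTER_LINEAR)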
[00079] In an embodiment, the matching unit 216 can dynamically map the garment points to the body points of a new user with a different body size. This enables a seamless experience for users with varying heights and widths.
[00080] In an embodiment, the matching unit 216 can match the body points according to the garment category, and the garment is successfully draped on the user's three-dimensional body model or image.
[00081] In an embodiment, the matching unit 216 can facilitate selection and/or display of another image of another garment from the plurality of garment images on detection of movement of a right hand of the user. For example, if a red shirt image is displayed on the computing device, the matching unit 216 can facilitate selection and display of an image of a green shirt from the plurality of garment images on detection of movement of the right hand of the user in between the hip points of the user. In an embodiment, the matching unit 216 can facilitate selection and/or display, on the computing device 106, of an image of a previously displayed garment from the plurality of garment images on detection of movement of a left hand of the user. For example, if a green shirt image is displayed on the computing device, the matching unit 216 can facilitate display of an image of a red shirt, which was displayed before the green shirt, on detection of movement of the left hand of the user in between the hip points of the user. The change in location of the right hand/wrist from its starting point can be traced by the user interface of the computing device 106, from which the matching unit 216 can detect the user's intent to change the garment image. The selection unit 214 is then activated and fetches the next garment image from the database 210. Similarly, motion of the left hand can fetch the previous garment image from the database 210. Therefore, the user can interactively select garments from the garment library in the database 210 by moving a hand from left to right or vice versa.
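The hand-gesture selection described above can be sketched as follows: if the right wrist moves into the region between the two hip points, the next garment is fetched; if the left wrist does, the previous one is fetched. The point names and the "between the hips" test are assumptions for illustration.

def select_garment(index, garments, pts):
    """Return the updated garment index given the current body points."""
    left_x = pts["left_hip"][0]
    right_x = pts["right_hip"][0]
    lo, hi = min(left_x, right_x), max(left_x, right_x)

    def between_hips(p):
        return lo <= p[0] <= hi

    if between_hips(pts["right_wrist"]):
        index = (index + 1) % len(garments)   # next garment
    elif between_hips(pts["left_wrist"]):
        index = (index - 1) % len(garments)   # previous garment
    return index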
[00082] FIG. 5 illustrates an exemplary flow diagram of the working of the proposed system, in accordance with an embodiment of the present disclosure. In an embodiment, through the user interface of the computing device 106, the user can select a gender, then select a garment category from the garment library in the database 210, and further select a garment image. The garment image selected by the user is displayed on the computing device screen. Further, the user gets two options: a try-on option and a garment visualization option. If the user chooses the try-on option by clicking a try-on button on the user interface of the computing device, the camera/webcam is initialized and an image or video of the user's body is captured. Body points of the user's body are then determined. Further, the selected garment image is fetched from the garment library and resized according to the determined body points of the user, and the resized garment image is draped onto the user image or onto the three-dimensional body model of the user. If the selected garment fits and suits the user, the user can proceed to purchase the garment after adding it to the cart. If the selected garment does not fit or suit the user, the step of selecting the garment category is repeated.
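For completeness, the sketch below ties the FIG. 5 try-on flow together, reusing the illustrative helpers sketched earlier (identify_body_points, drape_garment); it is a sketch of the flow under those assumptions, not the patented implementation.

import cv2

def midpoint(p, q):
    return ((p[0] + q[0]) // 2, (p[1] + q[1]) // 2)

def try_on(image_path, garment_rgba, garment_pts):
    """Capture -> body points -> drape the selected garment onto the image."""
    pts = identify_body_points(image_path)
    if not pts:
        return None  # no body detected; the capture step is repeated
    user_img = cv2.imread(image_path)
    body_anchor = [pts["left_shoulder"], pts["right_shoulder"],
                   midpoint(pts["left_hip"], pts["right_hip"])]
    return drape_garment(user_img, garment_rgba, garment_pts, body_anchor)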
[00083] In another embodiment, if the user chooses the garment visualization option by clicking on the selected garment displayed on the user interface of the computing device, a garment-specific page can open, and four standard garment images can be displayed on the computing device. Further, the user can zoom into the images to view them at a larger size. In addition, detailed information about the garment can be displayed on the computing device.
[00084] FIG. 6 illustrates a flow diagram of the proposed method that facilitates a user to try on, virtually, one or more garments before purchasing to determine how the garments will look when actually applied to the body of the user, in accordance with an embodiment of the present disclosure.
[00085] In an aspect, the proposed method may be described in the general context of computer-executable instructions. Generally, computer-executable instructions include routines, programs, objects, components, data structures, procedures, modules, functions, etc. that perform particular functions or implement particular abstract data types. The method can also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer-executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[00086] The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method or alternate methods. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method may be considered to be implemented in the above-described system.
[00087] In an embodiment, the method 600 can include, at a step 602, capturing at least one of an image and video of at least a portion of a body of a user through a computing device associated with a camera, and at a step 604, identifying a plurality of body points associated with the body of the user based on the captured at least one of the image and video through one or more processors. Further, the method 600 can include, at a step 606, selecting an image of a garment from a database comprising a plurality of garment images on receipt of a selection input from the user through the computing device, and at a step 608, matching one or more garment points of the selected garment with corresponding one or more of the identified plurality of body points through the one or more processors.
[00088] Further, the method 600 can include at a step 610 combining the selected image of the garment onto the image of the at least a portion of the body of the user or onto a virtual three-dimensional model of the at least a portion of the body of the user, through the one or more processors, at a step 612 generating a combined user-garment image of the selected image of the garment combined with the image of the at least a portion of the body through the one or more processors, and at a step 614 displaying, through the computing device, the combined user-garment image to enable the user to visualize how the selected garment will look when worn by the user.
[00089] In an embodiment, the method 600 can include a step of generating, through the one or more processors, a combined user-garment video of the selected image of the garment combined with the virtual three-dimensional model of the at least a portion of the body and a step of displaying, through the computing device, the combined user-garment video to enable the user to visualize how the selected garment will look when worn by the user.
[00090] In an embodiment, the method 600 can include a step of resizing, based on the identified plurality of body points, the selected image of the garment for defining a shape of the selected garment such that the selected garment fits onto the body of the user.
[00091] In an embodiment, the method 600 can include a step of computing a height and a width of at least a portion of the body of a user based on the identified plurality of body points.
[00092] In an embodiment, the method 600 can include steps of selecting and/or displaying, through the computing device, another image of another garment from the database comprising the plurality of garment images on detection of movement of a right hand of the user, and selecting and/or displaying, through the computing device, an image of a previously displayed garment from the database comprising the plurality of garment images on detection of movement of a left hand of the user.
[00093] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named manufacturer.
[00094] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE PRESENT DISCLOSURE
[00095] The present disclosure provides a system and method for visually representing to a user the appearance of one or more garments if worn by that user, based on physical attributes of the user.
[00096] The present disclosure provides a simple and cost-effective system and method to facilitate a user to try on, virtually, a garment before purchasing to determine how the garment will look when actually applied to the body of the user.
[00097] The present disclosure provides an easy-to-use and efficient system and method which allow an individual to digitally try out different apparel, based on body parameters of the individual, to estimate how well the apparel suits that individual before the individual potentially purchases it.

Documents

Application Documents

# Name Date
1 202121028761-STATEMENT OF UNDERTAKING (FORM 3) [26-06-2021(online)].pdf 2021-06-26
2 202121028761-FORM 1 [26-06-2021(online)].pdf 2021-06-26
3 202121028761-DRAWINGS [26-06-2021(online)].pdf 2021-06-26
4 202121028761-DECLARATION OF INVENTORSHIP (FORM 5) [26-06-2021(online)].pdf 2021-06-26
5 202121028761-COMPLETE SPECIFICATION [26-06-2021(online)].pdf 2021-06-26
6 202121028761-Proof of Right [27-08-2021(online)].pdf 2021-08-27
7 202121028761-FORM-26 [27-08-2021(online)].pdf 2021-08-27
8 Abstract1..jpg 2021-12-09
9 202121028761-FORM 18 [23-06-2025(online)].pdf 2025-06-23