
Rendering Wearable Items In A Digital Environment

Abstract: In an embodiment, a method of rendering wearable items in a digital environment is disclosed. The method comprises creating a 3D model of a wearable item based on one or more predetermined techniques. The method further comprises capturing motion data associated with a body part of an individual. Further, the method comprises rendering, in real-time in the digital environment, the 3D model of the wearable item on one of: (a) the body part of the individual and (b) a 3D model of the body part of the individual, based on the motion data.


Patent Information

Application #: 201821004657
Filing Date: 07 February 2018
Publication Number: 02/2020
Publication Type: INA
Invention Field: BIO-MEDICAL ENGINEERING
Status:
Email: mail@lexorbis.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-03-15
Renewal Date:

Applicants

SHILPMIS TECHNOLOGIES PVT. LTD.
SHILP MAITRI HOUSE, BHATAR CHAR RASTA, BESIDE GEETHA RESTAURANT, SURAT-395 007, GUJARAT, INDIA.

Inventors

1. SHILPMIS TECHNOLOGIES PVT. LTD.
SHILP MAITRI HOUSE, BHATAR CHAR RASTA, BESIDE GEETHA RESTAURANT, SURAT-395 007, GUJARAT, INDIA.

Specification

CROSS-REFERENCE

The present specification is a cognate specification of IN Application Nos. 201821004657, filed on 7 February 2018, and 201821004783, filed on 8 February 2018.

FIELD OF THE INVENTION

The present subject matter relates to rendering digital content in a digital environment and, more particularly, relates to rendering wearable items in a digital environment.

BACKGROUND

With advancement in technology, individuals nowadays are provided with options to view and/or perform virtual/digital trials of wearable objects, such as jewelry items and accessory items. In such virtual trials, an individual interacts with the wearable objects in a virtual or digital environment, to gain better understanding of various aesthetic characteristics of such wearable objects.
Typical methods of facilitating such viewing and/or virtual/digital trials involve rendering a 3D model of a wearable item over an interface, such as a web browser. The individual is typically provided with an option of rotating or zooming the 3D model to understand the aesthetic characteristics of the corresponding wearable item. Although a 3D model of the wearable item is rendered, the interface through which such a model is viewed renders, in effect, only a static view. For instance, at a given instant, the web browser renders a static or 2D effect in relation to the 3D model of the wearable item. Owing to such rendering of the 3D model of the wearable item, the individual may gain an inadequate understanding of the aesthetic characteristics of the wearable item. This may further lead to a less immersive experience for the individual.
In another conventional method, the 3D models of the wearable items are rendered on body parts of pre-stored digital models, for example, mannequins. Again, such rendering results in limited or low quality aesthetics related to the wearable items and the body parts. Thus, the individual may gain an inadequate understanding of the aesthetic characteristics of the wearable items with respect to the corresponding body part.
Thus, there is a need for a solution that overcomes the above deficiencies.

SUMMARY

This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention, nor is it intended for determining the scope of the invention.
In an embodiment, a method of rendering wearable items in a digital environment is disclosed. The method comprises creating a 3D model of a wearable item based on one or more predetermined techniques. The method further comprises capturing motion data associated with a body part of an individual. Further, the method comprises rendering, in real-time in the digital environment, the 3D model of the wearable item on one of: (a) the body part of the individual and (b) a 3D model of the body part of the individual, based on the motion data.

In another embodiment, a system for rendering wearable items in a digital environment is disclosed. The system comprises a processor and a motion sensing module coupled to the processor. The motion sensing module is configured to receive motion data associated with a body part of an individual. The system further includes a rendering module coupled to the processor. The rendering module is configured to create a 3D model of a wearable item based on one or more predetermined techniques. Further, the rendering module is configured to render, in real-time in the digital environment, the 3D model of the wearable item on one of: (a) the body part of the individual and (b) a 3D model of the body part of the individual, based on the motion data.
To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 illustrates a schematic block diagram of a system for rendering wearable items in a digital environment, according to an embodiment of the present subject matter;
Figure 2 illustrates a method of rendering wearable items in a digital environment, according to an embodiment of the present subject matter; and
Figure 3(a)-(d) illustrates an example use case of rendering wearable items in a digital environment, according to an embodiment of the present subject matter.
Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not necessarily have been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

DETAILED DESCRIPTION OF FIGURES

For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are explanatory of the invention and are not intended to be restrictive thereof.
Reference throughout this specification to “an aspect”, “another aspect” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components proceeded by "comprises... a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.
Figure 1 illustrates a schematic block diagram of a system 100, according to an embodiment of the present subject matter. According to aspects of the present subject matter, the system 100 may be implemented for rendering wearable items in a digital environment. Without limitation, examples of the wearable items may include jewelry items, accessory items, watches, and the like. Without limitation, examples of the digital environment may include a Virtual Reality (VR) environment, an Augmented Reality (AR) environment, and a combination thereof. The digital environment mentioned herein may be rendered on or using a computing device. Examples of the computing device may include, but are not limited to, a smartphone, a tablet, a laptop, a desktop, a server, a cloud server, a standalone VR device, a standalone AR device, and a standalone mixed reality device. Furthermore, one or more of the aforementioned computing devices may operate with each other to implement the aspects of the present subject matter, as described herein.
In an example, the system 100 may facilitate selection, by an individual, of a wearable item from amongst a plurality of wearable items. Post selection of the wearable item, the system 100 is configured to render, in real-time in a digital environment, a 3D model of the wearable item. In an implementation where the system 100 is implemented in a computing device operating in an AR mode, the system 100 renders the 3D model of the wearable item on a body part of the individual. In another implementation where the system 100 is implemented in a computing device operating in a VR mode or a mixed reality mode, the system 100 renders the 3D model of the wearable item on a 3D model of the body part of the individual.
In an example implementation, the system 100 includes a processor 102, memory 104, a motion tracking module 106, a rendering module 108, an updation module 110, and data 112. The motion tracking module 106, the rendering module 108, and the updation module 110 are coupled to the processor 102. In an embodiment, one or more of the aforementioned components of the system 100 may be implemented in a single computing device. In another embodiment, one or more of the aforementioned components of the system 100 may be implemented in a distributed manner across more than one computing device. As an example, the motion tracking module 106 and the updation module 110 may be implemented on a single computing device and the rendering module 108 may be implemented on a server or a cloud server.
The processor 102 can be a single processing unit or a number of units, all of which could include multiple computing units. The processor 102 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, graphical processing units, neural processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 102 is configured to fetch and execute computer-readable instructions and data stored in the memory 104.
The memory 104 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
In an example, the motion tracking module 106, the rendering module 108, and the updation module 110, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement data types. The motion tracking module 106, the rendering module 108, and the updation module 110 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions. Further, the motion tracking module 106, the rendering module 108, and the updation module 110 can be implemented in hardware, in instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, such as the processor 102, a state machine, a logic array, or any other suitable device capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks or, alternatively, the processing unit can be dedicated to performing the required functions. In another aspect of the present disclosure, the motion tracking module 106, the rendering module 108, and the updation module 110 may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the described functionalities.
The data 112 serves, amongst other things, as a repository for storing data processed, received, and generated by one or more of the processor 102, the motion tracking module 106, the rendering module 108, and the updation module 110.
As mentioned above, the system 100 is configured to render wearable items in a digital environment. Accordingly, in an implementation, the rendering module 108 is configured to display a catalogue of a plurality of wearable items to the individual. The display herein may include a display of a computing device, such as a smartphone, a tablet, a personal digital assistant (PDA), and the like. Other examples of the display may include a display screen of a VR device, an AR device, and the like. Furthermore, the display may include an LCD or an LED television. The catalogue may include various categories under which the wearable items may be grouped. From the catalogue, the individual seeking to purchase a wearable item may select an appropriate category corresponding to the wearable item. Upon receiving the selection of the category, the rendering module 108 may display all the wearable items classified under the category. Subsequently, the individual may select a wearable item from the wearable items displayed in the category.
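By way of illustration only, the catalogue flow described above could be organized as in the minimal Python sketch below. The category names, item names, and the function items_in_category are placeholders assumed for the example and are not part of the specification.

```python
# Hypothetical sketch of the catalogue flow: wearable items grouped under
# categories, from which an individual first picks a category and then an
# item. All names and data here are placeholders for illustration.
from typing import List

catalogue = {
    "rings": ["solitaire ring", "band ring"],
    "watches": ["analog watch", "smart watch"],
}


def items_in_category(category: str) -> List[str]:
    """Return the wearable items classified under the selected category."""
    return catalogue.get(category, [])


# Example: the individual selects the "rings" category and then an item.
available = items_in_category("rings")
selected_item = available[0] if available else None
```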
In an implementation, upon receiving the selection of the wearable item from the user, the rendering module 108 is configured to create a 3D model of the wearable item based on one or more predetermined techniques. As may be gathered, the rendering module 108 may select the predetermined technique based on the digital environment in which the 3D model of the wearable item is to be rendered. Thus, the 3D model of the wearable item may be one of a VR 3D model, an AR 3D model, a mixed reality 3D model.
In an implementation, a camera 114 is configured to capture a plurality of images of a body part of the individual. In an example, the rendering module 108 is configured to cause display of a notification on the display. Via the notification, the individual is notified to appropriately place a body part within a predefined region in vicinity to the camera 114. As may be gathered, the rendering module 108 selects the body part depending upon a type of the wearable item selected by the individual. For instance, in a case where the wearable item is a ring, the rendering module 108 may cause display of a notification to place a hand in vicinity to the camera 114. In another implementation, the individual may himself place his body part in vicinity to the camera 114, upon viewing a prompt related thereto.
In an implementation, the rendering module 108 is configured to receive the plurality of images from the camera 114. Subsequent to receiving the images, the rendering module 108 is configured to create a 3D model of the body part based on the images and a texture mapping technique. As would be appreciated, the 3D model of the body part of the individual would be different than pre-stored 3D models of body parts. Examples of the 3D model of the body part may include but are not limited to a VR 3D model and a mixed reality 3D model.
As a part of implementing the texture mapping technique, the rendering module 108 is configured to de-noise the plurality of images. Post de-noising of the images, the rendering module 108 is configured to determine a texture of the body part based on the images. For determining the texture, for each of the images, the rendering module 108 at first reads said image, followed by segmenting of said image. Thereafter, the rendering module 108 removes one or more noise elements from said image. Subsequently, the rendering module 108 detects a center of the body part in said image using a distance transform. Further, the rendering module 108 detects a threshold point related to the body part in said image. Upon detecting the threshold point, the rendering module 108 crops an area of the body part in said image based on the threshold point. Post cropping, the rendering module 108 draws an average circle relative to the body part in said image based on a predetermined ratio. Thereafter, the rendering module 108 is configured to read the texture of an image section relative to the average circle. Post reading of the texture, the rendering module 108 removes the image section and detects another image section associated with the body part in said image. Subsequently, the rendering module 108 reads the texture in the other image section associated with the body part.
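For illustration, a minimal sketch of one way this per-image texture extraction pipeline could be implemented is given below, using OpenCV and NumPy. The use of Otsu thresholding for segmentation, a morphological opening for noise removal, and the value of CIRCLE_RATIO (the "predetermined ratio") are assumptions; the specification does not prescribe these particulars, and extract_texture is a hypothetical helper name.

```python
# Illustrative sketch only: one possible reading of the per-image texture
# extraction steps described above, using OpenCV and NumPy.
import cv2
import numpy as np

CIRCLE_RATIO = 0.5  # assumed "predetermined ratio" for the average circle


def extract_texture(image_path: str) -> np.ndarray:
    """Return a texture patch sampled around the centre of the body part."""
    image = cv2.imread(image_path)                      # read said image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Segment the body part from the background (simple Otsu threshold here).
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Remove small noise elements with a morphological opening.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # Detect the centre of the body part using a distance transform.
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    _, max_val, _, centre = cv2.minMaxLoc(dist)

    # Crop an area around the body part based on a threshold point
    # (here, the largest inscribed radius found by the distance transform).
    radius = int(max_val)
    x, y = centre
    crop = image[max(0, y - radius):y + radius, max(0, x - radius):x + radius]

    # Draw an "average circle" and sample the texture inside it.
    circle_r = int(radius * CIRCLE_RATIO)
    sample_mask = np.zeros(crop.shape[:2], np.uint8)
    cv2.circle(sample_mask, (crop.shape[1] // 2, crop.shape[0] // 2),
               circle_r, 255, -1)
    texture = cv2.bitwise_and(crop, crop, mask=sample_mask)
    return texture
```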
Upon determining the texture, the rendering module 108 is configured to apply the texture on a pre-stored 3D model to create the 3D model of the body part of the individual.
In an implementation, a motion tracker 116 is configured to detect the body part of the individual. For instance, when the individual brings the body part in vicinity to the motion tracker 116, the motion tracker 116 detects the body part. Subsequently, the motion tracker 116 monitors a motion of the body part of the individual. Accordingly, the motion tracker 116 provides motion data indicative of movement of the body part of the individual to the system 100. In an implementation, the motion tracking module 106 is configured to receive motion data associated with the body part of the individual. The motion tracking module 106 is configured to identify the movements of the body part based on the motion data and provide data related to the movement to the rendering module 108.
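A minimal sketch of how the motion tracking module 106 might receive motion data from the motion tracker 116 and derive movement information for the rendering module 108 is given below. The MotionSample structure, the on_sample interface, and the coordinate convention are illustrative assumptions; the specification does not fix a particular motion-tracker API.

```python
# Hypothetical sketch: the motion tracking module receives motion samples
# and reports the movement of the body part since the previous sample.
from dataclasses import dataclass
from typing import Optional


@dataclass
class MotionSample:
    timestamp: float
    x: float  # position of the tracked body part in tracker coordinates
    y: float
    z: float


class MotionTrackingModule:
    def __init__(self) -> None:
        self._last: Optional[MotionSample] = None

    def on_sample(self, sample: MotionSample) -> dict:
        """Receive motion data and return the movement since the last sample."""
        if self._last is None:
            self._last = sample
            return {"dx": 0.0, "dy": 0.0, "dz": 0.0}
        delta = {
            "dx": sample.x - self._last.x,
            "dy": sample.y - self._last.y,
            "dz": sample.z - self._last.z,
        }
        self._last = sample
        return delta  # forwarded to the rendering module
```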
In an implementation, the rendering module 108 is configured to render, in real-time in the digital environment, the 3D model of the wearable item on the 3D model of the body part of the individual, based on the motion data. As an example, the individual may rotate his body part. Accordingly, the rendering module 108 is configured to rotate the 3D model of the wearable item along with the rotation of the body part of the individual.
The aforementioned description relates to an example embodiment where the system 100 is implemented using one or more computing devices operating in a VR mode or a mixed reality mode. In another embodiment, where the system 100 is implemented using one or more computing devices operating in an AR mode, the rendering module 108 is configured to render, in real-time in the digital environment, the 3D model of the wearable item on the body part of the individual. In said embodiment, the system 100 renders a live feed of the body part of the individual, where the 3D model of the wearable item is superimposed over the body part in the live feed. In another example, a recorded feed of the body part may be used. In said example, the 3D model of the wearable item is superimposed on the body part of the individual as visible in the recorded feed.
In an example, the individual may try out multiple wearable items. On each trial, the system 100 is configured to store data related to the trial as historic data. Accordingly, in an implementation, the system 100 is configured to provide the individual with an option to compare different trials of the wearable items simultaneously. In said implementation, the rendering module 108 is configured to receive a user input to view historic data associated with rendering of other wearable items. In said implementation, a category of the other wearable items is the same as a category of the wearable item. Upon receiving the user input, the rendering module 108 is configured to render a digital image of at least one other wearable item simultaneously with the rendering of the 3D model of the wearable item. Thus, the individual is able to view a plurality of wearable items simultaneously. In one example, the rendering module 108 may display the 3D models of the wearable items on the body part or the 3D model of the body part in a split screen mode.
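As an illustration only, the historic trial data and the category-based comparison could be organized as sketched below. The storage layout and the function names record_trial and comparable_trials are assumptions for the example.

```python
# Hypothetical sketch: each trial is stored as historic data so earlier
# trials in the same category can be rendered alongside the current one
# (for example, in a split screen mode).
from typing import Dict, List

trial_history: List[Dict] = []


def record_trial(item_name: str, category: str, snapshot_path: str) -> None:
    """Store data related to a completed trial as historic data."""
    trial_history.append(
        {"item": item_name, "category": category, "snapshot": snapshot_path}
    )


def comparable_trials(category: str) -> List[Dict]:
    """Return earlier trials whose category matches the current wearable item."""
    return [t for t in trial_history if t["category"] == category]
```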
In an implementation, the system 100 facilitates the individual to customize the wearable item. In said implementation, the updation module 110 is configured to receive a user input for customizing an attribute of the wearable item. The attribute of the wearable item, as would be gathered, is related to the type of the wearable item. For instance, in case of a ring, the attribute may include a type of material of the ring, a size and shape of a stone to be used with the ring, and the like. Upon receiving the user input, the updation module 110 is configured to provide one or more customization options associated with the attribute of the wearable item. As may be gathered, the customization options are provided on the display, in one example. Subsequently, the updation module 110 is configured to receive a selection of at least one customization option from the one or more customization options, as selected by the individual. Subsequently, the updation module 110 is configured to update the attribute of the wearable item based on the at least one customization option.
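A hedged sketch of the customization flow handled by the updation module 110 is shown below. The attribute names and option lists are examples drawn from the ring discussion above, and the function customize is a hypothetical helper; neither is prescribed by the specification.

```python
# Hypothetical sketch of customizing an attribute of the selected wearable
# item from a list of customization options presented to the individual.
from typing import Dict, List

wearable_item: Dict[str, str] = {
    "type": "ring", "material": "gold", "stone_shape": "round",
}

customization_options: Dict[str, List[str]] = {
    "material": ["gold", "platinum", "silver"],
    "stone_shape": ["round", "oval", "princess"],
}


def customize(item: Dict[str, str], attribute: str, choice: str) -> Dict[str, str]:
    """Update an attribute of the wearable item from the available options."""
    if choice not in customization_options.get(attribute, []):
        raise ValueError(f"{choice!r} is not an available option for {attribute}")
    item[attribute] = choice
    return item  # the rendering module then re-renders the updated 3D model
```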
In an implementation, the rendering module 108 is configured to update the rendering of the 3D model of the wearable item based on a direction of movement of the body part of the individual. At first, the rendering module 108 detects, for example, using the motion data, the direction of movement of the body part. Once the direction is detected, in said implementation, the rendering module 108 may access a pre-stored mapping table stored in the data 112. The pre-stored mapping table includes a mapping between a plurality of directions and a plurality of pre-defined actions in respect of the 3D model of the wearable item and/or the body part/3D model of the body part. The rendering module 108 identifies the pre-defined action corresponding to the detected direction and subsequently performs the identified pre-defined action to update the rendering of the 3D model of the wearable item. As an example, if the individual moves his body part in the upward direction, the rendering module 108 is configured to zoom in the 3D model of the wearable item and/or the body part/3D model of the body part.
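For illustration, the pre-stored mapping table and the direction-to-action lookup could be sketched as follows. The concrete directions and actions are assumptions extrapolated from the "move up, zoom in" example above; the specification leaves the table contents open, and ACTION_TABLE and update_rendering are hypothetical names.

```python
# Hypothetical sketch of the pre-stored mapping table between directions of
# movement of the body part and pre-defined rendering actions.
from typing import Optional

ACTION_TABLE = {
    "up": "zoom_in",
    "down": "zoom_out",
    "left": "rotate_left",
    "right": "rotate_right",
}


def update_rendering(dx: float, dy: float) -> Optional[str]:
    """Map the dominant direction of movement to a pre-defined action."""
    if abs(dy) >= abs(dx):
        direction = "up" if dy > 0 else "down"
    else:
        direction = "right" if dx > 0 else "left"
    return ACTION_TABLE.get(direction)
```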
As may be gathered from above, the system 100 provides an improved technique for facilitating digital/virtual trials of wearable items. By implementing the texture mapping technique, as described herein, the system 100 increases the accuracy of rendering the 3D model corresponding to the body part of the individual. In other words, as the texture of the individual's body part is captured and applied onto the pre-stored 3D model, a realistic rendering of the body part is achieved digitally. With subsequent superimposition of the 3D model of the wearable item on the 3D model of the body part, the individual is able to gain a better understanding of the aesthetics of the wearable item on the corresponding body part. Furthermore, by facilitating digital trials of wearable items, a vendor of wearable items need not maintain the entire stock of wearable items in a single place. Thus, in places where security is a concern, the vendor may choose to implement the system 100 and store his stock, partially or completely, at a safe location.
Figure 2 illustrates a method 200 of rendering wearable items in a digital environment, according to an embodiment of the present subject matter. The method 200 may be implemented in the system 100, using components thereof, as described above. Further, for the sake of brevity, details of the present subject matter that are explained in detail with reference to the description of Figure 1 above are not explained in detail herein.
At step 202, a 3D model of a wearable item is created based on one or more predetermined techniques. In an example, an individual seeking to do a virtual or digital trial of wearable items may be presented with a plurality of wearable items. The individual may then select a wearable item from the plurality of wearable items. Upon receiving the selection of the wearable item from the individual, the 3D model of the wearable item is created based on the one or more predetermined techniques. In one example, the predetermined techniques include existing image processing techniques for creating 3D models.
At step 204, motion data associated with a body part of the individual is captured. In addition to the creation of the 3D model of the wearable item in the above step, data related to a body part of the individual is also captured. The data that is captured depends on whether the 3D model of the wearable item is to be rendered on the body part or on the 3D model of the body part of the individual. In a case where the method is implemented using a computing device(s) operating in an AR mode, data related to a live feed of the body part may be captured. In another implementation where the method is implemented on a computing device(s) operating in a VR mode or a mixed reality mode, a plurality of images of the body part of the individual is captured for creating a 3D model of the body part of the individual, as described above with reference to Figure 1.
Continuing with the above step, in an implementation, at first, the body part of the individual is detected. Post detection of the body part, motion data indicative of movement of the body part is recorded or captured.
At step 206, the 3D model of the wearable item is rendered in real-time in a digital environment on one of the body part and the 3D model of the body part of the individual, based on the motion data. In an example, the rendering of the 3D model of the wearable item is performed using at least one of a VR device, an AR device, a computing device, a monitor, and a television. Accordingly, in an example, the 3D model of the body part of the individual is one of a VR 3D model and a mixed reality 3D model. Furthermore, the 3D model of the wearable item is one of a VR 3D model, an AR 3D model, and a mixed reality 3D model.
In an implementation, a user input to view historic data associated with rendering of other wearable items is received. Herein, a category of the other wearable items is the same as a category of the wearable item. In said implementation, a digital image of at least one other wearable item is rendered simultaneously with the rendering of the 3D model of the wearable item. Thus, the individual is able to view trials of multiple 3D wearable items simultaneously.
In an implementation, customization of the wearable item is also facilitated. In said implementation, a user input for customizing an attribute of the wearable item is received. The attribute of the wearable item, as would be gathered, is related to the type of the wearable item. For instance, in case of a ring, the attribute may include a type of material of the ring, a size and shape of a stone to be used with the ring, and the like. Upon receiving the user input, one or more customization options associated with the attribute of the wearable item are provided to the individual. As may be gathered, the customization options are provided on the display, in one example. Subsequently, a selection of at least one customization option from the one or more customization options, as selected by the individual, is received. Accordingly, the attribute of the wearable item is updated based on the at least one customization option.
In an implementation, the rendering of the 3D model of the wearable item may be updated based on a direction of movement of the body part of the individual. In said implementation, the direction of movement of the body part is detected. Accordingly, using a pre-stored mapping table, a pre-defined action corresponding to the detected direction is performed. The pre-defined action is in respect of the 3D model of the wearable item and/or the body part/3D model of the body part. As an example, if the individual moves his body part in the upward direction, the rendering module 108 is configured to zoom in the 3D model of the wearable item and/or the body part/3D model of the body part.
Figures 3(a)-(d) illustrate an example use case, according to an embodiment of the present subject matter. In said use case, implementation of the texture mapping technique is shown, whereby a 3D model of a hand of an individual is created. Subsequently, superimposition of a jewelry item, such as a ring, on the 3D model is shown.
Referring to Figure 3(a), an image 300 and an image 302 of the hand of the individual are captured, for example, using the camera 114. As shown, the image 300 corresponds to a front side of the hand. The image 302 corresponds to a back side of the hand of the individual. Post capturing of the images 300 and 302, the texture mapping technique, as described in Figure 1, is implemented on the images 300 and 302. Accordingly, as shown in Figure 3(b), an image 304 and an image 306 are obtained. The images 304 and 306 represent texture maps created based on the images 300 and 302, respectively.
Subsequent to the creation of the texture maps, the texture maps are applied on pre-stored 3D hand models. Accordingly, as shown in figure 3(c), a 3D model of the hand of the individual is created. Figure 3(c) shows images 308 and 310 illustrating different views of the 3D model of the hand of the individual.
In an implementation, the 3D model of the ring is superimposed on the 3D model of the hand, in real-time, in a digital environment. Figure 3(d) shows different views 312, 314, and 316 of the 3D model of the ring being superimposed on the 3D model of the hand.
While specific language has been used to describe the present disclosure, any limitations arising on account thereof are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment.
CLAIMS

1. A method of rendering wearable items in a digital environment, the method comprising:
creating a 3D model of a wearable item based on one or more predetermined techniques;
capturing motion data associated with a body part of an individual; and
rendering, in real-time in the digital environment, the 3D model of the wearable item on one of: (a) the body part of the individual and (b) a 3D model of the body part of the individual, based on the motion data.
2. The method as claimed in claim 1, further comprising:
capturing a plurality of images of the body part of the individual; and
creating the 3D model of the body part based on the plurality of images and a texture mapping technique.
3. The method as claimed in claim 2, wherein the method further comprises:
de-noising the plurality of images;
determining a texture of the body part based on the plurality of images; and
applying the texture on a sample 3D model to create the 3D model of the body part.
4. The method as claimed in claim 3, wherein the method further comprises:
reading each image from the plurality of images;
segmenting said image;
removing one or more noise elements from said image;
detecting a center of the body part in said image using distance transform;
detecting a threshold point related to the body part in said image;
cropping an area of the body part in said image based on the threshold point;
drawing an average circle relative to the body part in said image based on a predetermined ratio;
reading the texture of an image section relative to the average circle;
removing the image section;
detecting other image section associated with the body part in said image; and
reading the texture in the other image section associated with the body part.
5. The method as claimed in claim 1, wherein the rendering of the 3D model of the wearable item is performed using at least one of a VR device, an AR device, a computing device, a monitor, and a television.
6. The method as claimed in claim 1, wherein the method further comprises:
receiving a user input to view historic data associated with rendering of other wearable items, wherein a category of the other wearable items is the same as a category of the wearable item; and
rendering a digital image of at least one other wearable item simultaneously with the rendering of the 3D model of the wearable item.
7. The method as claimed in claim 1, wherein the method further comprises:
receiving a user input for customizing an attribute of the wearable item;
providing one or more customization options associated with the attribute of the wearable item;
receiving a selection of at least one customization option from the one or more customization options; and
updating the attribute of the wearable item based on the at least one customization option.
8. The method as claimed in claim 1, wherein the 3D model of the body part of the individual is one of a Virtual Reality (VR) 3D model, and a mixed reality 3D model.
9. The method as claimed in claim 1, wherein the 3D model of the wearable item is one of a Virtual Reality (VR) 3D model, Augmented Reality (AR) 3D model, and a mixed reality 3D model.
10. The method as claimed in claim 1, wherein the method further comprises:
detecting, in real-time, a direction of movement of the body part of the individual;
identifying a pre-defined action corresponding to the direction of movement of the body part of the individual based on a pre-stored mapping table, wherein the pre-stored mapping table comprises a mapping between a plurality of directions and a plurality of pre-defined actions in respect of the 3D model of the wearable item and/or the body part/3D model of the body part; and
performing the identified pre-defined action to update the rendering of the 3D model of the wearable item.
11. A system for rendering wearable items in a digital environment, the system comprising:
a processor;
a motion sensing module coupled to the processor, wherein the motion sensing module is configured to receive motion data associated with a body part of an individual;
a rendering module coupled to the processor, wherein the rendering module is configured to:
create a 3D model of a wearable item based on one or more predetermined techniques; and
render, in real-time in the digital environment, the 3D model of the wearable item on one of: (a) the body part of the individual and (b) a 3D model of the body part of the individual, based on the motion data.
12. The system as claimed in claim 11, wherein the rendering module is further configured to:
receive a plurality of images of the body part of the individual; and
create the 3D model of the body part based on the plurality of images and a texture mapping technique.
13. The system as claimed in claim 12, wherein the rendering module is further configured to:
de-noise the plurality of images;
determine a texture of the body part based on the plurality of images; and
apply the texture on a sample 3D model to create the 3D model of the body part.
14. The system as claimed in claim 13, wherein the rendering module is further configured to:
read each image from the plurality of images;
segment said image;
remove one or more noise elements from said image;
detect a center of the body part in said image using distance transform;
detect a threshold point related to the body part in said image;
crop an area of the body part in said image based on the threshold point;
draw an average circle relative to the body part in said image based on a predetermined ratio;
read the texture of an image section relative to the average circle;
remove the image section;
detect other image section associated with the body part in said image; and
read the texture in the other image section associated with the body part.
15. The system as claimed in claim 11, wherein the rendering module is further configured to:
receive a user input to view historic data associated with rendering of other wearable items, wherein a category of the other wearable items is the same as a category of the wearable item; and
render a digital image of at least one other wearable item simultaneously with the rendering of the 3D model of the wearable item.
16. The system as claimed in claim 11, wherein the system further comprises an updation module coupled to the processor, wherein the updation module is configured to:
receive a user input for customizing an attribute of the wearable item;
provide one or more customization options associated with the attribute of the wearable item;
receive a selection of at least one customization option from the one or more customization options; and
update the attribute of the wearable item based on the at least one customization option.
17. The system as claimed in claim 11, wherein the 3D model of the body part of the individual is one of a Virtual Reality (VR) 3D model, and a mixed reality 3D model.
18. The system as claimed in claim 11, wherein the 3D model of the wearable item is one of a Virtual Reality (VR) 3D model, Augmented Reality (AR) 3D model, and a mixed reality 3D model.
19. The system as claimed in claim 11, wherein the rendering module is further configured to:
detect, in real-time, a direction of movement of the body part of the individual;
identify a pre-defined action corresponding to the direction of movement of the body part of the individual based on a pre-stored mapping table, wherein the pre-stored mapping table comprises a mapping between a plurality of directions and a plurality of pre-defined actions in respect of the 3D model of the wearable item and/or the body part/3D model of the body part; and
perform the identified pre-defined action to update the rendering of the 3D model of the wearable item.

Documents

Application Documents

# Name Date
1 201821004657-FORM 13-11-05-2018.pdf 2018-05-11
2 201821004657-FORM 1-11-05-2018.pdf 2018-05-11
3 201821004657-CERTIFICATE-11-05-2018.pdf 2018-05-11
4 201821004657-FORM28-070218.pdf 2018-12-27
5 201821004657-Form 2(Title Page)-070218.pdf 2018-12-27
6 201821004657-Form 1-070218.pdf 2018-12-27
7 201821004657-DRAWING [07-02-2019(online)].pdf 2019-02-07
8 201821004657-COMPLETE SPECIFICATION [07-02-2019(online)].pdf 2019-02-07
9 201821004657-RELEVANT DOCUMENTS [23-04-2019(online)].pdf 2019-04-23
10 201821004657-FORM 13 [23-04-2019(online)].pdf 2019-04-23
11 201821004657-AMENDED DOCUMENTS [23-04-2019(online)].pdf 2019-04-23
12 Abstract1.jpg 2019-06-14
13 201821004657-ORIGINAL UR 6(1A) FORM 26-300419.pdf 2019-09-24
14 201821004657-FORM FOR STARTUP [02-09-2020(online)].pdf 2020-09-02
15 201821004657-FORM 18 [02-09-2020(online)].pdf 2020-09-02
16 201821004657-EVIDENCE FOR REGISTRATION UNDER SSI [02-09-2020(online)].pdf 2020-09-02
17 201821004657-CORRECTED PAGES [02-09-2020(online)].pdf 2020-09-02
18 201821004657-FER.pdf 2021-10-18
19 201821004657-OTHERS [11-02-2022(online)].pdf 2022-02-11
20 201821004657-FER_SER_REPLY [11-02-2022(online)].pdf 2022-02-11
21 201821004657-CLAIMS [11-02-2022(online)].pdf 2022-02-11
22 201821004657-ABSTRACT [11-02-2022(online)].pdf 2022-02-11
23 201821004657-US(14)-HearingNotice-(HearingDate-11-01-2024).pdf 2023-12-21
24 201821004657-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [08-01-2024(online)].pdf 2024-01-08
25 201821004657-US(14)-ExtendedHearingNotice-(HearingDate-12-02-2024).pdf 2024-01-10
26 201821004657-Correspondence to notify the Controller [09-02-2024(online)].pdf 2024-02-09
27 201821004657-FORM-26 [10-02-2024(online)].pdf 2024-02-10
28 201821004657-Written submissions and relevant documents [27-02-2024(online)].pdf 2024-02-27
29 201821004657-PETITION UNDER RULE 137 [27-02-2024(online)].pdf 2024-02-27
30 201821004657-MARKED COPY [27-02-2024(online)].pdf 2024-02-27
31 201821004657-CORRECTED PAGES [27-02-2024(online)].pdf 2024-02-27
32 201821004657-PatentCertificate15-03-2024.pdf 2024-03-15
33 201821004657-IntimationOfGrant15-03-2024.pdf 2024-03-15
34 201821004657-FORM FOR STARTUP [13-06-2024(online)].pdf 2024-06-13

Search Strategy

1 SearchHistory(24)AE_28-09-2022.pdf
2 2021-07-0515-57-15E_05-07-2021.pdf

ERegister / Renewals

3rd: 13 Jun 2024 (From 07/02/2020 To 07/02/2021)
4th: 13 Jun 2024 (From 07/02/2021 To 07/02/2022)
5th: 13 Jun 2024 (From 07/02/2022 To 07/02/2023)
6th: 13 Jun 2024 (From 07/02/2023 To 07/02/2024)
7th: 13 Jun 2024 (From 07/02/2024 To 07/02/2025)
8th: 21 Jan 2025 (From 07/02/2025 To 07/02/2026)