Abstract: Present disclosure generally relates to field of Virtual Reality (VR)/Augmented Reality (AR) experiences in metaverse environment, particularly to apparatus and method for providing at least one of Virtual Reality (VR) and Augmented Reality (AR) try-on experience of product in metaverse environment. The apparatus displays composable objects corresponding to products, to user. Further, apparatus receives, on selection icon, first user input corresponding to try-on experience of products, and obtains hand gestures from user in direction of products to be tried-on. Apparatus analyzes hand gestures to determine intent of hand gestures, and spawns translucently, avatar depicting hand gestures of user, in VR and/or AR experience in metaverse environment. Apparatus provides personalization option, experimentation option, and utility option through UI, for products accessorized on avatar. Apparatus receives second input from user corresponding to personalization/experimentation/utility option to perform associated action, and/or hand gestures to terminate try-on experience.
Description: FIELD OF INVENTION
[0001] The embodiments of the present disclosure generally relate to Virtual Reality (VR)/Augmented Reality (AR) experiences in a metaverse environment. More particularly, the present disclosure relates to an apparatus and a method for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment.
BACKGROUND
[0002] The following description of the related art is intended to provide background information pertaining to the field of the disclosure. This section may include certain aspects of the art that may be related to various features of the present disclosure. However, it should be appreciated that this section should be used only to enhance the understanding of the reader with respect to the present disclosure, and not as an admission of prior art.
[0003] Generally, a virtual commerce application is an Information and communication technology (ICT) artifact for commercial activities, with a Mixed Reality (MR) providing immersive virtual environments that contribute to improvements over existing electronic commerce (e-commerce). For example, using a low-immersion Virtual Reality (VR) such as Virtual Try-On (VTO) devices, a product presentation in retailing may be more realistic and interactive compared to the text and image presentations in a web-based e-commerce. Further, high immersion VR can also bring a real-world shopping experience into the e-commerce. Meanwhile, an Augmented Reality (AR) can present products in a context of a user’s surrounding environment to enhance a product experience.
[0004] Currently, the users may be able to perform a virtual interactive experience with products through an avatar or watch the operation of products in a virtual environment. In certain existing approaches, a Three-Dimensional (3D) store within a 3D virtual commerce environment may be displayed via the AR or VR devices to present user-interactive virtual items and virtual checkout options in the e-commerce environment. Further, a motion measuring device may detect user motions and may modify the display of the 3D virtual store based on a selection of the user on the interactive virtual product for purchase. Further, a physical product corresponding to the user-selected virtual product may be purchased using the virtual checkout option in the e-commerce environment. Further, an avatar can also be customized, such as by trying on a new hairstyle, a wristband, or an article of clothing.
[0005] Conventionally, the methods including conventional VR/AR techniques may only integrate virtual products in a virtual environment. However, there is no persistent method for universal hand gestures of users for interacting with products in a metaverse environment. Further, user inputs such as hand gestures or audio inputs for detecting user interaction with virtual objects are either less intuitive or under-utilized. Furthermore, the user may eventually expect advanced virtual experiences, especially in terms of rich customizations and advanced interactions with try-ons, and in turn User Interfaces (UIs) may grow in complexity. For a successful purchase of a product in the e-commerce environment, there needs to be support for real-time interactions and a successful transition from one stage of the purchase journey to another stage. In addition, for a majority of the users, traditional web-based applications and experiences may already fulfill the requirements. However, for users in need of bridging over to a metaverse commerce environment, there may be a strong expectation of aesthetic, game-like visual experiences, and the like. Also, displays and headsets may remain a barrier to the true immersion of the metaverse environment. There is a need to bridge between the digital and physical environments, with physical experiences that respond to human gestures, such as facial gestures, arm movements, and voice commands replacing the click of a mouse or the tap on a display. In the metaverse environment, the purchase decision can be impacted by factors such as a lack of information, a good viewing experience of the try-on on the avatar, as well as a lack of choices, and interactions that could be useful to the user, such as social sharing or answering queries.
[0006] Therefore, there is a need for an apparatus and a method for solving the shortcomings of the conventional methods, by providing an apparatus and a method for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment.
SUMMARY
[0007] This section is provided to introduce certain objects and aspects of the present invention in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter. In order to overcome at least a few problems associated with the known solutions as provided in the previous section, an object of the present disclosure is to provide a technique for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment.
[0008] It is an object of the present disclosure to provide an apparatus and a method for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment.
[0009] It is another object of the present disclosure to provide an apparatus and a method for allowing a universal hand gesture for interacting with electronic commerce (e-commerce) composable products in the metaverse environment. This helps to render user interactions in real-time in virtual and augmented reality (VR/AR) environments using the user gestures.
[0010] It is yet another object of the present disclosure to provide an apparatus and a method for enabling an avatar to mirror the gestures of the user and evolving the avatar with a digital product which is added as an accessory to the avatar. This helps the user to gauge the appearance of the digital product on the avatar and proceed to the purchase stage.
[0011] It is yet another object of the present disclosure to provide an apparatus and a method for providing a User Interface (UI), such as a selection icon, which guides the user toward the purchase stage through options such as customization of the digital artifact, besides reinforcing buying conviction by allowing the user to move to checkout or mint the artifact as a Non-Fungible Token (NFT). Besides these interactions with the user, the UI may also offer the flexibility to share the digital item on social media or a messenger, or save the product for later visitation.
[0012] It is yet another object of the present disclosure to provide an apparatus and a method for supporting, in the metaverse environment, the purchase decision through factors such as informative content, a good viewing experience of the try-on on the avatar, choices, and interactions that could be useful to the user, such as social sharing or answering queries.
[0013] In an aspect, the present disclosure provides an apparatus for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment. The apparatus displays one or more composable objects corresponding to one or more products, to a user. The one or more composable objects are displayed with a selection icon. Further, the apparatus receives, on the selection icon, a first user input corresponding to a try-on experience of the one or more products. Furthermore, the apparatus obtains one or more hand gestures from the user in the direction of the one or more products to be tried-on, upon receiving the user input. Additionally, the apparatus analyzes the one or more hand gestures to determine an intent of the one or more hand gestures. Further, the apparatus spawns translucently, an avatar depicting the one or more hand gestures of the user, in at least one of a Virtual Reality (VR) and an Augmented Reality (AR) experience in a metaverse environment. The avatar is accessorized with one or more products to be tried on. Additionally, the apparatus provides at least one of a personalization option, an experimentation option, and a utility option through a User Interface (UI), for the one or more products accessorized on the avatar. Further, the apparatus receives a second input from the user corresponding to at least one of the personalization option, the experimentation option, and the utility option to perform an associated action, and/or the one or more hand gestures to terminate the try-on experience.
[0014] In an embodiment, for analyzing the one or more hand gestures, the apparatus extracts at least one of hand position data and hand depth data from a plurality of images corresponding to one or more hand gestures received by an associated image capturing unit. Further, the apparatus identifies, in real-time, user interactions and the one or more hand gestures with the at least one of a Virtual Reality (VR) and an Augmented Reality (AR) in a metaverse environment, from the extracted at least one of hand position data and hand depth data.
[0015] In an embodiment, at least one of the personalization option, the experimentation option, and the utility option includes at least one of purchasing, customization, finding artifacts that are similar to the try-on experience, changing colors, trying on a different fit of the one or more products, finding similar products, and pairing with other products.
[0016] In an embodiment, the selection icon comprises a shopping cart appearing over the one or more products, and attention-grabbing information for the auto-spawning of the avatar.
[0017] In another embodiment, the selection icon appears for a predefined time, and is subsequently terminated after the predefined time has elapsed, if the user shows no intention of purchasing the one or more products.
[0018] In another embodiment, the at least one of a personalization option, an experimentation option, and a utility option includes at least one of checking out, minting the product as a Non-Fungible Token (NFT), saving a look of the try-on experience for purchasing the product later, changing a color, style, or shape of the product, searching for similar accessories as provided by a same brand or another brand in a vicinity, performing customizations, and sharing the product with friends on social media.
[0019] In another aspect, the present disclosure provides a method for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment. The method includes displaying one or more composable objects corresponding to one or more products, to a user. The one or more composable objects are displayed with a selection icon. Further, the method includes receiving on the selection icon, a first user input corresponding to a try-on experience of the one or more products. Furthermore, the method includes obtaining one or more hand gestures from the user in the direction of the one or more products to be tried on, upon receiving the user input. Additionally, the method includes analyzing the one or more hand gestures to determine an intent of the one or more hand gestures. Further, the method includes spawning translucently an avatar depicting the one or more hand gestures of the user, in at least one of a Virtual Reality (VR) and an Augmented Reality (AR) experience in a metaverse environment. The avatar is accessorized with one or more products to be tried on. Furthermore, the method includes providing at least one of a personalization option, an experimentation option, and a utility option through a User Interface (UI), for the one or more products accessorized on the avatar. Additionally, the method includes receiving a second input from the user corresponding to at least one of the personalization option, the experimentation option, and the utility option to perform an associated action, and/or the one or more hand gestures to terminate the try-on experience.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[0020] The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and apparatus in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry/sub-components of each component. It will be appreciated by those skilled in the art that the description of such drawings includes the description of electrical components, electronic components, or circuitry commonly used to implement such components.
[0021] FIG. 1 illustrates an exemplary block diagram representation of a network architecture implementing a proposed apparatus for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment, according to embodiments of the present disclosure.
[0022] FIG. 2 illustrates an exemplary detailed block diagram representation of the proposed apparatus, according to embodiments of the present disclosure.
[0023] FIGs. 3A, 3B, and 3C illustrate exemplary schematic diagram representations of one or more hand gestures, accessorizing of avatar, and one or more options on a User Interface (UI), respectively, according to embodiments of the present disclosure.
[0024] FIG. 4 illustrates a flow chart depicting a method of providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment, according to embodiments of the present disclosure.
[0025] FIG. 5 illustrates a hardware platform for the implementation of the disclosed apparatus according to embodiments of the present disclosure.
[0026] The foregoing shall be more apparent from the following more detailed description of the invention.
DETAILED DESCRIPTION OF INVENTION
[0027] In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address all of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.
[0028] The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth.
[0029] Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
[0030] Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
[0031] The word “exemplary” and/or “demonstrative” is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” and/or “demonstrative” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive - in a manner similar to the term “comprising” as an open transition word - without precluding any additional or other elements.
[0032] As used herein, "connect", "configure", "couple" and its cognate terms, such as "connects", "connected", "configured", and "coupled" may include a physical connection (such as a wired/wireless connection), a logical connection (such as through logical gates of a semiconducting device), other suitable connections, or a combination of such connections, as may be obvious to a skilled person.
[0033] As used herein, "send", "transfer", "transmit", and their cognate terms like "sending", "sent", "transferring", "transmitting", "transferred", "transmitted", etc. include sending or transporting data or information from one unit or component to another unit or component, wherein the content may or may not be modified before or after sending, transferring, transmitting.
[0034] Reference throughout this specification to “one embodiment” or “an embodiment” or “an instance” or “one instance” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0035] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0036] Various embodiments of the present disclosure provide an apparatus and a method for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment. The present disclosure provides an apparatus and a method for allowing a universal hand gesture for interacting with electronic commerce (e-commerce) composable products in the metaverse environment. This helps to render user interactions in real-time in virtual and augmented reality (VR/AR) environments using the user gestures. The present disclosure provides an apparatus and a method for enabling an avatar to mirror the gestures of the user and evolving the avatar with a digital product which is added as an accessory to the avatar. This helps the user to gauge the appearance of the digital product on the avatar and proceed to the purchase stage. The present disclosure provides an apparatus and a method for providing a User Interface (UI), such as a selection icon, which guides the user toward the purchase stage through options such as customization of the digital artifact, besides reinforcing buying conviction by allowing the user to move to checkout or mint the artifact as an NFT. Besides these interactions with the user, the UI may also offer the flexibility to share the digital item on social media or a messenger, or save the product for later visitation. The present disclosure provides an apparatus and a method for supporting, in the metaverse environment, the purchase decision through factors such as informative content, a good viewing experience of the try-on on the avatar, choices, and interactions that could be useful to the user, such as social sharing or answering queries.
[0037] FIG. 1 illustrates an exemplary block diagram representation of a network architecture 100 implementing a proposed apparatus 108 for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment. The network architecture 100 may include the apparatus 108-1, 108-2, ….108-N (individually/collectively referred to as the apparatus 108), and a centralized server 116. The apparatus 108 may be connected to the centralized server 116 via a communication network 106. The centralized server 116 may include, but is not limited to, a stand-alone server, a remote server, a cloud computing server, a dedicated server, a rack server, a server blade, a server rack, a bank of servers, a server farm, hardware supporting a part of a cloud service or system, a home server, hardware running a virtualized server, one or more processors executing code to function as a server, one or more machines performing server-side functionality as described herein, at least a portion of any of the above, some combination thereof, and the like. The centralized server 116 may be associated with an entity corresponding to an electronic commerce (e-commerce) environment and/or metaverse environment. The communication network 106 may be a wired communication network or a wireless communication network. 
The wireless communication network may be any wireless communication network capable of transferring data between entities of that network such as, but not limited to, a Bluetooth, a Zigbee, a Near Field Communication (NFC), a Wireless-Fidelity (Wi-Fi), a Light Fidelity (Li-Fi), a carrier network including a circuit-switched network, a public switched network, a Content Delivery Network (CDN) network, a Long-Term Evolution (LTE) network, a New Radio (NR), a Narrow-Band (NB), an Internet of Things (IoT) network, a Global System for Mobile Communications (GSM) network and a Universal Mobile Telecommunications System (UMTS) network, an Internet, intranets, Local Area Networks (LANs), Wide Area Networks (WANs), mobile communication networks, combinations thereof, and the like.
[0039] The apparatus 108 may be implemented by way of a single device or a combination of multiple devices that may be operatively connected or networked together. For example, the apparatus 108 may be implemented by way of a standalone device (i.e., as the apparatus 108). In another example, the apparatus 108 may be associated with the centralized server 116. In yet another example, the apparatus 108 may be implemented in/associated with a respective computing device 104-1, 104-2, …..., 104-N (individually referred to as the computing device 104, and collectively referred to as the computing devices 104), associated with one or more users 102-1, 102-2, …..., 102-N (individually referred to as the user 102, and collectively referred to as the users 102). In such a scenario, the apparatus 108 may be implemented in/associated with each of the computing devices 104. The users 102 may be users of, but are not limited to, an electronic commerce (e-commerce) platform, a hyperlocal platform, a super-mart platform, a media platform, a service providing platform, a social networking platform, a travel/services booking platform, a messaging platform, a bot processing platform, a virtual assistance platform, an Artificial Intelligence (AI) based platform, a blockchain platform, a blockchain marketplace, and the like. In some instances, the user 102 may correspond to an entity/administrator of platforms/services.
[0039] The apparatus 108 and the computing devices 104 may be at least one of, but not limited to, an electrical, an electronic, an electromechanical, a computing device, and the like. The apparatus 108 and the computing devices 104 may include, but are not limited to, a mobile device, a smart-phone, a Personal Digital Assistant (PDA), a tablet computer, a phablet computer, a wearable computing device, a Virtual Reality/Augmented Reality (VR/AR) device, a headset, a laptop, a desktop, a server, and the like. The apparatus 108 may be implemented in hardware or a suitable combination of hardware and software. The apparatus 108 or the centralized server 116 may be associated with entities (not shown). The entities may include, but are not limited to, an e-commerce company, a travel company, an airline company, a hotel booking company, a company, an outlet, a manufacturing unit, an enterprise, a facility, an organization, an educational institution, a secured facility, a warehouse facility, a supply chain facility, and the like.
[0040] Further, the apparatus 108 may include a processor 110, an Input/Output (I/O) interface 112, and a memory 114. The Input/Output (I/O) interface 112 of the apparatus 108 may be used to receive user inputs, from the computing devices 104 associated with the users 102. Further, the apparatus 108 may also include other units such as a display unit, an input unit, an output unit, and the like, however the same are not shown in FIG. 1, for the purpose of clarity. Also, in FIG. 1 only a few units are shown, however, the apparatus 108 or the network architecture 100 may include multiple such units or the apparatus 108 / network architecture 100 may include any such numbers of the units, obvious to a person skilled in the art or as required to implement the features of the present disclosure. The apparatus 108 may be a hardware device including the processor 110 executing machine-readable program instructions to provide at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment.
[0041] Execution of the machine-readable program instructions by the processor 110 may enable the apparatus 108 to provide at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment. The “hardware” may comprise a combination of discrete components, an integrated circuit, an application-specific integrated circuit, a field-programmable gate array, a digital signal processor, or other suitable hardware. The “software” may comprise one or more objects, agents, threads, lines of code, subroutines, separate software applications, two or more lines of code, or other suitable software structures operating in one or more software applications or on one or more processors. The processor 110 may include, for example, but is not limited to, microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuits, and any devices that manipulate data or signals based on operational instructions, and the like. Among other capabilities, the processor 110 may fetch and execute computer-readable instructions in the memory 114 operationally coupled with the apparatus 108 for performing tasks such as data processing, input/output processing, feature extraction, and/or any other functions. Any reference to a task in the present disclosure may refer to an operation being or that may be performed on data.
[0042] In the example that follows, assume that a user 102 of the apparatus 108 desires to improve/add additional features to provide at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experiences of a product in a metaverse environment. In this instance, the user 102 may include an administrator of a website, an administrator of an e-commerce site, an administrator of a social media site, an administrator of an e-commerce application/ social media application/other applications, an administrator of media content (e.g., television content, video-on-demand content, online video content, graphical content, image content, augmented/virtual reality content, metaverse content), an administrator of supply chain platform, an administrator of blockchain marketplace, an administrator of a travel/services booking platform, among other examples, and the like. In addition, the apparatus 108 may include, but is not limited to, a touch panel, a soft keypad, a hard keypad (including buttons), and the like.
[0043] In an embodiment, the apparatus 108 may display one or more composable objects corresponding to one or more products, to a user 102. In an embodiment, the one or more composable objects are displayed with a selection icon.
[0044] In an embodiment, the apparatus 108 may receive, on the selection icon, a first user input corresponding to a try-on experience of the one or more products. In an embodiment, the selection icon appears for a predefined time, and is subsequently terminated after the predefined time has elapsed, if the user shows no intention of purchasing the one or more products. In an embodiment, the selection icon includes, but is not limited to, a shopping cart appearing over the one or more products, attention-grabbing information for auto-spawning of the avatar, and the like.
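The timed selection-icon behavior described above can be sketched as follows. This is an illustrative Python sketch only, not part of the disclosure; the class name, the `ICON_TIMEOUT_S` value, and the injectable clock parameter are hypothetical assumptions.

```python
import time

ICON_TIMEOUT_S = 5.0  # hypothetical predefined display time, in seconds


class SelectionIcon:
    """Shows a shopping-cart icon over a product and hides it again
    if the user shows no purchase intent within the predefined time."""

    def __init__(self, product_id, now=time.monotonic):
        self.product_id = product_id
        self._now = now              # injectable clock, useful for testing
        self._shown_at = now()
        self.visible = True

    def tick(self):
        """Call periodically; hides the icon once the timeout elapses."""
        if self.visible and self._now() - self._shown_at >= ICON_TIMEOUT_S:
            self.visible = False
        return self.visible

    def select(self):
        """First user input on the icon: start the try-on flow."""
        if not self.visible:
            raise RuntimeError("icon already dismissed")
        return {"action": "try_on", "product": self.product_id}
```

Injecting the clock keeps the timeout logic deterministic under test; a rendering loop would call `tick()` each frame and route taps on the icon to `select()`.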
[0045] In an embodiment, the apparatus 108 may obtain one or more hand gestures from the user 102 in the direction of the one or more products to be tried-on, upon receiving the user input.
[0046] In an embodiment, the apparatus 108 may analyze the one or more hand gestures to determine an intent of the one or more hand gestures. In an embodiment, for analyzing the one or more hand gestures, the apparatus 108 may extract at least one of hand position data and hand depth data from a plurality of images corresponding to the one or more hand gestures received by an associated image capturing unit. In an embodiment, the apparatus 108 may identify, in real-time, user interactions and the one or more hand gestures with the at least one of a Virtual Reality (VR) and an Augmented Reality (AR) in a metaverse environment, from the extracted at least one of hand position data and hand depth data.
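The gesture-analysis step can be illustrated with a minimal sketch. It assumes, hypothetically, that the image capturing unit yields per-frame hand landmarks as normalized (x, y, depth) tuples; the joint names, the `GRAB_DISTANCE` threshold, and the intent labels are invented for illustration and are not terms from the disclosure.

```python
import math

GRAB_DISTANCE = 0.05  # hypothetical pinch threshold in normalized coordinates


def classify_gesture(landmarks):
    """Classify one frame of hand landmarks into a coarse gesture intent.

    `landmarks` is assumed to map joint names to (x, y, depth) tuples
    normalized to [0, 1], combining hand position and hand depth data
    as might be extracted from the captured images.
    """
    thumb = landmarks["thumb_tip"]
    index = landmarks["index_tip"]
    # Pinch: thumb and index tips close together in 3-D space,
    # interpreted as intent to pick up / try on a product.
    if math.dist(thumb, index) < GRAB_DISTANCE:
        return "grab"
    # Open palm: all four fingertips extended away from the wrist,
    # which could map to dismissing or terminating the try-on.
    wrist = landmarks["wrist"]
    tips = [landmarks[k] for k in
            ("index_tip", "middle_tip", "ring_tip", "pinky_tip")]
    if all(math.dist(tip, wrist) > 0.15 for tip in tips):
        return "open_palm"
    return "unknown"
```

A real pipeline would run a classifier like this per frame and smooth the result over several frames before committing to an intent, so that a single noisy detection does not trigger a try-on.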
[0047] In an embodiment, the apparatus 108 may spawn translucently, an avatar depicting the one or more hand gestures of the user 102, in at least one of a Virtual Reality (VR) and an Augmented Reality (AR) experience in a metaverse environment. In an embodiment, the avatar may be accessorized with one or more products to be tried on.
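Spawning the translucent, accessorized avatar can be sketched as below. The `Avatar` structure and the 0.5 alpha value are assumptions for illustration; a real engine would instantiate a scene object with a translucent material.

```python
from dataclasses import dataclass, field

@dataclass
class Avatar:
    """Hypothetical avatar state: pose mirrors the user's gesture, alpha
    below 1.0 renders it translucently."""
    pose: str = "idle"
    alpha: float = 1.0                      # 1.0 opaque, < 1.0 translucent
    accessories: list = field(default_factory=list)

def spawn_try_on_avatar(gesture, products, alpha=0.5):
    """Spawn the avatar translucently, depicting the user's gesture, and
    accessorize it with the one or more products to be tried on."""
    avatar = Avatar(pose=gesture, alpha=alpha)
    avatar.accessories.extend(products)
    return avatar
```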
[0048] In an embodiment, the apparatus 108 may provide at least one of a personalization option, an experimentation option, and a utility option through a User Interface (UI), for the one or more products accessorized on the avatar. In an embodiment, the at least one of the personalization option, the experimentation option, and the utility option includes, but is not limited to, a checkout, minting the product as a Non-Fungible Token (NFT), saving a look of the try-on experience for purchasing the product later, changing a color, style, or shape of the product, searching for similar accessories as provided by a same brand or another brand in a vicinity, performing customizations, sharing the product with friends on a social media, and the like. In an embodiment, the at least one of the personalization option, the experimentation option, and the utility option further includes, but is not limited to, at least one of a purchase, customization, finding artifacts that are similar to the try-on experiences, changing colors, trying on a different fit of the one or more products, finding similar products, pairing with other products, and the like.
[0049] In an embodiment, the apparatus 108 may receive a second input from the user corresponding to at least one of: the personalization option, the experimentation option, or the utility option, to perform an associated action; and the one or more hand gestures, to terminate the try-on experience.
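Handling the second input then reduces to a dispatch: each UI option maps to its associated action, while the same spawn gesture terminates the try-on experience. The option keys and the `session` interface below are hypothetical names, not from the disclosure.

```python
def handle_second_input(user_input, session):
    """Hypothetical sketch: map the second user input either to a UI
    option's action handler or to termination of the try-on session."""
    actions = {
        "checkout": session.checkout,
        "mint_nft": session.mint_nft,
        "save_look": session.save_look,
        "customize": session.customize,
        "find_similar": session.find_similar,
        "share": session.share,
    }
    if user_input == "enchantress_gesture":  # same gesture exits try-on
        session.active = False
        return "terminated"
    handler = actions.get(user_input)
    return handler() if handler else "unknown_option"
```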
[0050] FIG. 2 illustrates an exemplary detailed block diagram representation of the proposed apparatus 108, according to embodiments of the present disclosure. The apparatus 108 may include the processor 110, the Input/Output (I/O) interface 112, and the memory 114. In some implementations, the apparatus 108 may include data 202, and modules 204. As an example, the data 202 may be stored in the memory 114 configured in the apparatus 108 as shown in FIG. 2.
[0051] In an embodiment, the data 202 may include composable object data 206, selection icon data 208, hand gestures data 210, intent data 212, avatar data 214, options data 216, and other data 218. In an embodiment, the data 202 may be stored in the memory 114 in the form of various data structures. Additionally, the data 202 can be organized using data models, such as relational or hierarchical data models. The other data 218 may store data, including temporary data and temporary files, generated by the modules 204 for performing the various functions of the apparatus 108.
[0052] In an embodiment, the modules 204 may include a displaying module 222, a receiving module 224, an obtaining module 226, an analyzing module 228, a spawning module 230, a providing module 232, and other modules 234.
[0053] In an embodiment, the data 202 stored in the memory 114 may be processed by the modules 204 of the apparatus 108. The modules 204 may be stored within the memory 114. In an example, the modules 204, communicatively coupled to the processor 110 configured in the apparatus 108, may also be present outside the memory 114, as shown in FIG. 2, and implemented as hardware. As used herein, the term "modules" refers to an Application-Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
[0054] In an embodiment, the displaying module 222 may display one or more composable objects corresponding to one or more products, to a user 102. In an embodiment, the one or more composable objects may be displayed with a selection icon. The displayed one or more composable objects may be stored as the composable object data 206.
[0055] In an embodiment, the receiving module 224 may receive, on the selection icon, a first user input corresponding to a try-on experience of the one or more products. In an embodiment, the selection icon appears for a predefined time and is subsequently terminated after the predefined time elapses, if the user shows no intention of purchasing the one or more products. In an embodiment, the selection icon includes, but is not limited to, a shopping cart appearing over the one or more products, attention-grabbing information for auto-spawning of the avatar, and the like. The first user input on the selection icon may be stored as the selection icon data 208.
[0056] In an embodiment, the obtaining module 226 may obtain one or more hand gestures from the user 102 in the direction of the one or more products to be tried-on, upon receiving the user input. The obtained one or more hand gestures from the user 102 may be stored as the hand gestures data 210.
[0057] In an embodiment, the analyzing module 228 may analyze the one or more hand gestures to determine an intent of the one or more hand gestures. The determined intent of the one or more hand gestures may be stored as the intent data 212. In an embodiment, for analyzing the one or more hand gestures, the apparatus 108 may extract at least one of hand position data and hand depth data from a plurality of images corresponding to the one or more hand gestures received by an associated image capturing unit. In an embodiment, the apparatus 108 may identify, in real-time, user interactions and the one or more hand gestures with the at least one of a Virtual Reality (VR) and an Augmented Reality (AR) in a metaverse environment, from the extracted at least one of hand position data and hand depth data.
[0058] In an embodiment, the spawning module 230 may spawn translucently, an avatar depicting the one or more hand gestures of the user 102, in at least one of a Virtual Reality (VR) and an Augmented Reality (AR) experience in a metaverse environment. In an embodiment, the avatar may be accessorized with one or more products to be tried on. The avatar spawned translucently depicting the one or more hand gestures may be stored as the avatar data 214.
[0059] In an embodiment, the providing module 232 may provide at least one of a personalization option, an experimentation option, and a utility option through a User Interface (UI), for the one or more products accessorized on the avatar. The provided at least one of the personalization option, the experimentation option, and the utility option may be stored as the options data 216. In an embodiment, the at least one of the personalization option, the experimentation option, and the utility option includes, but is not limited to, a checkout, minting the product as a Non-Fungible Token (NFT), saving a look of the try-on experience for purchasing the product later, changing a color, style, or shape of the product, searching for similar accessories as provided by a same brand or another brand in a vicinity, performing customizations, sharing the product with friends on a social media, and the like. In an embodiment, the at least one of the personalization option, the experimentation option, and the utility option further includes, but is not limited to, at least one of a purchase, customization, finding artifacts that are similar to the try-on experiences, changing colors, trying on a different fit of the one or more products, finding similar products, pairing with other products, and the like.
[0060] In an embodiment, the receiving module 224 may receive a second input from the user corresponding to at least one of: the personalization option, the experimentation option, or the utility option, to perform an associated action; and the one or more hand gestures, to terminate the try-on experience.
Exemplary scenario 1:
[0061] Consider, for example, that the user 102 may be using a Virtual Reality (VR) device and/or an Augmented Reality (AR) device (i.e., the apparatus 108), among others. The VR/AR device may create a Three-Dimensional (3D) virtual e-commerce environment for the user 102 to navigate through the products. The 3D virtual e-commerce environment may be composed of objects or products for purchase, along with information for marketing a product, for example, a selection icon such as a shopping cart appearing over, for example, a pair of boots, a dress, a wig, or even a mask, and some attention-grabbing information which may be auto-spawning near the object, such as, but not limited to, a 50% off coupon, a secret VIP event ticket, a free gift offer, and the like. The selection icons could appear on the UI and then be auto-deleted after a few seconds or minutes, if the user 102 shows no intention of purchasing the product. Further, a commerce composability for such selection icons may allow the user 102 to enter into the try-on experience, where the user 102 can wear or try on a product to see if it matches their expectations.
Exemplary scenario 2:
[0062] Consider, for example, that the user 102 may perform one or more hand gestures to invoke the AR/VR try-on experience in a metaverse environment, as shown in FIG. 3A. The hand gesture may be used by the apparatus 108 to integrate a universal and intuitive hand gesture for the user 102 to slip into the try-on experience of one or more products, as shown in FIG. 3A. For example, while traversing through a virtual environment, the user 102 comes across a commerce-composable product, i.e., a product aimed for commerce and available as a try-on experience. The user 102 knows that it is a try-on because it is presented as an icon to the user 102. The user 102, with the intent of entering into the try-on experience, makes the hand gesture (e.g., an enchantress hand gesture) and raises the hand vertically in front of the AR/VR device (e.g., a headset). Once the AR/VR device recognizes the hand gesture, the avatar spawns translucently in front of the user 102 in the display of the AR/VR device, and replicates the same gesture as the user. The apparatus 108 may upgrade the avatar and accessorize it with the virtual try-on, and the UI may appear next to the avatar with different handy action interactions for the user 102 to make. In an example, when a commerce-composable object is detected, the apparatus 108 may display to the user 102 a selection icon which stands out from other non-commercial fixtures in the virtual environment. The selection icon may also be a representation of a gateway into the try-on experience.
[0063] In another example, the user 102 holds up their hand vertically in front of the AR/VR device, in the direction of the product the user 102 may need to try on. The AR/VR device may recognize the gesture as, for example, an “enchantress hand gesture”. The AR/VR device may dynamically spawn the user’s avatar translucently in front of the user 102, as shown in FIG. 3B, without the user needing to rotate their own perspective by 180 degrees. The object may be shown in the virtual environment or in the real environment as a VR/AR object. The avatar now mirrors the same gesture as the user 102, and immediately the product may be accessorized on the avatar for the user to view, as shown in FIG. 3B. For example, the avatar mirrors the same gesture as the user 102, such as a “high-five”. The UI may include a range of interactions and actions available for the user 102. Within the try-on experience, the avatar mirrors the gesture made by the user 102 and interacts directly with the wearable product by virtue of changing into an evolved form, with the digital product added as an accessory to the avatar. The user 102 can gauge the appearance of the product on the avatar and proceed to the purchase stage. Based on the user’s perception of the try-on, the user 102 may choose to buy the try-on as, but not limited to, an NFT, save the look for purchase later, customize the look (in this instance, the virtual try-on) by changing its color, style, shape, and the like, or find other similar items for try-on. Further, an advanced set of interactions can also be made available, such as social sharing, querying, and much more, to support the user 102 fully throughout the try-on experience.
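The detail above — the avatar appearing in front of the user and replicating the gesture without the user rotating 180 degrees — implies a left-right mirroring of the tracked pose, as one would see in a mirror. A sketch under assumed joint naming and coordinate conventions:

```python
def mirror_pose(joints):
    """Hypothetical sketch: mirror a pose given as {joint_name: (x, y, z)}
    across the x-axis, swapping left/right joint names, so a front-facing
    avatar replicates the user's gesture mirror-style."""
    mirrored = {}
    for name, (x, y, z) in joints.items():
        if name.startswith("left_"):
            target = "right_" + name[len("left_"):]
        elif name.startswith("right_"):
            target = "left_" + name[len("right_"):]
        else:
            target = name            # midline joints keep their name
        mirrored[target] = (-x, y, z)  # flip horizontally, keep height/depth
    return mirrored
```

The same mapping is applied each frame so the translucent avatar stays in sync with the user's hand movements.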
[0064] In addition, the User Interface (UI) may then pop up in a display of the VR/AR device, next to the avatar, which awaits further interaction, as shown in FIG. 3C. Further, the UI may reinforce buying conviction in the user 102 with regard to both the purpose fit and the associated value of the metaverse-experienceable product. Depending upon the readiness of the user 102 to experiment further with the accessory/product, the user 102 can, at any point in time, perform experiential experiments on the product, such as, but not limited to, changing colors, trying on a different fit, finding similar items, or pairing other items, depending on the flexibilities of customization afforded by the brand, and the like. The ability to utilize additional personalization options, experimentation, and utility may be enabled through the UI. The UI, in basic form, may allow the user 102 to, but is not limited to, proceed to checkout, mint the product as a Non-Fungible Token (NFT), save the look for purchase later, search for similar accessories as provided by the same brand or others in the vicinity, perform customizations, share the product with friends on social media so that they too may try on the experience, and the like.
[0065] Also, the UI may provide an NFT instance of the product, as well as a method to optimize NFT generation costs by introducing a ‘mint NFT now’ step in the check-out process, with an electronic wallet (e-wallet) value optimization for the user 102. For example, after deliberation, if the user 102 does not want to take any action, the user 102 can make the same gesture of holding the hand vertically in front of the AR/VR device, and the user 102 may be able to come out of the try-on experience, back to where the user 102 had been before getting into the try-on experience. For example, the UI may help the user 102 with interactions crucial to the purchase stage, such as customization of the digital artifact, besides reinforcing buying conviction by allowing the user to move to checkout or mint the artifact as an NFT. Besides these interactions, the UI also offers the flexibility to share the digital item with, for example, friends, or save the product for visitation later.
[0066] FIG. 4 illustrates a flow chart depicting a method 400 of providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment, according to embodiments of the present disclosure.
[0067] At block 402, the method 400 includes displaying, by the processor 110 associated with the apparatus 108, one or more composable objects corresponding to one or more products, to the user 102. The one or more composable objects are displayed with a selection icon.
[0068] At block 404, the method 400 includes receiving, by the processor 110, on the selection icon, a first user input corresponding to a try-on experience of the one or more products.
[0069] At block 406, the method 400 includes obtaining, by the processor 110, one or more hand gestures from the user 102 in the direction of the one or more products to be tried-on, upon receiving the user input.
[0070] At block 408, the method 400 includes analyzing, by the processor 110, the one or more hand gestures to determine an intent of the one or more hand gestures.
[0071] At block 410, the method 400 includes spawning translucently, by the processor 110, an avatar depicting the one or more hand gestures of the user 102, in at least one of a Virtual Reality (VR) and an Augmented Reality (AR) experience in a metaverse environment. The avatar is accessorized with one or more products to be tried on.
[0072] At block 412, the method 400 includes providing, by the processor 110, at least one of a personalization option, an experimentation option, and a utility option through a User Interface (UI), for the one or more products accessorized on the avatar.
[0073] At block 414, the method 400 includes receiving, by the processor 110, a second input from the user 102 corresponding to at least one of: the personalization option, the experimentation option, or the utility option, to perform an associated action; and the one or more hand gestures, to terminate the try-on experience.
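Blocks 402-414 of method 400 can be read as one pipeline. The sketch below strings the blocks together using stub interfaces; every function and parameter name is illustrative, not from the disclosure.

```python
def try_on_method_400(io, gestures, ui):
    """Hypothetical end-to-end sketch of method 400 (blocks 402-414)."""
    io.display_composables_with_icon()             # block 402: display
    first_input = io.receive_icon_input()          # block 404: first input
    hand = gestures.capture(first_input.product)   # block 406: obtain gestures
    intent = gestures.classify(hand)               # block 408: analyze intent
    if intent != "try_on":
        return None                                # no try-on intent detected
    avatar = ui.spawn_translucent_avatar(hand, first_input.product)  # block 410
    ui.show_options(avatar, ["personalize", "experiment", "utility"])  # block 412
    second = io.receive_second_input()             # block 414: second input
    if second == "terminate_gesture":
        ui.despawn(avatar)                         # gesture ends the try-on
    else:
        ui.perform(second, avatar)                 # perform associated action
    return avatar
```

As paragraph [0074] notes, the block order is not limiting; the sketch simply follows the order shown in FIG. 4.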
[0074] The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined or otherwise performed in any order to implement the method 400 or an alternate method. Additionally, individual blocks may be deleted from the method 400 without departing from the spirit and scope of the present disclosure described herein. Furthermore, the method 400 may be implemented in any suitable hardware, software, firmware, or a combination thereof, that exists in the related art or that is later developed. The method 400 describes, without limitation, the implementation of the apparatus 108. A person of skill in the art will understand that method 400 may be modified appropriately for implementation in various manners without departing from the scope and spirit of the disclosure.
[0075] FIG. 5 illustrates a hardware platform 500 for implementation of the disclosed apparatus 108, according to an example embodiment of the present disclosure. For the sake of brevity, the construction and operational features of the apparatus 108, which are explained in detail above, are not explained in detail herein. Particularly, computing machines such as, but not limited to, internal/external server clusters, quantum computers, desktops, laptops, smartphones, tablets, and wearables may be used to execute the apparatus 108 or may include the structure of the hardware platform 500. As illustrated, the hardware platform 500 may include additional components not shown, and some of the components described may be removed and/or modified. For example, a computer system with multiple GPUs may be located on external cloud platforms including Amazon® Web Services, internal corporate cloud computing clusters, organizational computing resources, and the like.
[0076] The hardware platform 500 may be a computer system, such as the apparatus 108, that may be used with the embodiments described herein. The computer system may represent a computational platform that includes components that may be in a server or another computer system. The computer system may execute, by the processor 505 (e.g., a single or multiple processors) or other hardware processing circuit, the methods, functions, and other processes described herein. These methods, functions, and other processes may be embodied as machine-readable instructions stored on a computer-readable medium, which may be non-transitory, such as hardware storage devices (e.g., RAM (random access memory), ROM (read-only memory), EPROM (erasable, programmable ROM), EEPROM (electrically erasable, programmable ROM), hard drives, and flash memory). The computer system may include the processor 505 that executes software instructions or code stored on a non-transitory computer-readable storage medium 510 to perform methods of the present disclosure. The software code includes, for example, instructions to gather data and documents and analyze documents. In an example, the modules 204 may be software code or components performing these steps.
[0077] The instructions on the computer-readable storage medium 510 are read and stored in the storage 515 or in random access memory (RAM). The storage 515 may provide a space for keeping static data where at least some instructions could be stored for later execution. The stored instructions may be further compiled to generate other representations of the instructions and dynamically stored in the RAM, such as the RAM 520. The processor 505 may read instructions from the RAM 520 and perform actions as instructed.
[0078] The computer system may further include the output device 525 to provide at least some of the results of the execution as output including, but not limited to, visual information to users, such as external agents. The output device 525 may include a display on computing devices and virtual reality glasses. For example, the display may be a mobile phone screen or a laptop screen. GUIs and/or text may be presented as an output on the display screen. The computer system may further include an input device 530 to provide a user or another device with mechanisms for entering data and/or otherwise interacting with the computer system. The input device 530 may include, for example, a keyboard, a keypad, a mouse, or a touchscreen. Each of these output devices 525 and input device 530 may be joined by one or more additional peripherals. For example, the output device 525 may be used to display the results such as bot responses by the executable chatbot.
[0079] A network communicator 535 may be provided to connect the computer system to a network and in turn to other devices connected to the network including other clients, servers, data stores, and interfaces, for instance. A network communicator 535 may include, for example, a network adapter such as a LAN adapter or a wireless adapter. The computer system may include a data sources interface 540 to access the data source 545. The data source 545 may be an information resource. As an example, a database of exceptions and rules may be provided as the data source 545. Moreover, knowledge repositories and curated data may be other examples of the data source 545.
[0080] While considerable emphasis has been placed herein on the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the invention. These and other changes in the preferred embodiments of the invention will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be implemented merely as illustrative of the invention and not as a limitation.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0081] The present disclosure provides an apparatus and a method for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment.
[0082] The present disclosure provides an apparatus and a method for allowing a universal hand gesture for interacting with electronic commerce (e-commerce) composable products in the metaverse environment. This helps to render user interactions in real-time in augmented and virtual reality (VR/AR) environments using the user gestures.
[0083] The present disclosure provides an apparatus and a method for enabling an avatar to mirror the gestures of the user and evolving the avatar with a digital product which is added as an accessory to the avatar. This helps the user to gauge the appearance of the digital product on the avatar and proceed to the purchase stage.
[0084] The present disclosure provides an apparatus and a method for providing a User Interface (UI), such as a selection icon, which helps the user through the purchase stage with interactions such as customization of the digital artifact, besides reinforcing buying conviction by allowing the user to move to checkout or mint the artifact as an NFT. Besides these interactions with the user, the UI may also offer flexibility to share the digital item on social media or a messenger, or save the product for later visitation.
[0085] The present disclosure provides an apparatus and a method for supporting, in the metaverse environment, the purchase decision through factors such as an informative and good viewing experience of the try-on on the avatar, and choices and interactions that could be useful to the user, such as social sharing or answering queries.
Claims:1. An apparatus (108) for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment, the apparatus (108) comprising:
a processor (110); and
a memory (114) coupled to the processor (110), wherein the memory (114) comprises processor-executable instructions, which on execution, cause the processor (110) to:
display one or more composable objects corresponding to one or more products, to a user (102), wherein the one or more composable objects is displayed with a selection icon;
receive, on the selection icon, a first user input corresponding to a try-on experience of the one or more products;
obtain one or more hand gestures from the user (102) in the direction of the one or more products to be tried-on, upon receiving the user input;
analyze the one or more hand gestures to determine an intent of the one or more hand gestures;
spawn translucently, an avatar depicting the one or more hand gestures of the user (102), in at least one of a Virtual Reality (VR) and an Augmented Reality (AR) experience in a metaverse environment, wherein the avatar is accessorized with one or more products to be tried-on;
provide at least one of a personalization option, an experimentation option, and a utility option through a User Interface (UI), for the one or more products accessorized on the avatar; and
receive a second input from the user (102) corresponding to at least one of: the personalization option, the experimentation option, or the utility option, to perform an associated action; and the one or more hand gestures, to terminate the try-on experience.
2. The apparatus (108) as claimed in claim 1, wherein for analyzing the one or more hand gestures, the processor (110) is further configured to:
extract at least one of hand position data and hand depth data from a plurality of images corresponding to one or more hand gestures received by an associated image capturing unit; and
identify, in real-time, user interactions and the one or more hand gestures with the at least one of a Virtual Reality (VR) and an Augmented Reality (AR) in a metaverse environment, from the extracted at least one of hand position data and hand depth data.
3. The apparatus (108) as claimed in claim 1, wherein at least one of the personalization option, the experimentation option, and the utility option comprises at least one of a purchase, customization, finding artifacts that are similar to the try-on experiences, changing colors, trying on a different fit of the one or more products, finding similar products, and pairing with other products.
4. The apparatus (108) as claimed in claim 1, wherein the selection icon comprises a shopping cart appearing over the one or more products, and attention-grabbing information for auto-spawning of the avatar.
5. The apparatus (108) as claimed in claim 1, wherein the selection icon appears for a predefined time, and subsequently terminates the selection icon after the predefined time is elapsed, if the user (102) shows no intention of purchasing the one or more products.
6. The apparatus (108) as claimed in claim 1, wherein the at least one of a personalization option, an experimentation option, and a utility option comprises at least one of a checkout, mint the product as a Non-Fungible Token (NFT), save a look of try-on experience for purchasing the product later, changing color, style, shape of the product, search for similar accessories as provided by a same brand or another brand in a vicinity, perform customizations, and share the product with friends on a social media.
7. A method for providing at least one of a Virtual Reality (VR) and an Augmented Reality (AR) try-on experience of a product in a metaverse environment, the method comprising:
displaying, by a processor (110) associated with an apparatus (108), one or more composable objects corresponding to one or more products, to a user (102), wherein the one or more composable objects are displayed with a selection icon;
receiving, by the processor (110), on the selection icon, a first user input corresponding to a try-on experience of the one or more products;
obtaining, by the processor (110), one or more hand gestures from the user (102) in the direction of the one or more products to be tried-on, upon receiving the user input;
analysing, by the processor (110), the one or more hand gestures to determine an intent of the one or more hand gestures;
spawning translucently, by the processor (110), an avatar depicting the one or more hand gestures of the user (102), in at least one of a Virtual Reality (VR) and an Augmented Reality (AR) experience in a metaverse environment, wherein the avatar is accessorized with one or more products to be tried-on;
providing, by the processor (110), at least one of a personalization option, an experimentation option, and a utility option through a User Interface (UI), for the one or more products accessorized on the avatar; and
receiving, by the processor (110), a second input from the user (102) corresponding to at least one of: the personalization option, the experimentation option, or the utility option, to perform an associated action; and the one or more hand gestures, to terminate the try-on experience.
8. The method as claimed in claim 7, wherein analyzing the one or more hand gestures further comprises:
extracting, by the processor (110), at least one of hand position data and hand depth data from a plurality of images corresponding to one or more hand gestures received by an associated image capturing unit; and
identifying, by the processor (110), in real-time, user interactions and the one or more hand gestures with the at least one of a Virtual Reality (VR) and an Augmented Reality (AR) in a metaverse environment, from the extracted at least one of hand position data and hand depth data.
9. The method as claimed in claim 7, wherein at least one of the personalization option, the experimentation option, and the utility option comprises at least one of a purchase, customization, finding artifacts that are similar to the try-on experiences, changing colors, trying on a different fit of the one or more products, finding similar products, and pairing with other products.
10. The method as claimed in claim 7, wherein the selection icon comprises a shopping cart appearing over the one or more products, and attention-grabbing information for auto-spawning of the avatar.
11. The method as claimed in claim 7, wherein the selection icon appears for a predefined time, and subsequently terminates the selection icon after the predefined time is elapsed, if the user (102) shows no intention of purchasing the one or more products.
12. The method as claimed in claim 7, wherein the at least one of a personalization option, an experimentation option, and a utility option comprises at least one of a checkout, mint the product as a Non-Fungible Token (NFT), save a look of try-on experience for purchasing the product later, changing color, style, shape of the product, search for similar accessories as provided by a same brand or another brand in a vicinity, perform customizations, and share the product with friends on a social media.
| # | Name | Date |
|---|---|---|
| 1 | 202241052118-FORM 1 [13-09-2022(online)].pdf | 2022-09-13 |
| 2 | 202241052118-COMPLETE SPECIFICATION [13-09-2022(online)].pdf | 2022-09-13 |
| 3 | 202241052118-DRAWINGS [13-09-2022(online)].pdf | 2022-09-13 |
| 4 | 202241052118-DECLARATION OF INVENTORSHIP (FORM 5) [13-09-2022(online)].pdf | 2022-09-13 |
| 5 | 202241052118-STATEMENT OF UNDERTAKING (FORM 3) [13-09-2022(online)].pdf | 2022-09-13 |
| 6 | 202241052118-REQUEST FOR EXAMINATION (FORM-18) [13-09-2022(online)].pdf | 2022-09-13 |
| 7 | 202241052118-FORM 18 [13-09-2022(online)].pdf | 2022-09-13 |
| 8 | 202241052118-REQUEST FOR EARLY PUBLICATION(FORM-9) [13-09-2022(online)].pdf | 2022-09-13 |
| 9 | 202241052118-FORM-9 [13-09-2022(online)].pdf | 2022-09-13 |
| 10 | 202241052118-POWER OF AUTHORITY [13-09-2022(online)].pdf | 2022-09-13 |
| 11 | 202241052118-ENDORSEMENT BY INVENTORS [26-09-2022(online)].pdf | 2022-09-26 |
| 12 | 202241052118-FER.pdf | 2023-01-03 |
| 13 | 202241052118-CLAIMS [03-07-2023(online)].pdf | 2023-07-03 |
| 14 | 202241052118-FER_SER_REPLY [03-07-2023(online)].pdf | 2023-07-03 |
| 15 | 202241052118-CORRESPONDENCE [03-07-2023(online)].pdf | 2023-07-03 |
| 16 | 202241052118-US(14)-HearingNotice-(HearingDate-17-12-2025).pdf | 2025-11-21 |
| 17 | SearchStrategyE_03-01-2023.pdf | |
| 18 | SearchStrategyamendedAE_25-06-2024.pdf | |