
Virtual Reality Control

Abstract: Disclosed is a method and system for enabling a user to remotely perform an action in a real world via a virtual reality environment. The method comprises receiving user data associated with a user in a virtual reality environment and a real-time video stream associated with a store in a real world, and initiating multimedia on a device providing the virtual reality environment to the user. The method further comprises assigning one or more robots in the store to the user in the virtual reality environment and generating a set of primary instructions based on an action of the user in the virtual environment. The method furthermore comprises controlling the one or more robots based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.


Patent Information

Application #:
Filing Date: 15 April 2016
Publication Number: 20/2016
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: ip@legasis.in
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-01-27
Renewal Date:

Applicants

HCL Technologies Limited
B-39, Sector 1, Noida 201 301, Uttar Pradesh, India

Inventors

1. TAMMANA, Sankar Uma
HCL Technologies Limited, A-8 & 9, Sec-60, Noida, UP-201301, India
2. DHALIWAL, Jasbir Singh
HCL Technologies Limited, A-8 & 9, Sec-60, Noida, UP-201301, India

Specification

TECHNICAL FIELD
[001] The present subject matter described herein, in general, relates to a system and a method for enabling a user to remotely perform an action, and more particularly, to a system and a method for enabling a user to remotely perform an action in a real world via a virtual reality environment.
BACKGROUND
[002] Technologies such as Virtual Reality (VR) are gaining considerable momentum in today's world because of the immersive, real-life experience they provide. Such an immersive experience greatly enhances the user's experience by reducing the difference between reality and virtual reality.
[003] E-commerce (electronic commerce or EC) is the buying and selling of goods and services, or the transmitting of funds or data, over an electronic network, primarily the Internet. These business transactions may occur in a business-to-business, business-to-consumer, consumer-to-consumer, or consumer-to-business model. Currently, online purchasing, i.e., e-commerce, does not offer a truly immersive experience and lacks engagement.
[004] Conventional methodologies available for online purchase provide an experience of buying goods from a warehouse, and thus lack a real-world immersive experience. Furthermore, the conventional methodologies fail to resolve conflicts arising between a virtual user and a real-world user.
SUMMARY
[005] Before the present systems and methods for enabling a user to remotely perform an action in a real world via a virtual reality environment are described, it is to be understood that this application is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular implementations or versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce aspects related to a system and a method for enabling a user to remotely perform an action in a real world via a virtual reality environment. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a system for enabling a user to remotely perform an action in a real world via a virtual reality environment is disclosed. In one aspect, the system comprises a memory and a processor coupled to the memory. Further, the processor may be capable of executing instructions stored in the memory to perform one or more steps. In the aspect, the system may receive user data associated with a user in a virtual reality environment and a real-time video stream associated with a store in a real world. Upon receiving, the system may initiate multimedia associated with the store on a device providing the virtual reality environment to the user. The multimedia may be one of the real-time video stream and a virtual rendering of the real-time video stream. Further to initiating, the system may assign a robot in the store to the user in the virtual reality environment, based on the user data and a user location in the multimedia, and generate a set of primary instructions based on an action of the user in the virtual environment. Subsequent to generating, the system may control the robot based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
[007] In one implementation, a method for enabling a user to remotely perform an action in a real world via a virtual reality environment is disclosed. In one aspect, the method may comprise receiving user data associated with a user in a virtual reality environment and a real-time video stream associated with a store in a real world. Upon receiving, the method may further comprise initiating multimedia associated with the store on a device providing the virtual reality environment to the user. The multimedia may be one of the real-time video stream and a virtual rendering of the real-time video stream. Further to initiating, the method may comprise assigning a robot in the store to the user in the virtual reality environment, based on the user data and a user location in the multimedia, and generating a set of primary instructions based on an action of the user in the virtual environment. Subsequent to generating, the method may comprise controlling the robot based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
[008] In yet another implementation, a non-transitory computer readable medium embodying a program executable in a computing device for enabling a user to remotely perform an action in a real world via a virtual reality environment is disclosed. In one aspect, the program may comprise a program code for receiving user data associated with a user in a virtual reality environment and a real-time video stream associated with a store in a real world. The program may comprise a program code for initiating multimedia associated with the store on a device providing the virtual reality environment to the user. The program may comprise a program code for assigning a robot in the store to the user in the virtual reality environment, based on the user data and a user location in the multimedia. The program may comprise a program code for generating a set of primary instructions based on an action of the user in the virtual environment. The program may comprise a program code for controlling the robot based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The foregoing detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present subject matter, an example of construction of the present subject matter is provided as figures; however, the invention is not limited to the specific method and system disclosed in the document and the figures.
[010] The present subject matter is described in detail with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to various features of the present subject matter.
[011] Figure 1 illustrates a network implementation of a system for enabling a user to remotely perform an action in a real world via a virtual reality environment, in accordance with an embodiment of the present subject matter.
[012] Figure 2 illustrates the system and its subcomponents for enabling a user to remotely perform an action in a real world via a virtual reality environment, in accordance with an embodiment of the present subject matter.
[013] Figure 3 illustrates a method for enabling a user to remotely perform an action in a real world via a virtual reality environment, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[014] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended, in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that, as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods for enabling a user to remotely perform an action in a real world via a virtual reality environment, similar or equivalent to those described herein, can be used in the practice or testing of embodiments of the present disclosure, the exemplary systems and methods are now described. The disclosed embodiments for enabling a user to remotely perform an action in a real world via a virtual reality environment are merely examples of the disclosure, which may be embodied in various forms.
[015] Various modifications to the embodiments will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments for enabling a user to remotely perform an action in a real world via a virtual reality environment. However, one of ordinary skill in the art will readily recognize that the present disclosure for enabling a user to remotely perform an action in a real world via a virtual reality environment is not intended to be limited to the embodiments described, but is to be accorded the widest scope consistent with the principles and features described herein.
[016] In an implementation, a system and method for enabling a user to remotely perform an action in a real world via a virtual reality environment is described. In an embodiment, biometric data of the user may be obtained. In one example, the biometric data may comprise fingerprint data, iris data, heartbeat data, and face recognition data. Upon obtaining the biometric data, the user of the device providing the virtual reality environment may be authenticated based on the biometric data. Further to authentication of the user, a set of secondary instructions for assisting the user in shopping at the store in the real world may be generated, based on the user data, historical data, user buying behaviour, and financial capacity.
[017] In the embodiment, upon generation of the set of secondary instructions, user data associated with the user in a virtual reality environment and a real-time video stream associated with a store in a real world may be received. The user data may comprise user instructions and user preferences. Further to receiving, multimedia associated with the store may be initiated on a device providing the virtual reality environment to the user. The multimedia may be the real-time video stream or a virtual rendering of the real-time video stream. Upon initiating the multimedia, a robot in the store may be assigned to the user in the virtual reality environment, based on the user data and a user location in the multimedia. In one example, the user location may be understood as the location of the user in the virtual environment analogous to the location in the store.
[018] Further to assigning the robot, a set of primary instructions may be generated based on an action of the user in the virtual environment. In one example, the action may be selecting a product in the store or picking up a product in the store. Subsequent to generating the set of primary instructions, the robot may be controlled based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
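By way of a non-limiting illustration, the overall flow described in paragraphs [016] to [018] may be sketched in Python as follows; the class, method, and robot names are illustrative assumptions and form no part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of the authenticate -> assign -> act pipeline; every
# identifier below is illustrative and does not appear in the disclosure.

@dataclass
class RemoteShoppingSession:
    user_id: str
    authenticated: bool = False
    assigned_robot: Optional[str] = None
    primary_instructions: list = field(default_factory=list)

    def authenticate(self, biometric_ok: bool) -> bool:
        # Authenticate the user of the VR device from biometric data.
        self.authenticated = biometric_ok
        return self.authenticated

    def assign_robot(self, robots_by_section: dict, user_section: str) -> str:
        # Assign a robot near the store section corresponding to the
        # user's location in the multimedia.
        self.assigned_robot = robots_by_section[user_section]
        return self.assigned_robot

    def act(self, action: str, product: str) -> dict:
        # Translate a virtual-world action into a primary instruction
        # for the assigned robot to execute in the real store.
        instruction = {"robot": self.assigned_robot,
                       "action": action, "product": product}
        self.primary_instructions.append(instruction)
        return instruction

session = RemoteShoppingSession("user-1")
session.authenticate(biometric_ok=True)
session.assign_robot({"clothes": "robot-3"}, "clothes")
print(session.act("pick", "shirt-42"))
```

The sketch deliberately omits the video stream and networking details; it only orders the steps the paragraphs above describe.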
[019] Referring now to Figure 1, a network implementation 100 of a system 102 for enabling a user to remotely perform an action in a real world via a virtual reality environment, in accordance with an embodiment of the present subject matter, may be described. In one embodiment, the present subject matter is explained considering that the system 102 is implemented as a system connected to the network 106. It may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a cloud-based computing environment, a mobile device, and the like. In another example, the system 102 may be installed within a device 104 connected to a network 106. In one embodiment, the system 102 is communicatively coupled to one or more cameras 112 and one or more robots 108 in a store. In one example, the robot 108 may be a dedicated robot per user. In another example, the robot 108 may be a robot hand dedicated to a user based on the action to be performed. In one example, the robot hands 108 may be located around the store such that each robot hand has access to the products stored on the shelves.
[020] In one example, the device 104 may be a virtual reality headset such as PlayStation VR™, Oculus Rift™, and HTC Vive™. It may also be understood that the system 102 supports a plurality of browsers and all viewports. Examples of the plurality of browsers may include, but are not limited to, Chrome™, Mozilla™, Internet Explorer™, Safari™, and Opera™. It will also be understood that the system 102 may be accessed by multiple users through one or more controllers. In one example, the controllers may be a joystick, a game pad, haptic gloves, and the like. Further, the system may be communicatively coupled to sensors located on a user. Furthermore, the system 102 may be communicatively coupled to a database for storing data. In one example, the database may be a relational database or the like.
[021] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), Wireless Personal Area Network (WPAN), Wireless Local Area Network (WLAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, MQ Telemetry Transport (MQTT), Extensible Messaging and Presence Protocol (XMPP), Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[022] In one example, the system 102 may receive user data associated with a user in a virtual reality environment from the device 104 and a real-time video stream associated with a store in a real world from the one or more cameras 112 over the network 106. Upon receiving, the system 102 may initiate multimedia associated with the store on the device 104 providing the virtual reality environment to the user over the network 106. The multimedia may be the real-time video stream or a virtual rendering of the real-time video stream. Further to initiating, the system 102 may assign the robot 108 in the store to the user in the virtual reality environment, based on the user data and a user location in the multimedia, over the network 106. The robot assigned may be a dedicated robot to perform all the actions of the user or a robot hand assigned to perform a single action. Further, a new robot hand may be reassigned to the user based on the action to be performed. Subsequent to assigning the robot 108, the system 102 may generate a set of primary instructions based on an action of the user in the virtual environment on the device 104. Upon generating the set of primary instructions, the system may control the robot 108 based on the set of primary instructions over the network 106, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
[023] Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[024] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[025] The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[026] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a receiving module 212, an assigning module 214, a controlling module 216, and other modules 218. The other modules 218 may include programs or coded instructions that supplement applications and functions of the system 102. The modules 208 described herein may be implemented as software modules that may be executed in the cloud-based computing environment of the system 102.
[027] The memory 206, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. In one implementation, the memory may include data 210. Further, the data 210 may include system data 220 for storing data processed, computed, received, and generated by one or more of the modules 208. Furthermore, the data 210 may include other data 222 for storing data generated as a result of the execution of one or more modules in the other modules 218.
[028] In one implementation, at first, a user may use a controller or a mobile device 104 to access the system 102 via the I/O interface 204. The user may register using the I/O interface 204 in order to use the system 102. In one aspect, the user may access the I/O interface 204 of the system 102 for obtaining information or providing input information. In one implementation, the system 102 may automatically provide information to the user through the I/O interface 204 of the device 104.
RECEIVING MODULE 212
[029] Referring to Figure 2, in an embodiment, the receiving module 212 may obtain biometric data of the user. In one example, the biometric data may comprise fingerprint data, iris data, heartbeat data, and face recognition data. In one example, the biometric data may be obtained from the device 104. Upon obtaining the biometric data, the receiving module 212 may authenticate the user of the device 104 providing the virtual reality environment based on the biometric data.
[030] Subsequent to the authentication, the receiving module 212 may generate a set of secondary instructions for assisting the user in shopping at the store in the real world, based on user preferences (for example, maximum spending = 5000 INR), historical data, user buying behaviour (for example, frequent premium brand buying), and financial capacity (for example, money available in the bank account = 2500 INR). In one example, the set of instructions may assist the user to buy a product.
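By way of a non-limiting illustration, the derivation of secondary instructions from the example figures above (a stated spending preference of 5000 INR against 2500 INR available in the bank account) may be sketched as follows; the function name, field names, and instruction vocabulary are illustrative assumptions.

```python
# Hypothetical sketch: derive secondary instructions from preferences,
# financial capacity, and buying history, as described in paragraph [030].

def generate_secondary_instructions(preferences, financials, history):
    instructions = []
    # Cap spending at the lower of the stated preference and the money
    # actually available in the bank account.
    budget = min(preferences.get("max_spending", float("inf")),
                 financials.get("available_funds", 0))
    instructions.append(("limit_spend", budget))
    # Steer suggestions toward brands the user habitually buys.
    if history.get("frequent_premium_buyer"):
        instructions.append(("suggest_brands", "premium"))
    return instructions

result = generate_secondary_instructions(
    {"max_spending": 5000}, {"available_funds": 2500},
    {"frequent_premium_buyer": True})
print(result)  # [('limit_spend', 2500), ('suggest_brands', 'premium')]
```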
[031] In the embodiment, upon generating, the receiving module 212 may receive user data associated with a user in a virtual reality environment and a real-time video stream associated with a store in a real world. Further, the receiving module 212 may receive user data associated with a user in the real world. In one example, such data may comprise the real-world user's actions of picking an item, selecting an item, checking an item, or enquiring about an item.
[032] In one example, the real-time video stream may be received from a set of primary cameras 112 installed within the store and a set of secondary cameras installed on the robot 108. The receiving module 212 may store the real-time video stream in the system data 220.
[033] Further to receiving, the receiving module 212 may initiate multimedia associated with the store on a device providing the virtual reality environment to the user. In one example, the multimedia may be the real-time video stream. In another example, the multimedia may be a virtual rendering of the real-time video stream. The virtual rendering may be understood as conversion of the real-time video stream into a three-dimensional rendering. In the embodiment, the receiving module 212 may modify the multimedia in the virtual reality environment based on one or more user preferences, the user data associated with a user in the real world, and the set of secondary instructions.
ASSIGNING MODULE 214
[034] Further in the embodiment, upon initiating the multimedia on the device 104, the assigning module 214 may assign a robot 108 in the store to the user in the virtual reality environment, based on the user data and a user location in the multimedia. In one example, the user data may comprise a user preference, for example, buying clothes within 1000 INR. In another example, the user location may be understood as the location of the user in the multimedia in the virtual reality environment. For example, a user in the clothes section of the multimedia in the virtual reality environment will be assigned the robot near the corresponding clothes section of the store. In one example, the robot may be a dedicated robot assigned to perform all the actions of the user in the virtual environment. In another example, the robot may be a robot configured to perform a single action out of the plurality of actions of the user in the virtual environment. Further, the assigning module 214 may store the assignment in the system data 220.
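By way of a non-limiting illustration, the location-based assignment described above may be sketched as follows; the robot inventory, section names, and the idle-robot selection policy are illustrative assumptions and form no part of the disclosure.

```python
# Hypothetical sketch: pick a robot stationed in the store section that
# mirrors the user's location in the virtual reality environment.

def assign_robot(user_section, robots):
    """Return the id of an idle robot in the matching store section."""
    candidates = [r for r in robots
                  if r["section"] == user_section and r["idle"]]
    if not candidates:
        raise LookupError(f"no idle robot near section {user_section!r}")
    return candidates[0]["id"]

robots = [
    {"id": "hand-1", "section": "clothes", "idle": False},
    {"id": "hand-2", "section": "clothes", "idle": True},
    {"id": "hand-3", "section": "grocery", "idle": True},
]
print(assign_robot("clothes", robots))  # hand-2
```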
[035] Subsequent to assigning the robot 108, the assigning module 214 may generate a set of primary instructions based on an action of the user in the virtual environment. In one example, the user may perform an action of picking up a product in the virtual environment; the assigning module 214 may generate a set of primary instructions mimicking the action. Further, the assigning module 214 may store the set of primary instructions in the system data 220.
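By way of a non-limiting illustration, the generation of primary instructions mimicking a user action may be sketched as follows; the instruction vocabulary and shelf/slot identifiers are illustrative assumptions and form no part of the disclosure.

```python
# Hypothetical sketch: expand a virtual-world action into a sequence of
# primary instructions the assigned robot could execute in the store.

def primary_instructions_for(action, shelf, slot):
    if action == "pick":
        return [("move_to", shelf),
                ("grasp", slot),
                ("place_in_basket", slot)]
    if action == "select":
        return [("move_to", shelf), ("point_at", slot)]
    raise ValueError(f"unsupported action: {action}")

print(primary_instructions_for("pick", "shelf-A3", "slot-7"))
```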
CONTROLLING MODULE 216
[036] In the embodiment upon generating a primary set of instructions, the controlling module 216 may control the robot 108 based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
[037] In one example, due to the presence of one or more entities in the shop, a conflict may occur between an entity in the shop and the user in the virtual world environment when controlling the robot 108. In this example, the controlling module 216 may obtain proximity data from one or more sensors located on the robot in the store and identify the presence of one or more entities in the proximity of the robot based on the proximity data. In one example, the one or more entities may be a robot or a real-world person. Further to identifying the entities, the controlling module 216 may identify an action of the entity in the real world based on the real-time video stream and further compare the action of the entity with the action of the user in the virtual world environment. In case the actions are the same, for example, picking the same product, moving to the same shelf, or reaching for the same product, the controlling module 216 may resolve the conflict between the user in the virtual environment and the entity in the store based on the comparison of the action of the user in the virtual environment and the action of the entity in the real world. In one example, the resolution may be based on which action was performed first. In another example, the entity physically present in the store may get preference over the user in the virtual world environment to perform the action.
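By way of a non-limiting illustration, the conflict-resolution rules above (the earlier action wins; on a tie, the entity physically present in the store is preferred) may be sketched as follows; the tuple layout and timestamps are illustrative assumptions and form no part of the disclosure.

```python
# Hypothetical sketch of conflict resolution between the virtual user and a
# real-world entity acting on the same product, per paragraph [037].

def resolve_conflict(virtual_action, real_action):
    """Each action is a tuple: (entity_kind, product, timestamp_seconds)."""
    if virtual_action[1] != real_action[1]:
        return None  # different products: no conflict to resolve
    # Rule 1: whoever acted first performs the action.
    if virtual_action[2] < real_action[2]:
        return virtual_action[0]
    if real_action[2] < virtual_action[2]:
        return real_action[0]
    # Rule 2 (tie): the physically present entity gets preference.
    return real_action[0]

print(resolve_conflict(("virtual_user", "shirt-42", 10.0),
                       ("store_customer", "shirt-42", 10.0)))
```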
[038] Further, the controlling module 216 may display a signal associated with the user. In one example, the signal may indicate the presence of the user in the store. Furthermore, the signal may be displayed on one or more of the shelves in the store and on the robots. In one example, the controlling module 216 may monitor the selection of products by the user in the virtual world environment and dispatch the same products to the user's physical address, upon payment for the products by the user.
[039] Exemplary embodiments for enabling a user to remotely perform an action in a real world via a virtual reality environment discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[040] Some embodiments of the system and the method enable an immersive shopping experience in the virtual reality environment.
[041] Some embodiments of the system and the method reduce, for a user in the virtual world environment, the feeling of purchasing from a warehouse or a purely virtual world.
[042] Some embodiments of the system and the method enable the user to pick a product by seeing the complete details of the piece, similar to a real-world user.
[043] Some embodiments of the system and the method resolve conflicts over selecting a product between the real-world user and the user in the virtual environment.
[044] Some embodiments of the system and the method provide an end-to-end process, from purchase to delivery of the goods to the consumer.
[045] Referring now to Figure 3, a method 300 for enabling a user to remotely perform an action in a real world via a virtual reality environment is shown, in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
[046] The order in which the method 300 for enabling a user to remotely perform an action in a real world via a virtual reality environment as described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above described system 102.
[047] At block 302, user data associated with a user in a virtual reality environment and a real-time video stream associated with a store in a real world may be received. In an implementation, the receiving module 212 may receive the user data associated with the user in the virtual reality environment and the real-time video stream associated with the store in the real world. Further, the receiving module 212 may store the user data and the real-time video stream in the system data 220.
[048] At block 304, multimedia associated with the store may be initiated on a device providing the virtual reality environment to the user. The multimedia may be one of the real-time video stream, a virtual rendering of the real-time video stream, and a combination of the virtual rendering and the real-time video stream. In an implementation, the receiving module 212 may initiate the multimedia associated with the store and may store the multimedia in the system data 220.
[049] At block 306, a robot in the store may be assigned to the user in the virtual reality environment, based on the user data and a user location in the multimedia. In the implementation, the assigning module 214 may assign a robot in the store to the user in the virtual reality environment and store the assignment data in the system data 220.
[050] At block 308, a set of primary instructions may be generated based on an action of the user in the virtual environment. In the implementation, the assigning module 214 may generate a set of primary instructions based on an action of the user in the virtual environment and may store the set of primary instructions in the system data 220.
[051] At block 310, the robot may be controlled based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment. In the implementation, the controlling module 216 may control the robot based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
[052] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include a method and system for enabling a user to remotely perform an action in a real world via a virtual reality environment.
[053] Although implementations for methods and systems for enabling a user to remotely perform an action in a real world via a virtual reality environment have been described in language specific to structural features and/or methods, it is to be understood that the
appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for enabling a user to remotely perform an action in a real world via a virtual reality environment.

WE CLAIM:
1. A method for enabling a user to remotely perform an action in a real world via a virtual reality environment, the method comprising:
receiving, by a processor, user data associated with a user in a virtual reality environment and a real-time video stream associated with a store in a real world;
initiating, by the processor, multimedia associated with the store on a device providing the virtual reality environment to the user, wherein the multimedia is one of the real-time video stream and a virtual rendering of the real-time video stream;
assigning, by the processor, a robot in the store to the user in the virtual reality environment, based on the user data and a user location in the multimedia;
generating, by the processor, a set of primary instructions, based on an action of the user in the virtual environment; and
controlling, by the processor, the robot based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
2. The method of claim 1, further comprising:
obtaining, by the processor, biometric data of the user, wherein the biometric data comprises one of fingerprint data, iris data, heartbeat data, and face recognition data;
authenticating, by the processor, the user of the device providing the virtual reality environment, based on the biometric data; and
generating, by the processor, a set of secondary instructions for assisting the user in shopping at the store in the real world, based on the authentication, the user data, historical data, user buying behaviour, and financial capacity.
3. The method of claim 2, further comprising modifying, by the processor, the multimedia in the virtual reality environment based on one or more user preferences and the set of secondary instructions.
4. The method of claim 1, further comprising displaying, by the processor, a signal associated with the user, wherein the signal is indicative of the presence of the user in the store, and wherein the signal is displayed on one or more of the shelves in the store and the robot.
5. The method of claim 1, further comprising:
obtaining, by the processor, proximity data from the robot in the store;
identifying, by the processor, a presence of one or more entities in the proximity of the robot based on the proximity data, wherein the one or more entities is one of another robot and a real-world person;
identifying, by the processor, an action of the entity in the real world based on the real-time video stream; and
resolving, by the processor, a conflict between the user in the virtual environment and the entity in the store based on a comparison of the action of the user in the virtual environment and the action of the entity in the real world.
6. The method of claim 1, wherein the real-time video stream is received from a set of primary cameras installed within the store and a set of secondary cameras installed on the robot.
7. A system for enabling a user to remotely perform an action in a real world via a virtual reality environment, the system comprising:
a memory; and
a processor coupled to the memory, wherein the processor is capable of executing instructions to perform steps of:
receiving user data associated with a user in a virtual reality environment and a real-time video stream associated with a store in a real world;
initiating multimedia associated with the store on a device providing the virtual reality environment to the user, wherein the multimedia is one of the real-time video stream and a virtual rendering of the real-time video stream;
assigning a robot in the store to the user in the virtual reality environment, based on the user data and a user location in the multimedia;
generating a set of primary instructions, based on an action of the user in the virtual environment; and
controlling the robot based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.
8. The system of claim 7, further comprising:
obtaining biometric data of the user, wherein the biometric data comprises one of fingerprint data, iris data, heartbeat data, and face recognition data;
authenticating the user of the device providing the virtual reality environment, based on the biometric data; and
generating a set of secondary instructions for assisting the user in shopping at the store in the real world, based on user preferences, historical data, user buying behaviour, and financial capacity.
9. The system of claim 8, further comprising modifying the multimedia in the virtual reality environment based on one or more user preferences and the set of secondary instructions.
10. The system of claim 7, further comprising displaying a signal associated with the user, wherein the signal is indicative of the presence of the user in the store, and wherein the signal is displayed on one or more of the shelves in the store and the robot.
11. The system of claim 7, wherein the processor is further capable of executing instructions to perform steps of:
obtaining proximity data from one or more sensors located on the robot in the store;
identifying a presence of one or more entities in the proximity of the robot based on the proximity data, wherein the one or more entities is one of another robot and a real-world person;
identifying an action of the entity in the real world based on the real-time video stream; and
resolving a conflict between the user in the virtual environment and the entity in the store based on a comparison of the action of the user in the virtual environment and the action of the entity in the real world.
12. The system of claim 7, wherein the real-time video stream is received from a set of primary cameras installed within the store and a set of secondary cameras installed on the robot.
13. A non-transitory computer program product having embodied thereon a computer program for enabling a user to remotely perform an action in a real world via a virtual reality environment, the computer program product storing instructions, the instructions comprising instructions for:
obtaining biometric data of the user, wherein the biometric data comprises one of fingerprint data, iris data, heartbeat data, and face recognition data;
authenticating the user of the device providing the virtual reality environment, based on the biometric data;
generating a set of secondary instructions for assisting the user in shopping at the store in the real world, based on the authentication, user preferences, historical data, user buying behaviour, and financial capacity;
receiving user data associated with the user in a virtual reality environment and a real-time video stream associated with a store in a real world based on the set of secondary instructions;
initiating multimedia on a device providing the virtual reality environment to the user, wherein the multimedia is one of the real-time video stream and a virtual rendering of the real-time video stream;
assigning one or more robots in the store to the user in the virtual reality environment, based on the user data;
generating a set of primary instructions, based on an action of the user in the virtual environment; and
controlling the one or more robots based on the set of primary instructions, thereby enabling the user to remotely perform the action in the real world via the virtual reality environment.

Documents

Application Documents

# Name Date
1 201611013284-IntimationOfGrant27-01-2024.pdf 2024-01-27
2 201611013284-PatentCertificate27-01-2024.pdf 2024-01-27
3 201611013284-Written submissions and relevant documents [24-11-2023(online)].pdf 2023-11-24
4 201611013284-FORM-26 [25-10-2023(online)].pdf 2023-10-25
5 201611013284-Correspondence to notify the Controller [23-10-2023(online)].pdf 2023-10-23
6 201611013284-US(14)-HearingNotice-(HearingDate-10-11-2023).pdf 2023-10-17
7 201611013284-Proof of Right [22-10-2021(online)].pdf 2021-10-22
8 201611013284-FER.pdf 2021-10-17
9 201611013284-FORM 13 [09-07-2021(online)].pdf 2021-07-09
10 201611013284-POA [09-07-2021(online)].pdf 2021-07-09
11 201611013284-FER_SER_REPLY [28-08-2020(online)].pdf 2020-08-28
12 201611013284-CLAIMS [28-08-2020(online)].pdf 2020-08-28
13 201611013284-COMPLETE SPECIFICATION [28-08-2020(online)].pdf 2020-08-28
14 201611013284-OTHERS [28-08-2020(online)].pdf 2020-08-28
15 201611013284-Correspondence-040816.pdf 2016-08-08
16 201611013284-OTHERS-040816.pdf 2016-08-08
17 Form 26 [28-07-2016(online)].pdf 2016-07-28
18 abstract.jpg 2016-07-20
19 Form 3 [15-04-2016(online)].pdf 2016-04-15
20 Form 20 [15-04-2016(online)].pdf 2016-04-15
21 Drawing [15-04-2016(online)].pdf 2016-04-15
22 Description(Complete) [15-04-2016(online)].pdf 2016-04-15

Search Strategy

1 Search201611013284_05-02-2020.pdf
2 2020-11-2216-18-14AE_24-11-2020.pdf

ERegister / Renewals

3rd: 15 Apr 2024 (From 15/04/2018 To 15/04/2019)
4th: 15 Apr 2024 (From 15/04/2019 To 15/04/2020)
5th: 15 Apr 2024 (From 15/04/2020 To 15/04/2021)
6th: 15 Apr 2024 (From 15/04/2021 To 15/04/2022)
7th: 15 Apr 2024 (From 15/04/2022 To 15/04/2023)
8th: 15 Apr 2024 (From 15/04/2023 To 15/04/2024)
9th: 15 Apr 2024 (From 15/04/2024 To 15/04/2025)
10th: 03 Apr 2025 (From 15/04/2025 To 15/04/2026)