
A System And Method For Providing Real Time Interaction Using Mixed, Virtual And Augmented Realities

Abstract: The present disclosure relates to the field of systems that provide real-time interaction using mixed, virtual and augmented realities, and envisages a system comprising a database (116), a Virtual Reality (VR) device (102), a user device (104), a communication means (106), and an electronic data collection unit (108). The Virtual Reality (VR) device (102) captures and processes real-time data associated with a participant, and the data is passed to the electronic data collection unit (108) via the communication means (106). The electronic data collection unit (108) generates tracking data for tracking a virtual image of the participant, based on a pre-determined set of tracking rules. The virtual image is generated based on the tracking data and a pre-determined set of virtual image generation rules. The real-time data exchanged between the virtual image and the participants is captured, based on a pre-determined set of data capturing rules, and transmitted to the VR device (102).


Patent Information

Application #
201721046512
Filing Date
23 December 2017
Publication Number
28/2019
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ipo@knspartners.com
Parent Application

Applicants

ZENSAR TECHNOLOGIES LIMITED
ZENSAR KNOWLEDGE PARK, PLOT # 4, MIDC, KHARADI, OFF NAGAR ROAD, PUNE-411014, MAHARASHTRA, INDIA

Inventors

1. KUMAR Anand Yashwanth
505, V Building, Jade Residences Wagholi, Pune- 412207, Maharashtra, India
2. NAMBIAR Ullas Balan
1086 Prestige Kensington Gardens, Bangalore 560013 Karnataka, India

Specification

FIELD
The present disclosure is related to a system for providing real-time interaction using mixed, virtual and augmented realities.

DEFINITION

The expression ‘mixed reality’ used in the context of this disclosure refers to, but is not limited to, a hybrid reality that merges real and virtual environments to produce a visualization, where physical and digital objects co-exist and interact in real time.

The expression ‘augmented reality’ used in the context of this disclosure refers to, but is not limited to, a live, direct or indirect view of a physical, real-world environment whose elements are augmented by a computer-generated sensory input such as sound, video, or graphics data.

The expression ‘virtual reality (VR)’ used in the context of this disclosure refers to, but is not limited to, an immersive multimedia or computer-simulated reality that simulates a physical presence in a real world or an imagined world, allowing the participant to interact in that world.

These definitions are in addition to those expressed in the art.
BACKGROUND
Typically, video conferencing provides global interaction between multiple participants who are at different locations. There are many systems that provide video conferencing, such as TeleCon, WebEx, Skype, Google Talk, and the like. However, video conferencing does not provide the feel of physical presence, i.e., face-to-face interaction, between the participants who are present at different locations and the participants who are physically present in a room. The physical presence of the participants makes more impact than being on a call or on a video.
Therefore, there is a need to provide a system that limits the aforementioned drawbacks by providing an immersive experience of bringing participants from the different locations to one room for an interactive assembly.
OBJECTS
Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are as follows:
It is an object of the present disclosure to ameliorate one or more problems of the prior art or to at least provide a useful alternative.
An object of the present disclosure is to provide a system and method for providing real-time interaction using mixed, virtual and augmented realities.
Another object of the present disclosure is to provide the system for creating a physical appearance of a virtual participant and other participants.
Yet another object of the present disclosure is to provide the system for facilitating a real-time interaction between a virtual participant and other participants.
Still another object of the present disclosure is to provide the system that operates at real-time processing speed.
Another object of the present disclosure is to provide a system that is simple and easy to operate.
Other objects and advantages of the present disclosure will be more apparent from the following description, which is not intended to limit the scope of the present disclosure.
SUMMARY
The present disclosure envisages a system for real-time interaction using mixed, virtual and augmented realities between a group of participants located in remote locations. The system comprises a Virtual Reality (VR) device associated with each participant, an electronic data collection unit in each of the remote locations where a participant is located, and a communication means. Each electronic data collection unit is configured to capture audio-visual data of its location and participant, further configured to transmit the captured data to each of the electronic data collection units where the other participants of the group are present, and still further configured to receive and display the audio-visual data received from the electronic data collection units associated with the other participants in the group. The communication means transmits data from the Virtual Reality (VR) device of a participant to the corresponding electronic data collection unit, and receives into the VR device the audio-visual data sent by the other participants of the group from their respective electronic data collection units.
In an embodiment, the system includes a virtual meeting room for interaction of the participants with the other participants virtually. An avatar creating means creates a virtual representation of each participant in the virtual meeting room. An allocator means allocates a seat in the virtual meeting room for each avatar. All electronic data collection units are interfaced with the avatar creating means and communicate with each other via the avatar creating means in the VR device of the participant, and further the participant is able to view audio and visual gestures of the other participants in avatar representation in the virtual meeting room.
In another embodiment, the system includes a database, a Virtual Reality (VR) device, a user device and a communication means. The database is configured to store a pre-determined set of tracking rules, a pre-determined set of virtual image generation rules, and a pre-determined set of data capturing rules.
The Virtual Reality (VR) device is configured to capture real-time data associated with the participant and to process the captured data. The user device is configured to cooperate with the VR device to receive the processed data. The communication means is configured to cooperate with the user device to receive the processed data.
The electronic data collection unit is configured to cooperate with the communication means and the database to receive the processed data. The electronic data collection unit is further configured to generate tracking data for tracking a virtual image of the participant, based on the processed data and the pre-determined set of tracking rules, and to generate the virtual image based on the tracking data and the pre-determined set of virtual image generation rules. The electronic data collection unit is still further configured to capture real-time data exchanged between the virtual image and the participants, based on the pre-determined set of data capturing rules, to process the captured data, and transmit the processed data to the VR device.
In yet another embodiment, the Virtual Reality (VR) device (102), the user device (104), the communication means (106), and the electronic data collection unit (108) are implemented using one or more processor(s).
In still another embodiment, the electronic data collection unit includes a tracking unit, a VR image generation unit and a real-time data processing unit. The tracking unit is configured to cooperate with the communication means and the database to receive the processed data. The tracking unit is further configured to generate tracking data for tracking a virtual image of the participant, based on the processed data and the pre-determined set of tracking rules. The VR image generation unit is configured to cooperate with the tracking unit to generate the virtual image based on the tracking data and the pre-determined set of virtual image generation rules. The real-time data processing unit is configured to cooperate with the VR image generation unit, to capture real-time data exchanged between the virtual image and the participants, based on the pre-determined set of data capturing rules. The real-time data processing unit is further configured to process the captured data, and transmit the processed data to the VR device.
In an embodiment, the real-time data processing unit includes a data capturing unit, an analysis unit and a transceiver. The data capturing unit is configured to cooperate with the VR image generation unit, and to capture real-time data exchanged with the virtual image, based on the pre-determined set of data capturing rules. The analysis unit is configured to cooperate with the data capturing unit, to analyze the received data, based on the pre-determined set of data analysis rules. The transceiver is configured to cooperate with the analysis unit, to transmit the analyzed data to the VR device.
In another embodiment, the real-time data can be selected from a group including, but not limited to, realistic images, videos, audios, and other sensations that simulate the participant's physical presence in a virtual or imaginary environment.
In yet another embodiment, the VR device is a head-mounted device and includes a display unit, and a headset.
In still another embodiment, the user device may be any electronic device which can be selected from a group including, but not limited to, a mobile phone, a laptop, a tablet, an iPad, a PDA, a notebook, a netbook, a smart device, a smart phone, a personal computer, and a handheld device.
In an embodiment, the virtual image is generated based on the realistic images, videos, audios, and other sensations including, but not limited to, a participant's position, movements, and actions, and can be represented by a virtual image or an avatar.
In another embodiment, the data capturing unit is configured to capture live conversation between the group of participants and the virtual image of the participant.
In yet another embodiment, the data capturing unit includes a 360 degree VR camera.
In still another embodiment, the communication means can be wired and/or wireless.
The present disclosure envisages a method for real-time interaction using mixed, virtual and augmented realities between a group of participants located in remote locations.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWING
A system and method for providing real-time interaction using mixed, virtual and augmented realities of the present disclosure will now be described with the help of the accompanying drawing, in which:
Figure 1 illustrates a schematic block diagram of a system for providing real-time interaction using mixed, virtual and augmented realities, in accordance with an embodiment of the present disclosure;
Figure 2 illustrates a schematic block diagram of a VR device 102 interacting with the electronic data collection unit 108, and avatar creating means 124;
Figure 3 illustrates a schematic block diagram of a Virtual Meeting Room; and
Figures 4a and 4b illustrate a flowchart of a method for providing real-time interaction using mixed, virtual and augmented realities.
LIST OF REFERENCE NUMERALS USED IN THE DESCRIPTION AND DRAWING

Reference Numeral Reference
100 System
102 Virtual Reality (VR) device
104 User device
106 Communication means
108 Electronic data collection unit
110 Tracking unit
112 VR image generation unit
114 Real-time data processing unit
116 Database
118 Data capturing unit
120 Analysis unit
122 Transceiver
124 Avatar creating means
126 Allocator means

DETAILED DESCRIPTION
Embodiments of the present disclosure will now be described with reference to the accompanying drawing.
Embodiments are provided so as to thoroughly and fully convey the scope of the present disclosure to the person skilled in the art. Numerous details are set forth, relating to specific components, and methods, to provide a complete understanding of embodiments of the present disclosure. It will be apparent to the person skilled in the art that the details provided in the embodiments should not be construed to limit the scope of the present disclosure. In some embodiments, well-known processes, well-known apparatus structures, and well-known techniques are not described in detail.
The terminology used in the present disclosure is only for the purpose of explaining a particular embodiment and such terminology shall not be considered to limit the scope of the present disclosure. As used in the present disclosure, the forms "a," "an," and "the" may be intended to include the plural forms as well, unless the context clearly suggests otherwise. The terms "comprises," "comprising," "including," and "having" are open-ended transitional phrases and therefore specify the presence of stated features, integers, steps, operations, elements, modules, units and/or components, but do not forbid the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The particular order of steps disclosed in the method and process of the present disclosure is not to be construed as necessarily requiring their performance as described or illustrated. It is also to be understood that additional or alternative steps may be employed.
When an element is referred to as being "mounted on," “engaged to,” "connected to," or "coupled to" another element, it may be directly on, engaged, connected or coupled to the other element. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed elements.
The terms first, second, third, etc., should not be construed to limit the scope of the present disclosure as the aforementioned terms may be only used to distinguish one element, component, region, layer or section from another component, region, layer or section. Terms such as first, second, third etc., when used herein do not imply a specific sequence or order unless clearly suggested by the present disclosure.
Terms such as “inner,” “outer,” "beneath," "below," "lower," "above," "upper," and the like, may be used in the present disclosure to describe relationships between different elements as depicted from the figures.
A system for providing real-time interaction using mixed, virtual and augmented realities of the present disclosure is described with reference to Figure 1 of the accompanying drawing.

The system for providing real-time interaction using mixed, virtual and augmented realities between a group of participants located in remote locations (hereinafter referred to as "system") (100) comprises a database (116), a VR device (102), a user device (104), a communication means (106), and an electronic data collection unit (108).

The database (116) is configured to store a pre-determined set of tracking rules, a pre-determined set of virtual image generation rules, and a pre-determined set of data capturing rules.
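The disclosure does not prescribe how these rule sets are stored. A minimal sketch follows, assuming a key-value table with JSON-encoded rule bodies; the rule-set names and the example tracking fields (a confidence threshold, a joint list) are illustrative assumptions, not part of the specification.

```python
import json
import sqlite3

RULE_SETS = ("tracking", "virtual_image_generation", "data_capturing")

class RuleDatabase:
    """Stores the three pre-determined rule sets used by the system (100)."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS rules (name TEXT PRIMARY KEY, body TEXT)"
        )

    def store(self, name, rules):
        if name not in RULE_SETS:
            raise ValueError(f"unknown rule set: {name}")
        self.conn.execute(
            "INSERT OR REPLACE INTO rules VALUES (?, ?)", (name, json.dumps(rules))
        )
        self.conn.commit()

    def load(self, name):
        row = self.conn.execute(
            "SELECT body FROM rules WHERE name = ?", (name,)
        ).fetchone()
        return json.loads(row[0]) if row else None

db = RuleDatabase()
db.store("tracking", {"min_confidence": 0.8, "joints": ["head", "hands"]})
print(db.load("tracking"))  # -> {'min_confidence': 0.8, 'joints': ['head', 'hands']}
```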

The Virtual Reality (VR) device (102) is associated with each participant and is a head-mounted device worn by the participant. The participant is located at a remote location. The VR device (102) includes a display unit and a headset (not shown in figure). The display unit is configured to display a 360-degree three-dimensional virtual view of a room to the participant. The headset is an audio device that provides three-dimensional sound to the participant. The VR device (102) is configured to capture real-time data associated with a participant and process the captured data. In an embodiment, the processed data can be selected from a group including, but not limited to, realistic images, videos, audios, and other sensations that simulate the participant's physical presence in a virtual or imaginary environment. In one embodiment, the VR device (102) is a gear device.
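As a hedged illustration of this capture-and-process step, the sketch below models the VR device (102) with stub sensors and treats "processing" as simple compression; the ProcessedSample structure and the StubSensor interface are assumptions made for the example.

```python
import time
import zlib
from dataclasses import dataclass

@dataclass
class ProcessedSample:
    timestamp: float
    video_frame: bytes
    audio_chunk: bytes

class StubSensor:
    """Stands in for the headset's camera or microphone driver."""
    def __init__(self, payload: bytes):
        self.payload = payload

    def read(self) -> bytes:
        return self.payload

def capture_and_process(camera: StubSensor, mic: StubSensor) -> ProcessedSample:
    # "Processing" is modelled here as simple compression; a real headset
    # might also attach head-pose, gaze, or controller data to each sample.
    return ProcessedSample(
        timestamp=time.time(),
        video_frame=zlib.compress(camera.read()),
        audio_chunk=zlib.compress(mic.read()),
    )

sample = capture_and_process(StubSensor(b"raw-frame"), StubSensor(b"raw-pcm"))
print(sample.timestamp, len(sample.video_frame))
```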
The user device (104) is configured to cooperate with the VR device (102) to receive the processed data. In an embodiment, the user device (104) may be any electronic device which can be selected from a group including, but not limited to, a mobile phone, a laptop, a tablet, an iPad, a PDA, a notebook, a netbook, a smart device, a smart phone, a personal computer, a handheld device and the like.
In an embodiment, a participant can be in a room that is remotely located from the other participants employing the user device (104) and the VR device (102). In an embodiment, the user device (104) may include any device that transmits signals and/or real-time data to the network. In one embodiment, the user device (104) may include any type of a communication device, but is not limited to, a voice over Internet protocol (VoIP) device, a telephone (POTS), and the like.
The communication means (106) is configured to cooperate with the user device (104) to receive the processed data. The communication means (106) transmits data from the Virtual Reality (VR) device (102) of the participant to the corresponding electronic data collection unit (108), and receives into the VR device (102) the audio-visual data sent by the other participants of the group from their respective electronic data collection units (108). The communication means (106) can be wired and/or wireless.
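The transport itself is left open by the disclosure (wired and/or wireless). The sketch below models the communication means (106) as two in-process queues, one toward the participant's electronic data collection unit (108) and one back into the VR device (102); the queue-based interface is an assumption for illustration.

```python
import queue

class CommunicationMeans:
    """Two one-way channels: participant -> own collection unit,
    and remote collection units -> participant's VR device."""

    def __init__(self):
        self.to_collection_unit = queue.Queue()
        self.to_vr_device = queue.Queue()

    def send_processed_data(self, data):
        # Forward the participant's processed data to the corresponding
        # electronic data collection unit (108).
        self.to_collection_unit.put(data)

    def deliver_remote_audio_visual(self, data):
        # Audio-visual data arriving from other participants' units.
        self.to_vr_device.put(data)

    def receive_for_vr_device(self, timeout=0.1):
        try:
            return self.to_vr_device.get(timeout=timeout)
        except queue.Empty:
            return None

link = CommunicationMeans()
link.send_processed_data({"participant": "A", "frame": b"..."})
link.deliver_remote_audio_visual({"from": "B", "frame": b"..."})
print(link.receive_for_vr_device())
```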

The electronic data collection unit (108) is installed in each of the remote locations where a participant is located, and is configured to capture audio-visual data of the location and the participant. It is further configured to transmit the captured data to each of the electronic data collection units (108) where the other participants of the group are present, and is still further configured to receive and display the audio-visual data received from the electronic data collection units (108) associated with the other participants in the group.

The electronic data collection unit (108) is configured to cooperate with the communication means (106) and the database (116) to receive the processed data. The electronic data collection unit (108) is further configured to generate tracking data for tracking a virtual image of the participant, based on the processed data and the pre-determined set of tracking rules, and to generate the virtual image based on the tracking data and the pre-determined set of virtual image generation rules. The electronic data collection unit (108) is still further configured to capture real-time data exchanged between the virtual image and the participants, based on the pre-determined set of data capturing rules, to process the captured data, and transmit the processed data to the VR device (102).

The electronic data collection unit (108) includes a tracking unit (110), a VR image generation unit (112), and a real-time data processing unit (114).

The tracking unit (110) is configured to cooperate with the communication means (106) and the database (116) to receive the processed data. The tracking unit (110) is further configured to generate tracking data for tracking a virtual image of the participant, based on the processed data and the pre-determined set of tracking rules.

The VR image generation unit (112) is configured to cooperate with the tracking unit (110) to generate the virtual image based on the tracking data and the pre-determined set of virtual image generation rules.

In another embodiment, the VR image generation unit (112) generates a virtual image of the participant and is further configured to represent a fully three-dimensional image of the participant in an appropriate place where a group of participants is present. In one embodiment, the three-dimensional image of the participant can be seen without the aid of glasses or other optics. In another embodiment, the VR image generation unit (112) includes, but is not limited to, a HoloLens and a hologram device.
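The following sketch illustrates how the tracking unit (110) and the VR image generation unit (112) might chain together. The rule shapes (a joint list with a confidence threshold, an avatar scale) and the TrackingData and VirtualImage structures are invented for the example; the disclosure specifies only that tracking data is derived from the processed data under the tracking rules, and the virtual image from the tracking data under the image generation rules.

```python
from dataclasses import dataclass, field

@dataclass
class TrackingData:
    participant_id: str
    joint_positions: dict  # joint name -> (x, y, z)

@dataclass
class VirtualImage:
    participant_id: str
    mesh: list = field(default_factory=list)  # placeholder for 3-D geometry

def generate_tracking_data(processed, tracking_rules) -> TrackingData:
    # Keep only the joints named by the rules, above a confidence threshold.
    joints = {
        name: position
        for name, (position, confidence) in processed["detections"].items()
        if name in tracking_rules["joints"]
        and confidence >= tracking_rules["min_confidence"]
    }
    return TrackingData(processed["participant_id"], joints)

def generate_virtual_image(tracking, generation_rules) -> VirtualImage:
    # Stand-in for skinning an avatar mesh onto the tracked skeleton.
    mesh = [(joint, position, generation_rules["avatar_scale"])
            for joint, position in tracking.joint_positions.items()]
    return VirtualImage(tracking.participant_id, mesh)

processed = {
    "participant_id": "A",
    "detections": {"head": ((0.0, 1.7, 0.0), 0.95), "hand": ((0.3, 1.1, 0.2), 0.6)},
}
tracking = generate_tracking_data(
    processed, {"joints": ["head", "hand"], "min_confidence": 0.8}
)
print(generate_virtual_image(tracking, {"avatar_scale": 1.0}))
```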

The real-time data processing unit (114) is configured to cooperate with the VR image generation unit (112), to capture real-time data exchanged between the virtual image and the participants, based on the pre-determined set of data capturing rules. The real-time data processing unit (114) is further configured to process the captured data, and transmit the processed data to the VR device (102).

The real-time data processing unit (114) includes a data capturing unit (118), an analysis unit (120) and a transceiver (122). The data capturing unit (118) is configured to cooperate with the VR image generation unit (112), to capture real-time data exchanged with the virtual image, based on the pre-determined set of data capturing rules.

The data capturing unit (118) is installed in a position such that its optical range covers a plurality of participants sitting in the room. The data capturing unit (118) may be installed in a pre-defined area. The data capturing unit (118) is configured to capture the live conversation between the group of participants and the virtual image of the participant, and is further configured to generate a video signal. The video signal includes a plurality of frames. In an embodiment, the data capturing unit (118) is rotated in a clockwise and/or anticlockwise direction to capture images of each of the participants in the pre-defined area. In one embodiment, each frame is taken for a group of participants who attend a pre-defined session, wherein each participant's face image represents one frame. In another embodiment, the data capturing unit (118) may be a 360-degree VR camera.
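A minimal sketch of this sweep follows, assuming a fixed mapping from camera angle to participant seat. The Frame structure and the angle values are illustrative; the one-frame-per-participant video signal comes from the passage above.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    angle_deg: float
    participant_id: str
    image: bytes

def sweep(participant_seats, grab_image):
    """participant_seats maps a camera angle (degrees) to a participant id;
    grab_image(angle) stands in for reading the sensor at that heading."""
    frames = []
    for angle in sorted(participant_seats):  # one clockwise rotation
        frames.append(Frame(angle, participant_seats[angle], grab_image(angle)))
    return frames  # the "video signal": one frame per participant face

frames = sweep({0: "A", 90: "B", 180: "C"}, lambda a: f"jpeg@{a}".encode())
print([f.participant_id for f in frames])  # -> ['A', 'B', 'C']
```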

The analysis unit (120) is configured to cooperate with the data capturing unit (118), to analyze the received data, based on the pre-determined set of data analysis rules.
The transceiver (122) is configured to cooperate with the analysis unit (120), to transmit the analyzed data to the VR device (102).
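Taken together, the data capturing unit (118), analysis unit (120) and transceiver (122) form a capture-analyze-transmit pipeline. The sketch below assumes a hypothetical analysis rule (a minimum payload size), since the pre-determined data analysis rules are not spelled out in the disclosure, and models the transceiver as a callback rather than a network socket.

```python
def analysis_unit(captured, analysis_rules):
    # Hypothetical analysis rule: keep only payloads above a minimum size.
    return [item for item in captured if len(item) >= analysis_rules["min_bytes"]]

def transceiver(analyzed, send):
    # Push each analyzed item toward the VR device (102); `send` stands in
    # for whatever socket or radio link the transceiver (122) actually uses.
    for item in analyzed:
        send(item)

captured = [b"frame-1", b"x", b"frame-3"]
transceiver(analysis_unit(captured, {"min_bytes": 3}), print)
```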

In an embodiment, the virtual image is generated based on the realistic images, videos, audios, and other sensations including, but not limited to, a participant's position, movements, and actions, and can be represented by a virtual image or an avatar.
The electronic data collection unit (108) is configured to cooperate with the real-time data processing unit (114) to receive the analyzed data. The electronic data collection unit (108) is configured to analyze the plurality of frames, and generate a three-dimensional session. The generated three-dimensional session is then transmitted to the VR device (102). The display unit of the VR device (102) represents the three-dimensional session to the participant.
In one embodiment, the system (100) provides the real-time interaction between the virtual participant and other participants, where the virtual participant and other participants can talk to each other, exchange digital artifacts, and/or present digital information.
The Virtual Reality (VR) device (102), the user device (104), the communication means (106), and the electronic data collection unit (108) are implemented using one or more processor(s).
In an embodiment, the system (100) includes a virtual meeting room for interaction of the participants with the other participants virtually. An avatar creating means (124) creates a virtual representation of each participant in the virtual meeting room. An allocator means (126) allocates a seat in the virtual meeting room for each avatar. All electronic data collection units (108) are interfaced with the avatar creating means (124) and communicate with each other via the avatar creating means (124) in the VR device (102) of the participant, and further the participant is able to view audio and visual gestures of the other participants in avatar representation in the virtual meeting room.
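A sketch of the avatar creating means (124) and allocator means (126) under simple assumptions: avatars are plain records and seats are handed out in order up to a fixed seat count. The disclosure requires only that each participant receives an avatar and each avatar a seat in the virtual meeting room.

```python
import itertools

class VirtualMeetingRoom:
    def __init__(self, seat_count):
        self.seat_count = seat_count
        self.seats = {}                     # seat index -> avatar
        self._next_seat = itertools.count()

    def create_avatar(self, participant_id):
        # Stand-in for building a 3-D representation of the participant.
        return {"participant": participant_id, "gestures": []}

    def allocate_seat(self, avatar):
        seat = next(self._next_seat)
        if seat >= self.seat_count:
            raise RuntimeError("virtual meeting room is full")
        self.seats[seat] = avatar
        return seat

room = VirtualMeetingRoom(seat_count=8)
for pid in ("A", "B", "C"):
    print(pid, "-> seat", room.allocate_seat(room.create_avatar(pid)))
```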
The processor is configured to cooperate with a memory to receive and process the pre-determined rules to obtain a set of system operating commands. The processor may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any device that manipulates signals based on operational instructions. Among other capabilities, the processor is configured to fetch and execute the set of pre-determined rules stored in the memory to control modules of the system (100).
Figure 2 illustrates the VR device 102 interacting with the electronic data collection unit 108 and the avatar creating means 124. VR devices 102a to 102g interact with the electronic data collection units 108a to 108d, and the avatar creating means 124 creates an avatar for each of the participants involved.
Figure 3 illustrates the Virtual Meeting Room, where the participants can view other participants in the form of avatars. The avatars can be seen sitting on the chairs in the Virtual Meeting Room and interacting with each other. The allocator means 126 allocates the seat in the virtual meeting room for each avatar.

Figures 4a and 4b illustrate a flowchart of a method for providing real-time interaction using mixed, virtual and augmented realities. The method comprises the following steps (a condensed sketch of the full chain is given after the list):
• Step 202- storing, by a database (116), a pre-determined set of tracking rules, a pre-determined set of virtual image generation rules, and a pre-determined set of data capturing rules;
• Step 204- capturing, by a Virtual Reality (VR) device (102), real-time data associated with the participant and processing the captured data;
• Step 206- receiving, by a user device (104), the processed data;
• Step 208- receiving, by a communication means (106), the processed data;
• Step 210- generating, by an electronic data collection unit (108), tracking data for tracking a virtual image of the participant based on the processed data and the pre-determined set of tracking rules;
• Step 212- generating, by the electronic data collection unit (108), the virtual image based on the tracking data and said pre-determined set of virtual image generation rules;
• Step 214- capturing, by the electronic data collection unit (108), real-time data exchanged between the virtual image and the participants, based on the pre-determined set of data capturing rules, and processing the captured data; and
• Step 216- transmitting, by the electronic data collection unit (108), the processed data to the VR device (102).
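The condensed sketch below chains steps 202 through 216 into a single round; the rule dictionary stands in for the database of step 202, every intermediate value is a stub for the corresponding component, and only the order of operations is taken from the flowchart.

```python
def run_interaction_round(raw, rules):
    processed = {"data": raw, "compressed": True}            # steps 204-208
    tracking = {"joints": rules["tracking"]["joints"],       # step 210
                "source": processed["data"]}
    virtual_image = {"avatar_of": raw["participant"],        # step 212
                     "pose": tracking["joints"]}
    exchanged = {"image": virtual_image, "reply": "hello"}   # step 214
    return exchanged                                         # step 216

rules = {"tracking": {"joints": ["head", "hands"]}}          # step 202
print(run_interaction_round({"participant": "A", "frame": b".."}, rules))
```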
TECHNICAL ADVANCEMENTS
The present disclosure described herein above has several technical advantages including, but not limited to, the realization of a system for real-time interaction using mixed, virtual and augmented realities, that:
• provides a physical appearance of a virtual participant and other participants;
• provides real-time interaction between a virtual participant and other participants;
• operates at real-time processing speed; and
• is simple and easy to operate.
The embodiments herein and the various features and advantageous details thereof are explained with reference to the non-limiting embodiments in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
The foregoing description of the specific embodiments so fully revealed the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
The use of the expression “at least” or “at least one” suggests the use of one or more elements or ingredients or quantities, as the use may be in the embodiment of the disclosure to achieve one or more of the desired objects or results.
Any discussion of documents, acts, materials, devices, articles or the like that has been included in this specification is solely for the purpose of providing a context for the disclosure. It is not to be taken as an admission that any or all of these matters form a part of the prior art base or were common general knowledge in the field relevant to the disclosure as it existed anywhere before the priority date of this application.
The numerical values mentioned for the various physical parameters, dimensions or quantities are only approximations and it is envisaged that the values higher/lower than the numerical values assigned to the parameters, dimensions or quantities fall within the scope of the disclosure, unless there is a statement in the specification specific to the contrary.
While considerable emphasis has been placed herein on the components and component parts of the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the disclosure. These and other changes in the preferred embodiment as well as other embodiments of the disclosure will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the disclosure and not as a limitation.
CLAIMS
WE CLAIM:
1. A system (100) for real-time interaction using mixed, virtual and augmented realities between a group of participants located in remote locations which comprises:
• a Virtual Reality (VR) device (102) associated with each participant;
• an electronic data collection unit (108) in each of said remote locations where a participant is located, configured to capture audio-visual data of said location and said participant and further configured to transmit said captured data to each of said electronic data collection units (108) where other participants of said group are present, and still further configured to receive and display said received audio-visual data from other said electronic data collection units (108) associated with said other participants in said group; and
• communication means (106) configured to
a. transmit data from said electronic data collection units (108) to at least one Virtual Reality (VR) device (102) associated to said electronic data collection unit (108); and
b. receive and transmit data between electronic data collection units (108) of said participants within said group via an avatar creating means (124).
2. The system (100) as claimed in claim 1, wherein said system (100) includes:
• a virtual meeting room for interaction of said participants with said other participants virtually;
• said avatar creating means (124) for creating a virtual representation of each participant in said virtual meeting room;
• an allocator means (126) for allocating a seat in said virtual meeting room for each avatar,
wherein said all electronic data collection units (108) are interfaced with said avatar creating means (124) and communicate with each other via said avatar creating means (124) in said VR device (102) of said participant, and further said participant is able to view audio and visual gestures of said other participants in avatar representation in said virtual meeting room.
3. The system (100) as claimed in claim 1, wherein said system (100) includes:
• a database (116) configured to store a pre-determined set of tracking rules, a pre-determined set of virtual image generation rules, and a pre-determined set of data capturing rules;
• a Virtual Reality (VR) device (102) configured to capture real-time data associated to said participant, said Virtual Reality (VR) device (102) further configured to process said captured data;
• a user device (104) configured to cooperate with said VR device (102) to receive said processed data;
• said communication means (106) configured to cooperate with said user device (104), to receive said processed data; and
• said electronic data collection unit (108) configured to cooperate with said communication means (106) and said database (116) to receive said processed data, said electronic data collection unit (108) further configured to generate tracking data for tracking a virtual image of said participant, based on said processed data and said pre-determined set of tracking rules, and to generate said virtual image based on said tracking data and said pre-determined set of virtual image generation rules, said electronic data collection unit (108) still further configured to capture real-time data exchanged between said virtual image and said participants, based on said pre-determined set of data capturing rules, to process said captured data, and transmit said processed data to said VR device (102),
wherein said Virtual Reality (VR) device (102), said user device (104), said communication means (106), and said electronic data collection unit (108) are implemented using one or more processor(s).
4. The system (100) as claimed in claim 1, wherein said electronic data collection unit (108) includes
• a tracking unit (110) configured to cooperate with said communication means (106) and said database (116) to receive said processed data, said tracking unit (110) further configured to generate tracking data for tracking a virtual image of said participant, based on said processed data and said pre-determined set of tracking rules;
• a VR image generation unit (112) configured to cooperate with said tracking unit (110) to generate said virtual image based on said tracking data and said pre-determined set of virtual image generation rules; and
• a real-time data processing unit (114) configured to cooperate with said VR image generation unit (112), to capture real-time data exchanged between said virtual image and said participants, based on said pre-determined set of data capturing rules, said real-time data processing unit (114) further configured to process said captured data, and transmit said processed data to said VR device (102).
5. The system (100) as claimed in claim 4, wherein said real-time data processing unit (114) includes:
• a data capturing unit (118) configured to cooperate with said VR image generation unit (112), to capture real-time data exchanged with said virtual image, based on said pre-determined set of data capturing rules;
• an analysis unit (120) configured to cooperate with said data capturing unit (118), to analyze said received data, based on said pre-determined set of data analysis rules; and
• a transceiver (122) configured to cooperate with said analysis unit (120), to transmit said analyzed data to said VR device (102).
6. The system (100) as claimed in claim 4, wherein said real-time data can be selected from a group including, but is not limited to, realistic images, videos, audios, and other sensations that simulate the participant's physical presence in a virtual or imaginary environment.
7. The system (100) as claimed in claim 1, wherein said VR device (102) is a head-mounted device and includes a display unit, and a headset.
8. The system (100) as claimed in claim 3, wherein said user device (104) may be any electronic device which can be selected from a group including, but is not limited to, a mobile phone, a laptop, a tablet, an iPad, a PDA, a notebook, a net book, a smart device, a smart phone, a personal computer, and a handheld device.
9. The system (100) as claimed in claim 3, wherein said virtual image is generated based on the realistic images, videos, audios, and other sensations including, but not limited to, a participant's position, movements, and actions, and can be represented by a virtual image or an avatar.
10. The system (100) as claimed in claim 5, wherein said data capturing unit (118) is configured to capture live conversation between said group of participants and said virtual image of said participant.
11. The system (100) as claimed in claim 10, wherein said data capturing unit (118) includes a 360 degree VR camera.
12. The system (100) as claimed in claim 3, wherein said communication means (106) can be wired and/or wireless.
13. A method for providing real-time interaction using mixed, virtual and augmented realities, said method comprising the following steps:
• storing, by a database (116), a pre-determined set of tracking rules, a pre-determined set of virtual image generation rules, and a pre-determined set of data capturing rules;
• capturing, by a Virtual Reality (VR) device (102), real-time data associated with said participant and processing said captured data;
• receiving, by a user device (104), said processed data;
• receiving, by a communication means (106), said processed data;
• generating, by an electronic data collection unit (108), tracking data for tracking a virtual image of said participant based on said processed data and said pre-determined set of tracking rules;
• generating, by said electronic data collection unit (108), said virtual image based on said tracking data and said pre-determined set of virtual image generation rules;
• capturing, by said electronic data collection unit (108), real-time data exchanged between said virtual image and said participants, based on said pre-determined set of data capturing rules, and processing said captured data; and
• transmitting, by said electronic data collection unit (108), said processed data to said VR device (102).

Documents

Application Documents

# Name Date
1 201721046512-STATEMENT OF UNDERTAKING (FORM 3) [23-12-2017(online)].pdf 2017-12-23
2 201721046512-PROVISIONAL SPECIFICATION [23-12-2017(online)].pdf 2017-12-23
3 201721046512-PROOF OF RIGHT [23-12-2017(online)].pdf 2017-12-23
4 201721046512-POWER OF AUTHORITY [23-12-2017(online)].pdf 2017-12-23
5 201721046512-FORM 1 [23-12-2017(online)].pdf 2017-12-23
6 201721046512-DRAWINGS [23-12-2017(online)].pdf 2017-12-23
7 201721046512-DECLARATION OF INVENTORSHIP (FORM 5) [23-12-2017(online)].pdf 2017-12-23
8 201721046512-ENDORSEMENT BY INVENTORS [21-12-2018(online)].pdf 2018-12-21
9 201721046512-DRAWING [21-12-2018(online)].pdf 2018-12-21
10 201721046512-COMPLETE SPECIFICATION [21-12-2018(online)].pdf 2018-12-21
11 Abstract1.jpg 2019-05-31
12 201721046512-FORM 18 [25-10-2019(online)].pdf 2019-10-25
13 201721046512-RELEVANT DOCUMENTS [18-10-2021(online)].pdf 2021-10-18
14 201721046512-FORM 13 [18-10-2021(online)].pdf 2021-10-18
15 201721046512-FER.pdf 2021-10-18
16 201721046512-OTHERS [13-12-2021(online)].pdf 2021-12-13
17 201721046512-FER_SER_REPLY [13-12-2021(online)].pdf 2021-12-13
18 201721046512-COMPLETE SPECIFICATION [13-12-2021(online)].pdf 2021-12-13
19 201721046512-CLAIMS [13-12-2021(online)].pdf 2021-12-13
20 201721046512-US(14)-HearingNotice-(HearingDate-20-01-2025).pdf 2024-12-24
21 201721046512-FORM-26 [16-01-2025(online)].pdf 2025-01-16
22 201721046512-Correspondence to notify the Controller [16-01-2025(online)].pdf 2025-01-16
23 201721046512-Written submissions and relevant documents [04-02-2025(online)].pdf 2025-02-04

Search Strategy

1 2021-06-2214-31-37E_22-06-2021.pdf