Abstract: The present invention relates to a system for providing a training process during weapons training in a virtual environment and, particularly, to a system for providing human-on-human combat training for weapon simulators. The system comprises one or more combat stations; one or more motion tracking devices for tracking multiple body movements of a trainee; one or more dummy weapons; one or more processing units; and one or more display systems. The processing unit is provided with inbuilt software for receiving, storing and processing the data received from the motion tracking device, determining the current posture of the trainee and replicating the posture onto the virtual human combatant in the virtual environment. Further, the combat stations are networked, permitting the virtual human combatant to be mapped to one or more trainees in different combat stations. Figure 1
SYSTEM AND METHOD OF PROVIDING VIRTUAL HUMAN ON HUMAN COMBAT TRAINING OPERATIONS

FIELD OF INVENTION
The present invention relates to a system for providing a training process during weapons training in a virtual environment and, particularly, to a system for providing human-on-human combat training for weapon simulators. Further, the present invention relates to a method of providing human-on-human combat training. The present invention provides an extremely realistic, lifelike scenario during virtual combat training operations. Further, the present invention enables combat training for multiple combatants engaged against each other and group-training exercises spread across different geographical locations.
BACKGROUND ART
Virtual combat training is a simulated combat environment designed to train military, homeland security and law enforcement personnel. Virtual combat training equips these personnel with the specific skills needed to confront and resolve potential and actual conflicts. Unlike the traditional form of live-action training in created combat environments, the new form of virtual combat training provides a highly realistic hands-on experience that closely resembles real combat situations.
Systems for training soldiers in the use of firearms under simulated combat conditions are well known. Also known in the prior art are systems that combine data obtained from various sensor devices to allow users to control the movements of their representation in a virtual world, especially in the fields of computer games, 3D simulations and virtual reality applications. Virtual training simulator systems consisting of vehicles equipped with armor and weapons similar to the vehicles used by military personnel in the field also exist. A major drawback of the existing simulator systems is that their simulated enemies move much more slowly and in a uniform fashion, unlike real combat situations where enemies attack and move faster.
US6215498 describes a networked computer based apparatus for creating a three dimensional virtual work environment wherein terminal users are depicted as avatars and their actions and information are input into the virtual work environment through their corresponding avatars. However, this is a terminal apparatus.
US 7791808 relates to a sport simulation having a wireless position tracker for continuously tracking and determining, in real time, the player's three dimensional positional changes. A virtual opponent is responsive to, and interactive with, the player in real time.
US 7542040 describes an apparatus for interfacing 3D movements of a user to control the locomotion of an avatar in a virtual environment. The tracking of the avatar is implemented by force sensor, position sensor and foot area tracking.
US5913727 relates to a game apparatus in which players interact with computer-generated images. Position sensing and impact generating means are also provided.
US 20060030383 describes a method and apparatus for controlling and providing force feedback to a user operating a human/computer interface device and interacting with a computer-generated simulation.
EP2141632 relates to an apparatus and a method for creating real-time movements of a three dimensional virtual character with a small number of sensors. The sensors used are measurement sensors.
US7826641 describes an apparatus and method for optically inferring an absolute pose of a manipulated object used by human users in a real three-dimensional environment.
Accordingly, an advantage would be obtained if the trainee were provided with an extremely realistic, lifelike scenario during virtual combat training operations by using motion tracking devices to determine the current posture of a user and replicating this posture on a human character in the virtual world for other trainees to view and fire upon. This is most beneficial during tactical training for the trainees. Various other features of the system and method of the present invention will become obvious to those skilled in the art upon reading the disclosure set forth hereinafter.
OBJECTS OF INVENTION
Thus the primary object of the present invention is to provide a system for virtual human-on-human combat training for weapon simulators.
A further object of the present invention is to provide virtual human-on-human combat training between an individual and a simulated second individual located remotely from the training facility.
A still further object of the present invention is to provide a virtual image that anticipates and predicts the movement of the trainee and changes accordingly.
A further object of the present invention is directed to a method for providing virtual combat training.
A further object of the present invention is directed to a method of providing video capture of the motions of a trainee and of projection of a combatant's (virtual human combatant) image onto a screen.
It is another object of the present invention, wherein the system is implemented with inbuilt software for receiving, storing and processing the data received from a motion tracking device, determining the current posture of the trainee and replicating the posture onto the virtual human combatant in the virtual environment.
It is another object of the present invention, wherein the trainee's movements are captured using the motion tracking device with the help of markers.
It is another object of the present invention, wherein using motion capture technology a virtual human combatant is created in the virtual reality environment.
It is another object of the present invention, wherein, using the markers, the current pose of the trainee in the real world is tracked and used as a reference for a digital model in the virtual environment.
It is another object of the present invention, wherein the combatant interacts with the other combatants in the virtual reality world.
It is another object of the present invention, wherein the movement of the other virtual human combatant is controlled by another trainee.
It is another object of the present invention, wherein the training could be extended across multiple combatants (opponents) engaged against each other and group-training exercises spread across different geographical locations.
It is another object of the present invention, wherein the trainees are thus co-located or network connected.
SUMMARY OF INVENTION
Thus, according to the basic aspect of the present invention, there is provided a system for providing virtual combat training using a training simulator, comprising:
One or more combat stations;
One or more motion tracking devices for tracking multiple body movements of a trainee;
One or more dummy weapons;
One or more processing units; and
One or more display systems, wherein the motion tracking device further comprises one or more passive markers positioned on the body and the dummy weapon, wherein the motion tracking device captures information on the position and orientation of the trainee's body and dummy weapon, wherein the dummy weapon is provided with a trigger sensing unit for the trainee to shoot one or more virtual human combatants, wherein the processing unit determines the position and orientation information and maps the same to the virtual human combatant in real time, wherein the display system, comprising one or more projectors and a screen, projects the virtual human combatant and provides an immersive display of the 3D virtual environment, and wherein the combat stations are networked, permitting the virtual human combatant to be mapped to one or more trainees in different combat stations.
It is another aspect of the present invention, wherein the motion tracking device captures both 3 Degrees of Freedom and 6 Degrees of Freedom motion and tracks the movement of the trainee in the immersive virtual environment.
It is a further aspect of the present invention, wherein the motion tracking device uses optical tracking based posture determination.
It is another aspect of the present invention, wherein the projector screen provides both a moving background scene and a variable three dimensional target image.
It is another aspect of the present invention, wherein the processing unit is provided with inbuilt software for receiving, storing and processing the information from the motion tracking device, determining the current posture of the trainee and replicating the posture onto the virtual human combatant in the virtual environment.
It is another aspect of the present invention, wherein the motion tracking technology is based on optical or infrared cameras or magnetic sensors or any other means for capturing both 3 Degrees of Freedom and 6 Degrees of Freedom motion.
It is another aspect of the present invention, wherein the system facilitates interaction between the trainee and the virtual human combatant or between the trainee and the trainee in a virtual networked environment.
It is another aspect of the present invention, wherein the system provides background scene and required environmental conditions for the combat training.
It is another aspect of the present invention, wherein the system anticipates and predicts the movement of the trainee and changes the virtual human combatant accordingly.
It is another aspect of the present invention, wherein the system calculates combative motions between two virtual human combatants.
It is another aspect of the present invention, wherein the training is for multiple combatants engaged against each other and group training exercises spread across different geographical locations.
In another aspect of the present invention, there is provided a method of providing virtual combat training using the said system, the method comprising:
Positioning of passive markers on trainee's body and dummy weapon;
Tracking multiple body movements of the trainee in real world using motion tracking device with the help of markers;
Capturing information on position and orientation of the trainee's body and dummy weapon;
Sending the captured information to a computer by communication means;
Determining the position and orientation information and mapping the same to a virtual human combatant in the virtual environment through inbuilt software;
Projecting the virtual human combatant on the display system; and
Providing immersive display of 3D virtual environment during virtual combat training operations.
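The steps of the method above can be sketched as a minimal capture-solve-map loop. This is an illustrative sketch only; all function and class names (`capture_markers`, `solve_pose`, `map_to_avatar`, `Pose`) are assumptions for illustration and are not part of the disclosed system.

```python
# Illustrative sketch of the capture -> solve -> map -> display pipeline
# described in the method steps above. The pose solver here is a trivial
# centroid; a real system would use full optical pose estimation.

from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple      # (x, y, z) of the tracked body/weapon
    orientation: tuple   # (roll, pitch, yaw); fixed here for simplicity

def capture_markers(frame):
    """Stand-in for the motion tracking device: returns the marker
    coordinates detected in one captured frame."""
    return frame["markers"]

def solve_pose(markers):
    """Stand-in for the processing unit: derives a pose from the
    tracked marker positions (here, simply their centroid)."""
    n = len(markers)
    centroid = tuple(sum(m[i] for m in markers) / n for i in range(3))
    return Pose(centroid, (0.0, 0.0, 0.0))

def map_to_avatar(pose, avatar):
    """Replicate the trainee's pose onto the virtual human combatant."""
    avatar["position"] = pose.position
    avatar["orientation"] = pose.orientation
    return avatar

# One iteration of the loop: head markers at roughly standing height.
frame = {"markers": [(0.0, 1.6, 0.0), (0.2, 1.6, 0.0), (0.1, 1.8, 0.0)]}
avatar = map_to_avatar(solve_pose(capture_markers(frame)), {})
```

In a full system this loop would run once per camera frame, with the resulting avatar state sent over the network to the other combat stations for display.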
BRIEF DESCRIPTION OF THE DRAWING
Figure 1: Illustrates the system according to the present invention.
DETAILED DESCRIPTION OF THE INVENTION WITH REFERENCE TO THE ACCOMPANYING DRAWING
The present invention is directed to a system and method for providing virtual combat training using a training simulator. The virtual combat training is provided between trainees in training facilities located remotely. Referring to Figure 1, the system for providing virtual combat training using a training simulator comprises one or more combat stations, one or more motion tracking devices, one or more dummy weapons, one or more processing units and one or more display systems. The motion tracking device further comprises multiple passive markers positioned on the trainee's body and dummy equipment (weapon), providing information on position and orientation. Markers are devices, or combinations of devices, which are placed on various parts of the trainee's body and head. The markers include active, passive, magnetic and ultrasonic type markers. The display system further comprises one or more projector(s) and screen(s) surrounding the operational field of view, which project the virtual human combatant and provide an immersive display of the virtual environment. The processing unit is provided with inbuilt software. The software receives, stores and processes the data received from the motion tracking device, determines the current posture of the trainee and replicates the posture onto the virtual human combatant (avatar) in the virtual environment. The dummy weapon is provided with a trigger sensing unit for the trainee to shoot one or more virtual human combatants. When the trainee interacts with the dummy weapon, the virtual bullets from the weapon interact with the virtual 3D environment. The combat stations are networked, permitting the virtual human combatant to be mapped to one or more trainees in different combat stations. The trainees are thus co-located or network connected.
The immersive environment with multiple display systems provided in each combat station is connected through a network where different trainees can interact with each other. Thus the system facilitates interaction between the trainee and the virtual human combatant, or between trainee and trainee, in a virtual networked environment. The motion tracking technology is based on optical or infrared cameras, magnetic sensors or any other means for capturing both 3 Degrees of Freedom and 6 Degrees of Freedom extraction of body posture. The motion tracking device uses optical tracking based posture determination, captures both 3 Degrees of Freedom and 6 Degrees of Freedom [DOF] motion and tracks the movement of the trainee in the immersive virtual environment. Using the markers, the current pose or movements of the trainee in the real world are tracked using digital image processing techniques and used as a reference for the digital model (virtual human combatant) in the virtual environment.
In marker-based tracking, passive markers are placed in a triangular pattern on the trainee's head, which is tracked by the set of motion tracking devices to obtain the position and orientation of the markers. The captured images from multiple passive motion tracking devices are sent to a computer by means of a standard communication unit. The marker set is extracted by the processing unit/computer. The position and orientation of the trainee's body and weapon, extracted from the digital images on the processing unit, are mapped to the virtual human combatant. The virtual human combatant is then projected onto the display systems. The projector projects a still or moving picture and a variable three dimensional target image for the background scene for view by the trainee. The detection of the marker pattern is used for tracking position and orientation. Orientation information is required since trainees can peek and bend, and the same has to be mapped accurately to the virtual human combatant. Thus, the system determines the current posture of the trainee in the real world and replicates it onto the virtual combatant in the virtual environment for other trainees to view and fire upon. The movement of the other virtual human combatant is controlled by another trainee or is software simulated.
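One standard way a triangular head-marker pattern can yield both position and orientation is to take the triangle's centroid as the position and its surface normal as the facing direction, so that peeking or bending changes the recovered orientation. The patent does not specify the mathematics; the sketch below is an assumed illustration of that approach.

```python
# Hypothetical sketch: recovering a 6-DOF pose (position + facing
# direction) from three head markers arranged in a triangle. Marker
# coordinates are assumed to be in a common world frame.

def triangle_pose(p1, p2, p3):
    """Return (centroid, unit normal) of the marker triangle.
    The centroid gives head position; the normal gives orientation."""
    # Position: average of the three marker positions.
    centroid = tuple((a + b + c) / 3.0 for a, b, c in zip(p1, p2, p3))
    # Two edge vectors spanning the triangle's plane.
    u = tuple(b - a for a, b in zip(p1, p2))
    v = tuple(c - a for a, c in zip(p1, p3))
    # Orientation: cross product of the edges, normalised to unit length.
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    length = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
    normal = tuple(c / length for c in n)
    return centroid, normal

# Markers lying flat in the z = 0 plane face straight along +z:
centroid, normal = triangle_pose((0, 0, 0), (1, 0, 0), (0, 1, 0))
```

With two or more calibrated cameras, the marker coordinates themselves would first be triangulated from the captured images before this pose computation runs.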
The system of the present invention provides the background scene and the required environmental conditions for combat training, calculates combative motions between two virtual human combatants/trainees and compares them with respect to each other, and anticipates and predicts the movement of the trainee, changing the virtual human combatant accordingly. The software stores and retrieves the position data for further processing and generates sound based on the interaction.
The virtual 3D environment is generated by 3D graphics rather than a 2D video generator, and relates not to a terminal application but to a 3D immersive stereoscopic virtual environment with multiple display systems and a network facility. The 3D immersive stereoscopic virtual environment apparatus can be used by both a first and a second trainee and is scalable to multiple trainees. The advantage of this setup is that the trainees (e.g. soldiers) can have combat training against each other in the virtual environment. Stereo goggles are provided for viewing the virtual environment in true 3D, in addition to a body impact generator to simulate bullet hits and a perspective-corrected 3D image to give the trainee a true-scale perspective view. The training can be extended across multiple combatants (opponents) engaged against each other and group-training exercises spread across different geographical locations. An instructor sets up the virtual environment. Activities are performed in a 3D immersive stereoscopic virtual environment, not on any particular simulator.
A method of providing virtual combat training using the said system is described in detail. In a weapon training simulator, trainees move inside the immersive virtual environment. The trainee's movement in the real world is tracked using both 3 DOF and 6 DOF motion tracking devices with the help of markers. Essentially, markers are placed on the head of the trainee for tracking the position and orientation of the head. Markers are also placed on the weapon which the trainee holds in his/her hand for tracking the weapon's position and orientation along with the point of aim. Markers are further placed on the trainee's shoulders, arms, torso, limbs and ankles (passive optical markers) for full body tracking. Using these markers, the motion tracking device tracks and determines the current posture of the trainee in the real world, which is used as a reference for the digital model in the virtual environment for other trainees to view and fire upon. The method is a combination of apparatus tracking and body tracking. The method uses optical tracking based posture determination consisting of:
1) A motion tracking device for tracking the multiple body portions of the trainee.
2) Markers placed on multiple body portions of the trainee, which are tracked by the motion tracking device.
3) Direct mapping of the multiple body portions of the trainee to the virtual human combatant.
The inbuilt system software estimates the current posture of the trainee and replicates the posture in the virtual environment. A general anatomical digital model of the trainee is created in the software. The trainee's position is tracked using the motion tracking device, which records the trainee's movement based on the locations of the markers and translates/maps the positions and movements onto a digital model in the virtual environment using the inbuilt software. Based on this data, an approximation of the trainee's pose is generated in the virtual world. The greater the number of markers on the body of the trainee, the more realistic and accurate the trainee's digital model movement in the virtual environment.
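The marker-to-model mapping described above can be sketched as follows. The joint names and model structure are assumptions for illustration; the patent does not specify how the anatomical digital model is represented.

```python
# Illustrative sketch of driving a general anatomical digital model from
# tracked marker positions. Joints with a tracked marker are updated
# directly; the rest keep their previous (approximated) pose, which is
# why more markers yield a more accurate avatar.

SKELETON_JOINTS = ["head", "left_shoulder", "right_shoulder",
                   "torso", "left_ankle", "right_ankle"]

def update_digital_model(model, tracked_markers):
    """Translate each tracked marker position onto the corresponding
    joint of the avatar's digital model."""
    for joint in SKELETON_JOINTS:
        if joint in tracked_markers:
            model[joint] = tracked_markers[joint]
    return model

# Start from a neutral model, then apply one frame of tracking data in
# which only the head and torso markers were detected:
model = {joint: (0.0, 0.0, 0.0) for joint in SKELETON_JOINTS}
model = update_digital_model(model, {"head": (0.0, 1.7, 0.0),
                                     "torso": (0.0, 1.2, 0.0)})
```

A fuller marker set (shoulders, arms, limbs, ankles) would drive every joint directly each frame instead of leaving untracked joints at their prior pose.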
As the virtual human combatant can move in the simulated environment, it provides a real challenge for a trainee to actually aim and fire at another individual who is represented by his/her 3D virtual image on the screen and can likewise move in the virtual environment. The movement of the other virtual human combatant is controlled by another trainee. As the trainee can see the other individual's 3D image on his/her screen, this provides an extremely realistic, lifelike scenario during virtual combat training operations. This is most beneficial during tactical training. The training is conducted between two or multiple opponents in the virtual environment.
The advantage of the present invention is that it uses a straightforward method of tracking the body portion using motion tracking devices, markers and a processing unit to determine the posture of the trainee's body and mapping it to the virtual human combatant. Multiple motion tracking devices improve the accuracy of position and orientation sensing and improve the tracking coverage volume.
WE CLAIM:
1. A system for providing virtual combat training using a training simulator, comprising:
One or more combat stations;
One or more motion tracking devices for tracking multiple body movements of a trainee;
One or more dummy weapons;
One or more processing units; and
One or more display systems,
wherein the motion tracking device further comprises one or more passive markers positioned on the body and the dummy weapon,
wherein the motion tracking device captures information on position and orientation of the trainee's body and dummy weapon,
wherein the dummy weapon is provided with a trigger sensing unit for the trainee to shoot one or more virtual human combatants,
wherein the processing unit determines the position and orientation information and maps the same to the virtual human combatant in real time,
wherein the display system, comprising one or more projectors and a screen, projects the virtual human combatant and provides an immersive display of the 3D virtual environment, and
wherein the combat stations are networked permitting the virtual human combatant to be mapped to one or more trainees in different combat stations.
2. A system as claimed in claim 1, wherein the motion tracking device captures both 3 Degrees of Freedom and 6 Degrees of Freedom motion and tracks the movement of the trainee in the immersive virtual environment.
3. A system as claimed in claim 2, wherein the motion tracking device uses optical tracking based posture determination.
4. A system as claimed in claim 1, wherein the projector screen provides both a moving background scene and a variable three dimensional target image.
5. A system as claimed in claim 1, wherein the processing unit is provided with inbuilt software for receiving, storing, processing the information from the motion tracking device, determining current posture of the trainee and replicating the posture onto the virtual human combatant in the virtual environment.
6. A system as claimed in any one of claims 1 to 5, wherein the motion tracking technology is based on optical or infrared cameras or magnetic sensors or any other means for capturing both 3 Degrees of Freedom and 6 Degrees of Freedom motion.
7. A system as claimed in any one of claims 1 to 6, wherein the system facilitates interaction between the trainee and the virtual human combatant or between the trainee and the trainee in a virtual networked environment.
8. A system as claimed in any one of claims 1 to 7, wherein the system provides background scene and required environmental conditions for the combat training.
9. A system as claimed in any one of claims 1 to 8, wherein the system anticipates and predicts the movement of the trainee and changes the virtual human combatant accordingly.
10. A system as claimed in any one of claims 1 to 9, wherein the system calculates combative motions between two virtual human combatants.