Abstract: The present invention relates to a system for providing training in the fire under cover process for weapon simulators in a virtual environment. Further, the present invention provides a system for generating perspective corrected imagery in a virtual combat training station, comprising a motion tracking device for tracking the head movement of a trainee, a processing unit and one or more display systems. The motion tracking device captures information on the position and orientation of the trainee's head. The processing unit is provided with inbuilt software for receiving, storing and processing the information received from the motion tracking device and providing corrected imagery according to the calculated perspective of the trainee for use in virtual combat training. Figure 1.
NAME OF THE APPLICANT: VIRTUAL LOGIC SYSTEMS PRIVATE LTD PROVISIONAL SPECIFICATION: 4384/CHE/2011 dated December 14, 2011
SYSTEM AND METHOD OF GENERATING PERSPECTIVE CORRECTED IMAGERY FOR USE IN VIRTUAL COMBAT TRAINING
FIELD OF INVENTION
The present invention relates to a system for providing training in a virtual environment during weapons training, and particularly to a system for providing training in the fire under cover process on weapon simulators. Further, the present invention relates to a system and method of generating perspective corrected imagery for use in virtual combat training. The present invention helps to provide an extremely realistic, lifelike scenario during virtual combat training operations. This is most beneficial during tactical training of trainees.
BACKGROUND ART
Virtual combat training is a simulated combat environment designed to train military, homeland security and law enforcement personnel with specific skills needed in confronting and resolving potential and actual conflicts. Virtual combat training makes use of computers and other technological devices to create varied virtual environments similar to the real combat world, under which participating personnel can be trained. In the past, many different types of target practice and aiming devices have been suggested that use light to simulate the firing of a gun. Such devices help train and instruct shooters by enabling them to practice aiming at a target either indoors or on an open range without actually making use of real projectiles (e.g. shot charges or bullets). The position of a projectile can be simulated by a computer and compared with the target position in order to determine whether the aim is correct.
It is well known in the art that laser type weapon fire simulation systems with visual score indicator means and holographic means are capable of producing a three dimensional virtual image of a target.
Also known in the prior art is a three dimensional imaging technique for a more realistic training experience for a person undergoing firearms training which creates the illusion of depth in an image or in a video. The person undergoing the training views the image with a 3D lens and acquires the target with an actual defensive weapon.
US6296486 describes a simulator able to simulate imaginary firings. An instructor chooses a virtual scenario relating to a firing field that is displayed on the display device, the type of missile and the firing conditions. The images are generated by an image processing device.
US6500008 relates to an augmented reality-based firefighter training system. The system includes hardware and software components to create realistic-looking fire, smoke and extinguishing graphics for an interactive training experience for firefighters. However, the said training system provides a mixed view of both the real and the virtual world.
US6765726 relates to a sport simulator having sensing electronics for determining in real time the player's three dimensional positional changes and computer controlled sport specific cuing that prompts sport specific responses from the player.
US20030227453 describes a system and software for creating animated 3-D scenarios from human position and path data. Immersive and non-immersive virtual reality technologies are used for the training.
US20050219240 describes a hands-on simulator system having a display that can project horizontal perspective images into the open space and a device that allows the user to manipulate the images with hands or hand-held tools.
US20080158256 relates to a system and method for fusing a plurality of sensor data to provide a perspective view image.
US20110009241 relates to a virtual locomotion controller user interface and system that combines data from sensor devices to allow users to control their representations in a virtual world.
Accordingly, an advantage would be obtained if the trainee were provided with an extremely realistic, lifelike scenario during virtual combat training operations by the use of motion tracking devices to generate perspective corrected imagery for training to fire under cover in simulators. This is most beneficial during tactical training of the trainees. Various other features of the system and method of the present invention will become obvious to those skilled in the art upon reading the disclosure set forth hereinafter.
OBJECTS OF INVENTION
Thus, the primary object of the present invention is directed to a system for generating perspective corrected imagery for use in virtual combat training.
Another object of the present invention is directed to a system for providing training during fire under cover process in simulators.
A further object of the present invention is directed to a method of generating perspective corrected imagery for use in virtual combat training during fire under cover process.
It is another object of the present invention, wherein trainee movements are captured using a motion tracking device.
It is another object of the present invention, wherein the motion tracking device captures both 3 Degrees of Freedom and 6 Degrees of Freedom [DOF] motion and tracks the movement of the trainee in the immersive virtual world.
It is another object of the present invention, wherein the trainee's movement and current pose in the real world are tracked using markers.
It is a further object of the present invention, wherein the trainee's movement in the real world is tracked using a digital image processing technique: a digital image of the trainee is captured, which is then used as a reference for the digital model for estimating the pose of the trainee inside the virtual world.
It is another object of the present invention, wherein the system is implemented with software for receiving, storing, processing the data received from the motion tracking device and providing corrected imagery for use in virtual combat training.
It is another object of the present invention, wherein the visual surroundings change based on the distance traversed and the position of the trainee's head.
It is another object of the present invention, wherein the perspective corrected imagery provides position and viewpoint based corrections of the visual surroundings to the trainee to determine the best cover available in the scene for protection.
SUMMARY OF INVENTION
Thus, according to the basic aspect of the present invention, there is provided a system for generating perspective corrected imagery in a virtual combat training station during the fire under cover process comprising:
Motion tracking device for tracking head movement of a trainee;
Processing unit; and
One or more display systems,
wherein the motion tracking device further comprises one or more passive markers positioned on the head or the head gear worn by the trainee,
wherein the motion tracking device captures information on position and orientation of the trainee's head,
wherein the processing unit determines the position and orientation information,
estimates the perspective of the trainee and generates the required imagery according to the calculated perspective of the trainee, and
wherein the display system, comprising one or more projectors and screens, visualizes the perspective corrected imagery and provides an immersive display of the 3D virtual environment.
It is another aspect of the present invention, wherein the motion tracking device captures both 3 Degrees of Freedom and 6 Degrees of Freedom motion and tracks the movement of the trainee in the immersive 3D virtual environment.
It is another aspect of the present invention, wherein the display system provides both a moving background scene and a variable three dimensional target image.
It is another aspect of the present invention, wherein the perspective of the trainee is dynamically updated for each set of imagery generated from the 3D virtual environment.
It is another aspect of the present invention, wherein the processing unit is provided with inbuilt software for receiving, storing, processing the information received from the motion tracking device and providing corrected imagery for use in virtual combat training.
It is another aspect of the present invention, wherein the motion tracking technology is based on optical or infrared cameras or magnetic sensors or any other means for capturing both 3 Degrees of Freedom and 6 Degrees of Freedom motion.
It is another aspect of the present invention, wherein the system provides corrections to the perspective based on the position and orientation information of the trainee's head.
It is another aspect of the present invention, wherein the system renders the perspective corrected imagery depending on the position of the trainee.
It is a further aspect of the present invention, wherein the system updates the perspective corrected imagery depending on the position of the trainee.
It is another aspect of the present invention, wherein the system updates the position of the trainee depending on the scenario requirements.
It is another aspect of the present invention, wherein the system generates the perspectives of variable three dimensional target objects in virtual environment.
It is another aspect of the present invention, wherein the visual surroundings change based on the distance traversed and the position of the trainee's head.
In another aspect of the present invention, there is provided a method of generating perspective corrected imagery in a virtual combat training station during the fire under cover process using the said system, the method comprising:
Positioning passive markers on the trainee's head or headgear;
Tracking the head or head gear portion of the trainee in the real world using a motion tracking device with the help of the markers;
Capturing information on position and orientation of the trainee's head or headgear;
Sending the captured information to a computer by communication means;
Determining the position and orientation information, estimating the perspective of the trainee and generating the required imagery according to the calculated perspective of the trainee through inbuilt software;
Projecting the perspective corrected imagery on the display system; and
Providing immersive display of 3D virtual environment during fire under cover process training,
wherein the perspective of the trainee is dynamically computed for each refresh rate, and wherein the perspective corrected imagery is updated depending on the position of the trainee.
BRIEF DESCRIPTION OF THE DRAWING
Figure 1: Illustrates the system according to the present invention.
DETAILED DESCRIPTION OF THE INVENTION WITH REFERENCE TO THE ACCOMPANYING DRAWING
The present invention is directed to a system and method for generating perspective corrected imagery for use in virtual combat training during the fire under cover process in simulators. Referring to Figure 1, the system for generating perspective corrected imagery using a training simulator comprises a motion tracking device, a processing unit and one or more display systems. The motion tracking device further comprises one or more passive markers positioned on the head or the head gear worn by the trainee. Markers are devices, or combinations of devices, which are placed on various parts of the trainee's body and head. The markers may be powered or non-powered. Using these markers, the current pose of the trainee in the real world is tracked. The motion tracking device captures information on the position and orientation of the trainee's head.
The tracking information is used to decode the trainee's eye position, determine where the trainee is looking and provide the required perspective. The display system further comprises one or more projector(s) and screen(s). The processing unit is provided with inbuilt software. The processing unit determines the position and orientation information, estimates the perspective of the trainee and generates the required imagery according to the calculated perspective of the trainee. The perspective view is used in the combat scenario as follows:
a. A trainee hiding behind a rock can peek out to have a look at his opponent while taking cover. The software calculates the required perspective for the trainee.
b. A classic example is looking through a window. A trainee looking through the window gets his perspective view depending on the position and orientation of his head.
This is essential for combat training and to train the soldiers to shoot under cover.
The processing unit with inbuilt software receives, stores and processes the data received from the motion tracking device and provides corrected imagery for use in the virtual combat training. The display system helps to visualize the perspective corrected imagery and provides an immersive display of the three-dimensional (3D) virtual environment. The projector projects a still picture, or a moving background scene and a variable 3D target image for the background scene, for viewing by the trainee.
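The head-pose information that the processing unit receives from the tracker can be sketched as a simple data record with a derived gaze direction. This is purely illustrative: the specification does not fix a data format, so the `HeadPose` layout, units and the `gaze_direction` helper below are assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class HeadPose:
    """One 6-DOF sample from the motion tracker (illustrative layout):
    position in metres, orientation as yaw/pitch/roll in radians."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

def gaze_direction(pose: HeadPose) -> tuple:
    """Unit vector along the trainee's line of sight, derived from yaw
    (heading) and pitch (elevation); roll does not change the direction."""
    cp = math.cos(pose.pitch)
    return (cp * math.sin(pose.yaw),
            math.sin(pose.pitch),
            -cp * math.cos(pose.yaw))

# A trainee standing still and looking straight ahead gazes down -Z:
assert gaze_direction(HeadPose(0.0, 1.7, 0.0, 0.0, 0.0, 0.0)) == (0.0, 0.0, -1.0)
```

The processing unit would consume a stream of such samples and, as described above, estimate the trainee's perspective from them on every update.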
The motion tracking technology is based on optical or infrared cameras, magnetic sensors or any other means for capturing both 3 Degrees of Freedom and 6 Degrees of Freedom extraction of body posture. The motion tracking device captures both 3 Degrees of Freedom and 6 Degrees of Freedom [DOF] motion and tracks the movement of the trainee in the immersive virtual environment. The trainee's movement in the real world is tracked using a digital image processing technique: a digital image of the trainee is captured, which is then used as a reference for the digital model for estimating the pose of the trainee inside the virtual environment. The motion tracking device generates perspective corrected imagery for training to fire under cover in simulators during virtual combat training operations. The system provides corrections to the perspective based on the position and orientation information of the trainee's head. The visual surroundings also change based on the distance traversed and the position of the trainee's head. In other words, the perspective corrected imagery provides position- and viewpoint-based corrections of the visual surroundings to the trainee to determine the best cover available in the scene for protection. The system of the present invention renders the perspective corrected imagery depending on the position of the trainee. That is, the perspective of the trainee is dynamically updated and computed for each set of imagery generated from the 3D virtual environment, i.e. for each refresh rate. The position of the user is also updated depending on scenario requirements. The system also generates the perspectives of variable three dimensional target objects in the virtual environment. A database stores the tracking values and is used to retrieve them. The system of the present invention can be extended from battle training to various other applications which involve a simulated environment.
A method of generating perspective corrected imagery in virtual combat training station during fire under cover process for 3D virtual environment using the said system is described in detail below.
In a weapon simulator, trainees move inside the immersive virtual environment. In the real world, as and when a trainee moves around, the visual surroundings keep changing based on the distance traversed and the position of the trainee's head. The trainee's movement in the real world is tracked using both 3-DOF and 6-DOF motion tracking devices with the help of markers.
Essentially, markers are placed on the head or headgear of the trainee for tracking the position and orientation of the trainee's head. In addition, markers are placed on the weapon which the trainee holds in his/her hand for tracking the weapon position and orientation along with the point of aim. Markers are also placed on the trainee's shoulders, arms, torso and limbs for full body tracking. Using these markers, the current pose of the trainee in the real world is tracked by the motion tracking device.
Similarly, as the trainee moves in the virtual environment, the perspective corrected imagery, which provides position- and viewpoint-based corrections of the visual surroundings, enables the trainee to determine the best cover available in the scene to protect himself/herself from enemy fire, hide behind the cover and fire at the enemy. As the trainee hides behind a cover in the virtual world, his line of sight changes according to the generated imagery. The trainee then determines the most suitable cover in the area based on his distance from it, gets behind the cover and continues firing on the enemy using the available line of sight.
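The cover behaviour described above ultimately reduces to a line-of-sight test between the tracked eye position and a target relative to the cover's edge. The following 2-D side-view sketch is purely illustrative (the specification prescribes no particular geometry test, and the function name and coordinate convention are assumptions):

```python
def target_visible_over_cover(eye, cover_top, target):
    """Illustrative line-of-sight test in a 2-D side view (x = horizontal
    distance in metres, y = height in metres): the target is visible if and
    only if the straight sight line from the eye to the target passes above
    the top edge of the cover. Assumes eye.x < cover_top.x < target.x."""
    # Fraction of the way from the eye to the target at the cover's x position:
    t = (cover_top[0] - eye[0]) / (target[0] - eye[0])
    # Height of the sight line where it crosses the cover:
    sight_height = eye[1] + t * (target[1] - eye[1])
    return sight_height > cover_top[1]

# Crouched (eye at 1.2 m) behind a 1.5 m wall 1 m away, enemy 10 m away:
assert not target_visible_over_cover((0.0, 1.2), (1.0, 1.5), (10.0, 1.3))
# Standing up (eye at 1.7 m) raises the sight line over the wall:
assert target_visible_over_cover((0.0, 1.7), (1.0, 1.5), (10.0, 1.3))
```

Because the test is symmetric, the same geometry also tells the trainee when he is exposed to the enemy's fire, which is what makes peeking out from cover a controlled risk in the simulation.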
The motion tracking device (motion capturing device) records and translates the trainee's movement in the real world onto a digital model using optical, inertial, mechanical or electromagnetic motion capture technologies. Information on the position and orientation of the trainee's head or headgear is captured and sent, by communication means, to a processing unit having inbuilt software. The new position coordinates sent to the inbuilt software are used to generate a view frustum in the virtual world, based on calculations done by the software corresponding to the change in the physical position of the trainee. In other words, the inbuilt software determines the position and orientation information, estimates the perspective of the trainee and generates the required imagery according to the calculated perspective of the trainee.
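View-frustum generation of the kind described above is commonly implemented as an asymmetric (off-axis) perspective projection: the physical screen stays fixed while the tracked eye moves, so the frustum extents at the near plane must be recomputed from the eye position. The sketch below assumes an axis-aligned screen in the plane z = 0 and illustrative parameter names; it is one standard formulation, not necessarily the one used by the invention.

```python
def off_axis_frustum(eye, screen_lo, screen_hi, near):
    """Compute asymmetric frustum extents (left, right, bottom, top) at the
    near plane for a tracked eye in front of an axis-aligned screen.

    eye       -- (x, y, z) tracked eye position; the screen lies in z = 0
    screen_lo -- (x, y) of the screen's lower-left corner
    screen_hi -- (x, y) of the screen's upper-right corner
    near      -- near-plane distance (> 0)
    """
    ex, ey, ez = eye        # ez is the eye's distance from the screen plane
    scale = near / ez       # similar triangles: project screen edges onto the near plane
    left   = (screen_lo[0] - ex) * scale
    right  = (screen_hi[0] - ex) * scale
    bottom = (screen_lo[1] - ey) * scale
    top    = (screen_hi[1] - ey) * scale
    return left, right, bottom, top

# Eye centred on a 2 m x 1.5 m screen, 2 m away: a symmetric frustum.
assert off_axis_frustum((0.0, 0.0, 2.0), (-1.0, -0.75), (1.0, 0.75), 1.0) == \
    (-0.5, 0.5, -0.375, 0.375)
# Head moved 0.5 m to the right: the frustum skews, correcting the perspective.
assert off_axis_frustum((0.5, 0.0, 2.0), (-1.0, -0.75), (1.0, 0.75), 1.0) == \
    (-0.75, 0.25, -0.375, 0.375)
```

These four extents are exactly the parameters an asymmetric projection matrix takes (e.g. OpenGL's `glFrustum`), which is why head motion can be turned directly into perspective-corrected imagery.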
The perspective corrected imagery is projected on the display system. It is then mapped onto the real world projection screen. View frustums are recomputed dynamically by the software based on the change in the position of the trainees, which is captured by the motion tracking device.
Whenever the trainee changes his/her position inside the simulator, the view frustum is recomputed and the corresponding imagery is generated taking the trainee's new position into consideration. The generated imagery matches the true perspective of the trainee as seen in the real world. The perspective of the trainee is dynamically computed for each refresh rate. The visual surroundings keep changing based on the distance traversed and the position of the trainee's head. An immersive display of the 3D virtual environment is provided during the fire under cover process training. Stereo goggles may be worn by the trainee to experience the real 3D view of the virtual environment.
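The per-refresh cycle described above (read pose, recompute frustum, render) can be sketched as a simple loop. The `FakeTracker` stand-in and the toy one-dimensional frustum function are illustrative assumptions standing in for real tracking hardware and a real renderer:

```python
class FakeTracker:
    """Stand-in tracker that replays a fixed list of head positions,
    one per display refresh."""
    def __init__(self, poses):
        self.poses = list(poses)

    def next_pose(self):
        return self.poses.pop(0) if self.poses else None

def run_refresh_loop(tracker, compute_frustum, draw):
    """For each refresh: read the latest head pose, recompute the view
    frustum for it, and draw the perspective-corrected frame."""
    frames = 0
    pose = tracker.next_pose()
    while pose is not None:
        frustum = compute_frustum(pose)   # recomputed on every refresh
        draw(frustum)                     # imagery matches the new perspective
        frames += 1
        pose = tracker.next_pose()
    return frames

drawn = []
n = run_refresh_loop(
    FakeTracker([(0.0, 1.7, 2.0), (0.2, 1.7, 2.0)]),  # head moves 0.2 m right
    lambda p: (-p[0] - 1.0, -p[0] + 1.0),             # toy 1-D "frustum"
    drawn.append,
)
assert n == 2 and drawn[0] != drawn[1]  # imagery updated as the head moved
```

In a multi-screen simulator the inner step would simply repeat per projection screen, each with its own screen geometry, as the following paragraph notes.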
The same principle is applied to all projection screens in a multiple-screen environment. This provides an extremely realistic, lifelike scenario during virtual combat training operations. This is most beneficial during tactical training of the trainees.
WE CLAIM:
1. A system for generating perspective corrected imagery in a virtual combat training station during the fire under cover process, comprising:
Motion tracking device for tracking head movement of a trainee;
Processing unit; and
One or more display systems,
wherein the motion tracking device further comprises one or more passive markers positioned on the head or the head gear worn by the trainee,
wherein the motion tracking device captures information on position and orientation of the trainee's head,
wherein the processing unit determines the position and orientation information, estimates the perspective of the trainee and generates the required imagery according to the calculated perspective of the trainee, and
wherein the display system, comprising one or more projectors and screens, visualizes the perspective corrected imagery and provides an immersive display of the 3D virtual environment.
2. A system as claimed in claim 1, wherein the motion tracking device captures both 3 Degrees of Freedom and 6 Degrees of Freedom motion and tracks the movement of the trainee in the immersive 3D virtual environment.
3. A system as claimed in claim 1, wherein the display system provides both a moving background scene and a variable three dimensional target image.
4. A system as claimed in claim 1, wherein the perspective of the trainee is dynamically updated for each set of imagery generated from the 3D virtual environment.
5. A system as claimed in claim 1, wherein the processing unit is provided with inbuilt software for receiving, storing, processing the information received from the motion tracking device and providing corrected imagery for use in virtual combat training.
6. A system as claimed in any one of claims 1 to 5, wherein the motion tracking technology is based on optical or infrared cameras or magnetic sensors or any other means for capturing both 3 Degrees of Freedom and 6 Degrees of Freedom motion.
7. A system as claimed in any one of claims 1 to 6, wherein the system provides corrections to the perspective based on position and orientation information of the trainee's head.
8. A system as claimed in any one of claims 1 to 7, wherein the system renders the perspective corrected imagery depending on the position of the trainee.
9. A system as claimed in any one of claims 1 to 8, wherein the system updates the perspective corrected imagery depending on the position of the trainee.
10. A system as claimed in any one of claims 1 to 9, wherein the system updates the position of the trainee depending on the scenario requirements.
11. A system as claimed in any one of claims 1 to 10, wherein the system generates the perspectives of variable three dimensional target objects in the virtual environment.
12. A system as claimed in any one of claims 1 to 11, wherein the visual surroundings change based on the distance traversed and the position of the trainee's head.
13. A method of generating perspective corrected imagery in a virtual combat training station during the fire under cover process using the system as claimed in any one of claims 1 to 12, the method comprising:
Positioning of passive markers on the trainee's head or headgear;
Tracking the head or head gear portion of the trainee in the real world using a motion tracking device with the help of markers;
Capturing information on position and orientation of the trainee's head or headgear;
Sending the captured information to a computer by communication means;
Determining the position and orientation information, estimating the perspective of the trainee and generating the required imagery according to the calculated perspective of the trainee through inbuilt software;
Projecting the perspective corrected imagery on the display system; and
Providing immersive display of 3D virtual environment during fire under cover process training,
wherein the perspective of the trainee is dynamically computed for each refresh rate, and
wherein the perspective corrected imagery is updated depending on the position of the trainee.