
System And Method For Engaging Targets Under All Weather Conditions Using Head Mounted Device

Abstract: The present invention relates to a system and method for providing virtual sight as an aiming aid to a weapon to engage targets under all weather conditions using a mixed reality head mounted device (HMD). The system and method provide a peripheral target observation device, like radar/lidar, that transmits tactical data to a target data receiver (TDR). Here, the TDR processes the received tactical data, and the trajectory of the selected target is transformed from the TDR's frame of reference to that of the HMD. The operator of the weapon system, who is also the wearer of the HMD, is provided with situational cues along with a virtual target over the HMD display that assists the operator in accurate aiming and locking of the intended target. [Figure 2]


Patent Information

Application #
Filing Date
04 June 2021
Publication Number
17/2024
Publication Type
INA
Invention Field
COMMUNICATION
Status
Parent Application

Applicants

DIMENSION NXG PRIVATE LIMITED
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India

Inventors

1. Abhishek Tomar
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
2. Pankaj Raut
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
3. Abhijit Patil
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
4. Yukti Suri
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
5. Shantanu Barai
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
6. Prathamesh Tugaonkar
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India

Specification

FIELD OF THE INVENTION

Embodiments of the present invention relate to a mixed reality based head mounted device, and more particularly to a system and a method for using an Augmented Reality/Mixed Reality headset or glasses while aiming at a target using handheld firearms or different surface to air missile systems or cannons/autocannons under all weather conditions.
BACKGROUND OF THE INVENTION

Strong armed forces are one of the important aspects for the development and growth of any country. The armed forces participate in peacekeeping operations, military exercises and humanitarian relief missions. They also carry out more traditional tasks such as securing the borders. Armed forces use weapons for the benefit of civilians, including law enforcement officers and members of the Army, Navy, Air Force, and Marines. In order to use a weapon effectively, a person must be able to accurately aim at a target. To accurately engage targets, the strike of a bullet must coincide with the aiming point (Point of Aim/Point of Impact) on the target. Over the years, various techniques and devices have been developed to help a person accurately aim a weapon.
One common approach is to mount a sight on the weapon to view the intended target. The sight must be zeroed before being used on the weapon. True zeroing of a firearm, such as a rifle, is the process of aligning and calibrating the optical sight on the weapon so the user can accurately aim at the target from a set distance. It is based on the rationale that any individual differences in sight alignment will be eliminated by correcting the sight for the individual firing the weapon. Precisely, the user must calibrate the user's optical sights to their weapons to eliminate all eccentricities and ensure the user will hit the intended target. Typically, this is accomplished by adjusting the sights on the weapon such that when a round is fired from the weapon, it hits the aiming point within the margin of error of the weapon. This zeroing process is one of the most critical elements of accurate target engagement. In order to aim accurately at a target, it is imperative for the sighting mechanism to be properly installed and adjusted on the gun. Having incorrect sight alignment leads to inaccurate firings that may eventually have negative training impacts besides incurring huge time, cost and ammunition loss.
One of the existing challenges in sighting in or zeroing a firearm is parallax correction in the sighting system. The goal is reducing the parallax and adjusting the sights so the projectile (e.g., bullet or shell) may be placed at a predictable impact position within the sight picture. The principle is to shift the line of aim so that it intersects the parabolic projectile trajectory at a designated point of reference, known as a zero, so the gun will repeatably hit where it aims at the distance of that "zero" point.
Conventional methods of zeroing a firearm are often expensive and time consuming, where multiple rounds of live ammunition may be fired to achieve the desired accuracy, besides requiring detailed instructions from skilled professionals. This indeed is a time and cost intensive training, especially when a large group of armed personnel or defence forces need to be trained. Additionally, there is always a burden of employing a skilled professional adept at handling such sophisticated weapon systems at all times. More importantly, a real challenge persists in situations of unfavourable weather conditions where visibility of the aerial target is impaired for variable reasons - fog, smog, cloud, rain, snow etc. Locating and engaging threatening targets under these disadvantageous weather conditions is critical and plays an instrumental role in factoring mission success.
In the background of the foregoing limitations, there exists a need in the art for a system and method that can effectively align the weapon system with the line of sight of the operator and engage targets with acute precision in real time under all weather conditions. Preferably, some kind of virtual aid in the form of cues for locating and actual engagement of the intended aerial target will be extremely vital for operators in dynamic situations like that of a battlefield. Therefore, an effective and viable system and method for displaying situational and directional pointers for target sighting, tracking, locking and engagement is desired that does not suffer from the above-mentioned deficiencies.
OBJECT OF THE INVENTION
An object of the present invention is to provide a virtual sight as an aiming aid while using a weaponry system that can assist in engaging targets in real time under all weather conditions.
Another object of the present invention is to provide a head mounted display that assists the operator by providing a virtual sight along with situational and directional cues for aiming and locking the intended aerial target.
An object of the present invention is to provide a system to provide virtual aid in the form of cues for finding the aerial target and aiding in actual engagement with the target by displaying virtual lasers and virtual crosshair pointers for target sighting, tracking, locking and engagement.
Another object of the present invention is to reduce aiming errors and enhance shooting efficacy, as visual and audio aid is provided to the operator while using his weaponry system to aim at the target.
In another object of the present invention, a significant reduction in training time and ammunition cost is achieved, as the operator of the weaponry system is provisioned with visual/audio cues for locking the target.
In yet another embodiment of the present invention, the mixed reality based head mounted display facilitates depiction of the entire surveillance picture along with enhanced analytics to the operator for right selection and neutralization of the probable aerial targets.
SUMMARY OF THE INVENTION
The present invention may satisfy one or more of the above-mentioned desirable features. Other features and/or advantages may become apparent from the description which follows.
In one aspect of the present disclosure, a system for tracking and locking one or more targets, characterized in utilizing a wearable mixed reality based head mounted display (HMD), is disclosed. Accordingly, the system comprises a peripheral target observation device configured to obtain tactical data of the one or more targets; and a target data receiver (TDR) configured to receive and process the tactical data of the one or more targets received from the peripheral target observation device, and select at least one of the one or more targets based on a plurality of predetermined factors.
The system further comprises a computing module communicatively coupled with the TDR, configured to receive and process trajectory data of the selected target and determine correction data from the TDR's frame of reference to the HMD's frame of reference for transmission to a processing module. The processing module is configured to perform coordinate transformation of the tactical data such that the trajectory of the selected target is transformed from the TDR's frame of reference to that of the HMD's frame of reference. In one significant aspect, the wearable mixed reality based HMD is configured to render a virtual target based on the transformed trajectory data, and overlay the virtual target on the selected target in a three dimensional space. Finally, the system includes a weapon system having a frame of reference aligned with that of the HMD's frame of reference, wherein the weapon system is manoeuvred such that a virtual reticle emanating from the weapon system and viewed from the HMD coincides with the virtual target for accurate aiming at the selected target so overlaid with the virtual target.
In another aspect of the present invention, a method for tracking and locking one or more targets is disclosed, which is characterized in utilizing a wearable mixed reality based head mounted display (HMD). The method comprises at first obtaining tactical data of the one or more targets from a peripheral target observation device and transmitting the tactical data to a target data receiver (TDR). Next, the tactical data of the one or more targets is received and processed at the TDR, and at least one of the one or more targets is selected based on a plurality of predetermined factors. Now, at a computing module communicatively coupled with the TDR, trajectory data of the selected target is received and processed, whereby correction data from the TDR's frame of reference to the HMD's frame of reference is determined for transmission to a processing module. At the processing module, coordinate transformation of the tactical data is performed such that the trajectory of the selected target is transformed from the TDR's frame of reference to that of the HMD's frame of reference.
The following step involves rendering, over the wearable mixed reality based HMD, a virtual target based on the transformed trajectory data and overlaying the virtual target on the selected target in a three dimensional space. Finally, the method steps involve aligning the frame of reference of a weapon system with that of the HMD's frame of reference, and manoeuvring the weapon system such that a virtual reticle emanating from the weapon system 600 and viewed from the HMD coincides with the virtual target for accurate aiming at the selected target so overlaid with the virtual target.
In one of the most significant aspects of the present disclosure, the aforementioned system and method are capable of locking the targets in real time irrespective of prevailing weather conditions.
Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Neither this summary nor the following detailed description purports to define or limit the scope of the inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
The skilled artisan will understand that the drawings described below are for illustrative purposes only. The drawings are not intended to limit the scope of the present teachings in any way.
Fig. 1 illustrates a block diagram of the system providing virtual sight as an aiming aid to a weapon using a head mounted device, in accordance with examples of the present invention.
Fig. 2 illustrates offset correction of the Target Data Receiver (TDR) with respect to the Head mounted device (HMD), in accordance with examples of the present invention.
Fig. 3 displays tactical information rendered as a hollow blip over the Head mounted device (HMD), in accordance with one exemplary embodiment of the present invention.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
In the following discussion that addresses a number of embodiments and applications of the present disclosure, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments in which the disclosure may be practiced. It is to be understood that other embodiments may be utilized and changes may be made without departing from the scope of the present invention.
Various inventive features are described below that can each be used independently of one another or in combination with other features. However, any single inventive feature may not address any of the problems discussed above, or may only address one of the problems discussed above. Further, one or more of the problems discussed above may not be fully addressed by any of the features described below.
As used herein, the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise. "And" as used herein is interchangeably used with "or" unless expressly stated otherwise. Likewise, other terms such as "target or object" and "head display or wearable device or head mounted display or mixed reality based head mounted display" may be interchangeably used. All embodiments of any aspect of the invention can be used in combination, unless the context clearly dictates otherwise.
Unless the context clearly requires otherwise, throughout the description and the claims, the words 'comprise', 'comprising', and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to". Words using the singular or plural number also include the plural and singular number, respectively. Additionally, the words "herein," "wherein", "whereas", "above," and "below" and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of the application.
The description of embodiments of the disclosure is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. While the specific embodiments of, and examples for, the disclosure are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize.
Broadly, Fig. 1 schematically and graphically illustrates a system and method for tracking and locking one or more targets, characterized in utilizing a wearable mixed reality based head mounted display (HMD) that provides virtual sight as an aiming aid to a weaponry system. In one of the exemplary embodiments of the present disclosure, a system and method that provides virtual sight as an aiming aid to a weaponry system is proposed that employs an external, smart wearable head mounted display to enable the operator of the weaponry system, or wearer of the HMD, to achieve the intended purpose of accurate shot placement. Accordingly, the mixed reality based HMD is built with capabilities of providing virtual aid in the form of vital cues for locating the aerial target and aiding in actual engagement of the weapon with the target for accurate aiming and firing.
Re-referring to Fig. 1, a block diagram of the proposed system 1000 is presented, wherein the system 1000 comprises a peripheral object/target observation device 100, one or more aerial targets (to be aimed at) 50a, 50b, 50c, … 50n (singularly referred to by numeral 50), a target data receiver (TDR) 200, a smart mixed reality based head mounted device (HMD) 500 wearable by the operator, and a weapon system 600 operable by the operator who is aiming the weapon 600 towards the stationary or moving aerial target 50 for the final accurate shot.
In accordance with one exemplary embodiment, the tactical data from the external peripheral object observation device 100, such as a radar/lidar 100, is received by the Target Data Receiver (TDR) 200. At the TDR 200, the tactical data is received and processed. In one preferred embodiment, the TDR 200 is communicatively coupled with a computing module 250, where the target trajectory data is received and processed. A processing module 560, provided preferably at the operator's end, then performs coordinate transformation of the tactical data such that the trajectory of the target is transformed from the TDR's frame of reference to that of the HMD's frame of reference. In one exemplary embodiment, a complete surveillance picture and the precise position of one or more aerial targets 50 as virtual targets (as explained later) may be rendered over the HMD 500 for absolute aiming and firing. Finally, a weapon system 600 configured at the operator's end is manoeuvred such that a virtual reticle emanates from the weapon system 600 and is made viewable via the HMD 500 to the operator. This virtual reticle is eventually made to coincide with a virtual target rendered over the HMD 500 for accurate aiming and locking of the target 50.
In accordance with an explanatory embodiment, the external peripheral object observation device 100 is configured to receive a reflected wave of the irradiated radar wave from the aerial target 50 existing at the irradiation destination. In one preferred embodiment, the peripheral object observation device (radar) 100 repeatedly observes the relative position of an aerial target 50 within an enhanced operational picture and transmits the data to the Target Data Receiver (TDR) 200 over a radio communication. In one illustration, the peripheral object observation device 100 provides the coordinates of the aerial target 50 in the spherical coordinate system, in the peripheral object observation device's 100 frame of reference.
The coordinates received by the Target Data Receiver (TDR) 200 from the peripheral object observation device 100 are in addition to other encoded critical data that is later processed. In accordance with one example embodiment, at the TDR 200 at least one target 50 of the one or more targets is selected based on a plurality of factors, including, but not limited to, the priority and estimation (PSQR) of the target, the target launch direction, the target type, the speed and distance of the target, the target trajectory, lethality levels and the like.
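For illustration only, the selection logic described above might be sketched as a weighted score over the listed factors. The field names and weights below are assumptions for the sketch; the disclosure lists the factors but does not prescribe a particular scoring formula.

```python
# Minimal sketch of rule-based target selection at the TDR, assuming
# hypothetical field names and weights.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    priority: float      # PSQR-style priority estimate, 0..1
    lethality: float     # assessed lethality level, 0..1
    speed_mps: float     # target speed, metres/second
    range_m: float       # radial distance from the observation device

def threat_score(t: Track) -> float:
    # Hypothetical weighting: closer, faster, more lethal targets score higher.
    closeness = 1000.0 / max(t.range_m, 1.0)
    return 0.4 * t.priority + 0.3 * t.lethality + 0.2 * (t.speed_mps / 1000.0) + 0.1 * closeness

def select_target(tracks: list[Track]) -> Track:
    # The TDR selects at least one target with the highest combined score.
    return max(tracks, key=threat_score)
```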
Further, at the TDR 200, the tactical data is processed and decoded to extract unique identifier information of the selected target 50 assigned thereto by the peripheral object observation device 100, the radial distance and azimuthal angle of the target 50 from the peripheral object observation device 100, the perpendicular height of the target 50 from the ground plane or base, the heading angle of the target 50, the speed of the target 50, the IFF (identification of friend or foe) code of the target 50, the WCO (weapon control orders) code of the target determined/computed by the peripheral object observation device 100, and the advanced target position, orientation and relative velocity of the target 50 approaching the firing zone. In one example embodiment, the tactical data is decoded from a hexadecimal byte string to extract the critical parametric values (as stated above) with respect to the (selected) target 50.
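As a minimal sketch of such decoding, the hexadecimal byte string could be unpacked field by field. The byte layout, field order and widths below are illustrative assumptions; the actual encoding used by the peripheral object observation device 100 is not specified here.

```python
# Sketch of decoding the tactical data from a hexadecimal byte string.
import struct

def decode_tactical_data(hex_string: str) -> dict:
    raw = bytes.fromhex(hex_string)
    # Assumed layout: uint16 track id, five float32 values (radial distance m,
    # azimuth deg, height m, heading deg, speed m/s), uint8 IFF, uint8 WCO.
    track_id, rng, az, height, heading, speed, iff, wco = struct.unpack(
        ">HfffffBB", raw[:24])
    return {"track_id": track_id, "range_m": rng, "azimuth_deg": az,
            "height_m": height, "heading_deg": heading,
            "speed_mps": speed, "iff": iff, "wco": wco}

# Self-contained demo with a synthetic byte string:
sample = struct.pack(">HfffffBB", 7, 4200.0, 87.5, 950.0, 270.0, 680.0, 1, 3).hex()
print(decode_tactical_data(sample))
```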
The coordinate data of the target 50, along with other critical data received at the TDR 200, is processed, decoded and wirelessly transmitted to a smart mixed reality (MR) based head mounted display (HMD) 500, from where the operator designates the virtual target. In a preferred embodiment, the TDR 200 is provisioned with a computing module 250 connected thereto over a wired or wireless connection.
In an embodiment, the tracking information with respect to the one or more intended or locked targets is continuously received at the computing module 250 in real time at an intermittent gap of approx. 1-3 secs. However, for fast-moving dynamic airborne threats/targets such as highly manoeuvrable aircraft, supersonic cruise missiles, tactical and strategic ballistic missile re-entry vehicles, other unmanned aerial vehicles (UAVs), etc., a time gap of 1-3 sec is of invaluable significance. The reason being that the target 50 position will be displaced from its predicted position by a considerable distance (say, to an extent of a mile) within a few seconds, which may bring immense uncertainty in target localization, particularly for target engagement.
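A quick back-of-the-envelope check of this displacement claim, assuming a Mach 3 target and a sea-level speed of sound of roughly 343 m/s:

```python
# A supersonic target at Mach 3 covers roughly a mile between
# two radar updates spaced 1-3 seconds apart.
MACH_1_MPS = 343.0            # speed of sound at sea level, approx.
speed = 3 * MACH_1_MPS        # ~1029 m/s for a Mach 3 target
for gap_s in (1, 2, 3):
    miles = speed * gap_s / 1609.34
    print(f"update gap {gap_s} s -> displacement {miles:.2f} miles")
# update gap 3 s -> displacement ~1.92 miles
```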
In some specific embodiments, e.g. military or warfare applications, such an unpredictable nature of target displacement by a mile or more may prove disastrous if not effectively monitored and tracked. Hence, continuous tracking and tracing of these targets in a relatively short time frame (e.g. seconds) becomes quintessential in such war-like scenarios. In a preferred embodiment, the present disclosure attempts to provide a solution by way of predicting the position of such threatening targets continually and accurately throughout, for exact aiming, interception and destruction by the weapon.
In an example scenario, the dynamics of two warfare weapons are explained for clarity and understanding. To strike the target on the first shot, the guided or unguided trajectory of the weapon system must be accurately controlled. This is very much dependent on knowing the exact or probable position of the target threat or its deviation from a predetermined course. For the sake of simplicity and explanation, let's consider two kinds of weapon system:
A) Guided Airborne Ranged Weapons: The popular example that belongs to this warfare system is a missile (also referred to as a guided rocket) that is capable of self-propelled flight based on computation of changes in position, velocity, altitude, and/or rotation rates of a moving target and/or altitude profile based on information about the target's state of motion. The missile is guided on to its target based on the missile's current position and the position of the target, and computation of the course between them to acquire the target.
B) Unguided Weapons: This refers to any free-flight missile type having no inbuilt guidance system, e.g. rockets. For such systems, instructions have to be relayed based on commands transmitted from the launch platform for course correction.
To achieve an accurate course and target location, here, at the computing module 250, the trajectory data of the selected target 50 is received and processed in real time for trajectory path synthesis via interpolation, which enables the operator to fire at any given point of time without having to wait to know the exact location of the target from the radar 100. The target trajectory path synthesised at the computing module 250 provides the operator with a confident position of the target 50 at any instance, thereby making the overall operation of target aiming and firing more precise and accurate. Further, as will be acknowledged by those operating at field level, the hit rate in such dynamic war-like situations is not very optimal. However, with the virtual aid of the present system 1000, the target's probable position can be predicted a few milliseconds ahead of time, which ensures more directed aiming and hitting of the target 50.
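A minimal sketch of such trajectory path synthesis, assuming a low-order polynomial fit over recent radar fixes (the disclosure does not fix a particular interpolation scheme):

```python
# Past radar fixes are fitted so a position estimate is available between
# (and slightly ahead of) updates.
import numpy as np

# timestamps (s) and 3-D positions (m) of the selected target from the TDR
t = np.array([0.0, 2.0, 4.0, 6.0])
xyz = np.array([[0, 0, 1000], [600, 40, 1020],
                [1210, 90, 1045], [1830, 150, 1075]], dtype=float)

# fit each coordinate as a quadratic function of time
coeffs = [np.polyfit(t, xyz[:, k], deg=2) for k in range(3)]

def predicted_position(query_t: float) -> np.ndarray:
    """Synthesised position at any instant, including a short look-ahead."""
    return np.array([np.polyval(c, query_t) for c in coeffs])

print(predicted_position(5.0))    # between two radar updates
print(predicted_position(6.05))   # a few tens of ms ahead of the last fix
```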
In one illustrative embodiment, the target trajectory path is interpolated and the future path is predicted by the computing module 250. The interpolation and prediction of the target trajectory is based on historical data of the track traced by the selected target 50 from the instance of its detection by the peripheral target observation device 100. The selected target 50 is observed for its flight path, which is influenced by numerous factors. The parameter values representative of the target's inherent aerodynamics (mass, moment of inertia, drag coefficients etc.), geometry, design, and the immediate environment of the target, such as air pressure, air temperature, wind velocity, humidity, etc., govern the particular trajectory traversed by the target. The computing module 250 predicts the trajectory path of the selected target 50 as a function of time from the instance of its detection by the peripheral target observation device 100.
In accordance with one working example of the present embodiment, the computing module 250 is configured to predict position, target velocity, target trajectory and direction estimates of the selected target 50 using a combination of tracking filters. These tracking filters can be selected from a group comprising the Extended Kalman Filter (EKF), Kalman filter (KF), Particle filter or Bayesian filter, and a backpropagation trained neural network model, for detecting the trajectory path and predicting the future position of the target 50 based on historical data of such target 50 captured from continuous frames of video of the surrounding environment.
The computing module 250 is trained in real time with data pertaining to the trajectory traced in a predetermined period of time travelled by the target 50 for continual autonomous tracking. In one example embodiment, the Extended Kalman Filter (EKF), Kalman filter, Particle filter or Bayesian filter is made operable by taking velocity and position estimates for a target 50 and then predicting where the target 50 will be in the next frame or instance. The actual position of the target 50 in the next video frame is then compared with the predicted position, and the velocity, position and orientation estimates are updated.
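A minimal sketch of this predict/compare/update cycle, using a 1-D constant-velocity Kalman filter for brevity (the embodiment's EKF/KF would track full 3-D position, velocity and orientation states, and the noise values here are assumptions):

```python
import numpy as np

dt = 1.0
F = np.array([[1, dt], [0, 1]])      # state transition: [position, velocity]
H = np.array([[1, 0]])               # only position is measured
Q = np.eye(2) * 0.01                 # process noise (assumed)
R = np.array([[25.0]])               # measurement noise (assumed)

x = np.array([[0.0], [900.0]])       # initial position (m) and velocity (m/s)
P = np.eye(2)

for z in [905.0, 1812.0, 2702.0]:    # radar position fixes, one per second
    # predict where the target will be in the next frame or instance
    x = F @ x
    P = F @ P @ F.T + Q
    # compare the actual fix with the prediction and update the estimates
    y = np.array([[z]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    print(f"fix {z:7.1f} m -> est. pos {x[0,0]:7.1f} m, est. vel {x[1,0]:6.1f} m/s")
```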
In accordance with one working embodiment, the Kalman filter (or another variant of the Bayesian filter) may be executed along with other complementary filters - Bayesian/Markov methods that are used to fuse data obtained from one or more sources. In another embodiment, the real-time coordinate position of a dynamically moving target is captured based on a backpropagation (BP) neural network model, or any other preferred neural network model, whereby the track data of the moving target is learned and the model trained for target future track prediction. In general, the BP neural network model includes the input layer, hidden layer, and output layer, and the network propagates backward and constantly adjusts the weights between the input layer and the hidden layer, and the weights between the hidden layer and the output layer, to minimize errors.
The challenging and uncertain scenario of determining target movement is accomplished using a BP neural network that can realize any non-linear mapping from the m-dimensional input to the n-dimensional output to better fit the non-linear curve according to the target historical track data, thus improving the track prediction performance. In one working embodiment of the present disclosure, the data obtained regarding the target track is pre-processed to eliminate outliers and biases in the data, which results in improved prediction accuracy of the track. The data is then normalized to reduce the influence of maximum and minimum values during prediction by the neural network, and to improve the computation speed of the neural network. In this regard, the system architecture is universal and not tied to any specific learning algorithm, although certain learning algorithms may be beneficial in certain applications.
Post processing of the data, the BP neural network model is constructed, where the network is first initialized, including initialization of the weights and thresholds; the number of layers and neurons in each layer, the types of transfer functions, the model training algorithms, the number of iterations, and the training objectives are defined for each layer. Now, the predicted value of the track is obtained, which most accurately and robustly determines the target motion in real time/near real time.
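A minimal sketch of such a BP network, assuming a single hidden layer, tanh transfer functions, min-max normalization and a synthetic track history; layer sizes, learning rate and iteration count are illustrative choices, not values prescribed by the disclosure:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy track history: last 3 positions (input) -> next position (target)
X = rng.uniform(0, 5000, (200, 3))
y = (X @ np.array([[0.2], [0.3], [0.5]])) + 150.0   # synthetic "track" rule

# normalize to reduce the influence of extreme values, as described
x_min, x_max = X.min(0), X.max(0)
y_min, y_max = y.min(), y.max()
Xn, yn = (X - x_min) / (x_max - x_min), (y - y_min) / (y_max - y_min)

W1, b1 = rng.normal(0, 0.5, (3, 8)), np.zeros(8)    # input -> hidden
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)    # hidden -> output
lr = 0.5

for epoch in range(2000):
    h = np.tanh(Xn @ W1 + b1)                       # forward pass
    out = h @ W2 + b2
    err = out - yn                                  # prediction error
    # backward pass: adjust hidden->output, then input->hidden weights
    dW2 = h.T @ err / len(Xn); db2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    dW1 = Xn.T @ dh / len(Xn); db1 = dh.mean(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# de-normalized prediction for a new 3-fix window
probe = (np.array([[1000., 1400., 1800.]]) - x_min) / (x_max - x_min)
pred = (np.tanh(probe @ W1 + b1) @ W2 + b2) * (y_max - y_min) + y_min
print(pred)
```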
In accordance with one noteworthy embodiment, it is to be understood that the TDR 200 receives the tactical data in its own frame of reference, as mentioned above. Now, a coordinate transform has to be carried out for transforming the target 50 position from the TDR's 200 frame of reference to the HMD's frame of reference for parallax correction, particularly in scenarios where the operator or wearer of the HMD 500 is distantly positioned from the TDR 200. Thus, next, the parallax shift between the two frames of reference (TDR 200 and HMD 500) is corrected to enable the HMD 500 to view the target 50 from the TDR's frame of reference. The computing module 250, thus, determines correction data from the TDR's frame of reference to that of the HMD's by way of computing 6dof pose correction data in real time, which comprises the translation and orientation offset between the TDR 200 and the HMD 500.
The correction data is now transmitted to a processing module 560, which is typically hosted at the HMD's 500 wearer end. The processing module 560 now performs the coordinate transformation of the tactical data such that the trajectory of the selected target 50 is transformed, as explained here below in detail.
For simplicity of computation and reference purposes, let's consider the TDR 200 as source S, the aerial target 50 as P, the mixed reality-based HMD 500 as H, and the weapon system 600 as G. Referring to Fig. 2, tactical three-dimensional information pertaining to the aerial target P is received from the TDR S in a spherical coordinate system. Symbolically, the transformation of an object a w.r.t. b, written \(T_a^b\), has been represented hereto as a 4x4 matrix:
\[
T_a^b = \begin{bmatrix} R_{11} & R_{12} & R_{13} & x \\ R_{21} & R_{22} & R_{23} & y \\ R_{31} & R_{32} & R_{33} & z \\ 0 & 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} R_{3\times3} & \mathrm{Trans}_{3\times1} \\ 0_{1\times3} & 1 \end{bmatrix},
\]

where \(R_{3\times3}\) = Rotation Matrix and \(\mathrm{Trans}_{3\times1}\) = Translation vector.
Therefore, using above notations
Therefore, using above notations, a 3, a 3--dimensionaldimensional pose of aerial targetpose of aerial target 50 50 (P)(P) w.r.t. TDR w.r.t. TDR 200 (S) 200 (S) is is shown asshown as
.
. Next, Next, the the position of MR based position of MR based HMD HMD 500 (H) 500 (H) with respect to with respect to TDRTDR 200 (S)200 (S), , representedrepresented asas
is computed is computed 15 usingusing a combination of positioning methodsa combination of positioning methods. In . In particular aspects, real particular aspects, real timetime or near real timeor near real time dynamic location data of selected target dynamic location data of selected target 50 50 is is obtained using obtained using global positioning system (GPS)global positioning system (GPS) and inertial measurement and inertial measurement units (IMUs) readings units (IMUs) readings e.g.,e.g., its coordinates, trajectory, attitude, heading, its coordinates, trajectory, attitude, heading, etc. These readings are complemented with etc. These readings are complemented with realreal--time kinematictime kinematicss (RTK)(RTK) 20 GPS GPS to increase the accuracy of position data derived from the GPS / IMU to increase the accuracy of position data derived from the GPS / IMU readings.readings. ThThe RTK usually relies on a single observation reference point e RTK usually relies on a single observation reference point or interpolated virtual point to provide realor interpolated virtual point to provide real--time correction, thereby time correction, thereby providing providing mmmm--level accuracylevel accuracy under various operating conditionsunder various operating conditions. . Therefore, the Therefore, the HMD 500HMD 500 equipped with GPS/RTK and IMUequipped with GPS/RTK and IMU to obtain to obtain 25 absolute position absolute position of the target 50 of the target 50 along with along with true north bearing true north bearing and and
17
azimuth values
azimuth values for orientation offset correctionfor orientation offset correction between the TDR 200 and between the TDR 200 and HMD 500 frame of referencesHMD 500 frame of references..
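By way of a minimal sketch (assuming NumPy; the function names are hypothetical and not part of the specification), a spherical radar return can be converted to Cartesian coordinates and packed into the 4x4 homogeneous form T = [R Trans; 0 1] described above:

    import numpy as np

    def spherical_to_cartesian(r, azimuth, elevation):
        """Convert a radar return (range, azimuth, elevation) in the TDR's
        spherical frame to Cartesian coordinates in the same frame.
        Angles in radians; the axis convention is an illustrative assumption."""
        x = r * np.cos(elevation) * np.cos(azimuth)
        y = r * np.cos(elevation) * np.sin(azimuth)
        z = r * np.sin(elevation)
        return np.array([x, y, z])

    def make_pose(R, t):
        """Assemble the 4x4 homogeneous transform T = [R t; 0 1] from a
        3x3 rotation matrix R and a 3-vector translation t."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T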
In one example embodiment, GPS/GNSS RTK is enabled via a base station hosted at the end of the computing module 250 and a rover hosted at the end of the processing module 560. Here, the base station transmits its absolute position and 6dof pose partial correction data associated with the RTK rover to the processing module 560. Next, the RTK rover hosted at the end of the processing module 560 is configured to compute the position of the TDR 200 with respect to the HMD 500 for parallax correction, based on the received absolute position of the base station, the 6dof pose partial correction data of the rover and the GPS readings of the rover. Thus, with the combination of GPS and RTK positioning systems, positional data down to millimetre-resolution accuracy may be obtained. However, it should be understood that the concepts disclosed in the present disclosure are capable of being implemented with different types of systems for acquiring accurate global position data and are not limited to the specific types and numbers of such devices described and depicted herein.
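As a rough sketch of this base/rover geometry (illustrative only; real RTK applies carrier-phase corrections rather than a simple position offset, and the names here are hypothetical):

    import numpy as np

    def tdr_to_hmd_translation(base_abs_pos, rover_raw_pos, rtk_correction):
        """Apply the base station's correction to the rover's raw GNSS fix,
        then form the baseline from the TDR (at the base station) to the
        HMD (at the rover). Inputs are 3-vectors in a shared Earth-fixed
        frame; this gives the Trans component of T^S_H."""
        rover_corrected = rover_raw_pos + rtk_correction
        return rover_corrected - base_abs_pos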
Now, in order to get the 3-dimensional pose of the aerial target 50 with respect to the MR based HMD 500, represented as T^H_P, the following equation is applied:

\[
T^{H}_{P} = T^{H}_{S} \times T^{S}_{P}
\]

This implies that the offset correction, or parallax deviation correction, to overlay the virtual target on the real target may be represented as:

\[
T^{H}_{P} = (T^{S}_{H})^{-1} \times T^{S}_{P} \qquad (1)
\]

where T^H_S = the 3-dimensional pose of the TDR 200 w.r.t. the MR based HMD 500.

As explained above, for generating T^H_S, the translation component is obtained from a combination of GPS, RTK and IMU readings, while the range of rotation is computed using the true north bearing and IMU readings.
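As a hedged illustration of equation (1) (NumPy assumed; the function name is hypothetical), the parallax-corrected target pose in the HMD frame reduces to one matrix inverse and one product:

    import numpy as np

    def target_pose_in_hmd_frame(T_S_H, T_S_P):
        """Equation (1): T^H_P = (T^S_H)^-1 @ T^S_P.
        T_S_H: 4x4 pose of the HMD w.r.t. the TDR frame.
        T_S_P: 4x4 pose of the target w.r.t. the TDR frame.
        Returns the target pose in the HMD frame, i.e. the parallax-corrected
        pose used to overlay the virtual target on the real one."""
        return np.linalg.inv(T_S_H) @ T_S_P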
In a next exemplary embodiment, referring to Fig. 3, tactical information such as the 3dof position of the aerial target 50 is rendered over the mixed reality based HMD 500 in the form of a hollow blip at all times. Thus, a virtual target is spawned in the HMD 500 along with visual directional cues to identify and locate the spawned target 50 in the airspace. However, in one exemplary embodiment, for the wearer of the HMD 500, who may also be the operator of the weapon system 600, the virtual target viewable through the HMD may have to be aligned for accurate aiming and shooting. In general, aligning the target with that of the weapon is referred to as zeroing in of the weapon 600.
Weapon zeroing in is one of the most essential principles underpinning the effective locking in of the intended target. It involves setting and calibrating the sights to enhance firing accuracy. This zeroing process is one of the most critical elements of accurate target engagement. To accurately engage targets, the strike of a bullet must coincide with the aiming point (Point of Aim/Point of Impact) on the target 50. In general, in order to effectively use a weapon 600, it is imperative for the weapon 600 to be properly calibrated and zeroed in with the sight for securing enhanced functional and operational survivability in a dynamic, hostile situation such as a battlefield.
However, in the present context, the real target is viewable to the operator as a virtual target displayed over his worn HMD 500. For calibration purposes, the frame of reference of the weapon system 600 is required to be aligned with the HMD’s 500 frame of reference, where such transformation between the two frames is computed using equation 1. Now, choosing the MR based HMD 500 as the frame of reference, the 3-dimensional pose of the weapon system 600 ‘G’ with respect to the HMD glasses 500 ‘H’ is required to be computed as T^H_G.
In one explanatory embodiment, T^H_G values can be computed using a combination of proximity sensors and IMUs that can be strategically arranged on at least one side of the HMD 500 that is adjacent to the weapon system 600. Likewise, another set of IMU and proximity sensors can be arranged on the weapon system 600 that is held in the vicinity of the operator wearing the HMD 500. Further equivalents, alternatives and modifications of the above computation are also possible, as would be recognized by those skilled in the art. Using similar logic as above, the pose of the aerial target 50 ‘P’ with respect to the weapon system 600 ‘G’ can be found using the following equation:

\[
T^{G}_{P} = (T^{H}_{G})^{-1} \times T^{H}_{P} = T^{G}_{H} \times T^{H}_{P}
\]
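A short sketch of this chain under the same assumptions as the earlier snippets (NumPy; the +Z boresight convention and the helper names are assumptions, not from the specification); the angular error could feed the aiming cue rendered in the HMD:

    import numpy as np

    def target_pose_in_weapon_frame(T_H_G, T_H_P):
        """T^G_P = (T^H_G)^-1 @ T^H_P: express the target pose in the
        weapon frame once both poses are known in the HMD frame."""
        return np.linalg.inv(T_H_G) @ T_H_P

    def aim_error_deg(T_G_P, boresight=np.array([0.0, 0.0, 1.0])):
        """Angle between the weapon's boresight axis (assumed +Z in the
        weapon frame) and the line of sight to the target."""
        to_target = T_G_P[:3, 3]
        cos_angle = to_target @ boresight / np.linalg.norm(to_target)
        return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))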
Once the alignment of these frames of reference is successfully achieved, the wearer of the HMD 500 views a virtual target 50’ spawned in the HMD 500 along with visual directional cues to identify and locate the spawned target 50 in the airspace. As can be seen in Fig. 3, the virtual target 50’ can be located as a virtual blip that is, in principle, overlaid on the real target 50. In one example embodiment, the wearer is rendered with virtual situational cues, along with the virtual blip indicating the target’s 50 current position on the display, for accurate engagement of the real target 50.
In one implementation, the HMD 500 may use a raycasting technique to determine the path that has to be travelled by the ammunition fired from the weaponry system 600. In various implementations, the raycasting technique can include casting a thin bundle of rays with substantially no width, or a ray with substantial width (e.g., a cone). The operator thus views the virtual ray emanating from the weapon system 600 via the HMD 500, such that the virtual target 50’ coincides with the virtual ray/reticle for accurate aiming and firing at the real target 50, which is overlaid with the virtual target 50’.
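A minimal sketch of such a widened-ray test, assuming the target pose T^G_P in the weapon frame from the previous snippet and a +Z boresight convention (both assumptions; the function name is hypothetical):

    import numpy as np

    def cone_contains_target(T_G_P, half_angle_deg, boresight=np.array([0.0, 0.0, 1.0])):
        """Does the line of sight to the target, expressed in the weapon
        frame, fall inside a cone of the given half-angle about the
        boresight axis? True means the reticle can be shown as coinciding
        with the virtual target."""
        to_target = T_G_P[:3, 3]
        cos_angle = to_target @ boresight / np.linalg.norm(to_target)
        return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))) <= half_angle_deg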
In another alternative scenario, the virtual blip may take the form of virtual lasers and virtual crosshair pointers for target sighting, tracking, locking and engagement, rendered as adjustable overlays over the MR based glasses of the HMD 500; these enable target sighting and locking, weapon deployment and Beyond Line of Sight (BLOS) capability.
To achieve the above features, the overall schema of HMD functionality is partitioned into hardware and software domains. In one embodiment, the dedicated hardware provides a Mixed Reality (MR)-based HMD 500, the HMD having at least MR glasses 510, a communication module 520, one or more cameras 530, a display unit 540, an audio unit 550, a processing module 560, and a plurality of dock-able sensors 570 (a, b, c…n) mounted on one or more external devices 700 wirelessly connected with the HMD 500.
The one or more external devices 700 are selected from external cameras, a weapon firing system, an aiming device installed on a handheld weapon system 600 such as a gun, rifle, etc., unmanned aerial vehicles (UAVs), other HMDs and external computing devices. Further, the processing module 560 is configured to receive data from the one or more cameras 530 and/or the one or more dock-able sensors 570 on the one or more external devices 700, the data being accumulated with respect to the selected target 50, including the target’s surrounding environment, situational awareness, navigation, binocular vision, weather conditions and presence of objects and humans. In one additional embodiment, this data is processed by the processing module 560 to further determine information related to target detection, IFF (Identification of Friend or Foe), locations of targets and team-mates, velocity and distance estimation, weapon information, etc.
In another aspect of the present invention, the method provides virtual aid in the form of situational cues for identifying and locating the aerial target 50 and aiding in actual engagement with the target 50, using virtual sights in the form of a virtual laser pointer or laser blip with a raycasting technique, as described below:
1. The object observation device (radar) 100 provides the world coordinates of the real target 50 in the spherical coordinate system. A virtual target 50’ is spawned in the mixed reality head mounted device (HMD) 500 along with the visual directional cues to find the spawned target in the airspace.
2. A virtual gaze in the form of a crosshair is overlaid in the mixed reality head mounted device 500 vision.
3. The barrel of the gun/weapon system 600 is tracked to give a six degree of freedom (6dof) position in world coordinates using precise object tracking.
4. The position of the weapon 600 and its aiming direction are shown in the mixed reality head mounted device 500 in the form of a virtual sight/ray-cast.
5. The virtual alignment would encompass aligning the HMD 500 gaze crosshair with the target overlay, and aligning the virtual sight with the HMD gaze crosshair and the target. As all three overlays are in real-world coordinates, these can be used as an aid for the actual aiming and engaging with the target 50.
6. The alignment and the shape of the ray-cast take into consideration the weapon specifications, the current positions of the target, and the related ballistic calculation for perfect zeroing.
7. Depth occlusion and mapping using the mixed reality HMD 500 may be used for finding the intersection of the ray-cast and the virtual target that overlays the real target. This intersection is used to render a virtual overlay like a bullseye. This is an indication that the required alignment has been achieved. Different shapes and colors may denote the confidence/probability of hitting the target as computed by the system (see the sketch after this list).
8. The specific HMD 500 integrates an Inertial Measurement Unit (IMU), GPS and optical sensors. Using these sensors, the 6dof position of the head as well as the true North direction is computed accurately.
9. The unique graphical user interface (GUI) for the system shows the elevation and azimuth relative to the true North of the sight/head. The GUI also shows cues in the form of directional arrows which help to locate the target. The GUI is customized according to the weapon system that it is to be used for. The system might or might not be used with radar based systems for target tracking. The GUI shows the IFF values (differentiation between friendly and foe targets), target IDs, target current velocity, heading and position. The GUI is equipped with warning systems to indicate if a certain target is within different range limits. Digital zooming and toggling on-off different GUI features are supported. The GUI can show different situational awareness information like weather condition, ammunition status, information about systems in the same troop, thermal/night vision feed, tank detection, vehicle detection, human detection etc.
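The following sketch illustrates the confidence-to-color mapping mentioned in step 7; the thresholds and colors are illustrative assumptions, not values from the specification:

    def bullseye_color(hit_probability):
        """Hypothetical mapping from the system-computed probability of
        hitting the target to the color of the bullseye overlay."""
        if hit_probability >= 0.9:
            return "green"   # alignment achieved, high confidence
        if hit_probability >= 0.5:
            return "yellow"  # partial alignment, keep adjusting
        return "red"         # low confidence, realign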
In one important aspect of the present disclosure, the system is made effectively operable for all weather conditions. With the overlaying of the virtual target 50’ over the real target 50, displayed as a hollow blip, all-time visibility and tracking of the real target 50 is achieved even under bad weather conditions (fog, smoke, smog, clouds, etc.). In another implementation, situational cues may appear in front of the HMD display 500 in a form that neither occludes the objects on display nor distracts the wearer from the content being shown. Thus, these cues do not block the wearer’s vision, and the system does not inadvertently emphasize cues that may appear obstructive to the user’s clear aiming of the target 50.
In accordance with another significant aspect of the disclosure, the HMD 500 is provided with an advanced optical element to reduce glare from sunlight or another source of bright light autonomously, without requiring any manual intervention to adjust the amount of light being allowed to pass through. Generally, under scenarios where transparent HMD glasses are used, the virtual overlays in the display unit of the HMD 500 are rendered translucent to transparent against light backdrops, making their visibility painfully difficult. In order to overcome this limitation, the glasses may be coated with an advanced optical element or film that can autonomously switch visibility parameters of the HMD with dynamically changing outside weather conditions.
In one working embodiment, one or more ambient light sensors are provisioned on the HMD 500 that gather data on the surrounding outside weather conditions and input it to the optical element of the HMD for electric stimulation and, eventually, glass tint/opacity modulation. Precisely, the optical element herein comprises an electrochromic element 590 that is configured to monitor, adjust and limit the light by way of changing its opacity and/or colour in response to electric stimulation, such as the application of a voltage generated in response to input received from the ambient light sensors. For example, the higher the voltage, the greater the opacity of the HMD glasses 500 is made, for clear and distinct viewing of the virtual overlays. In one working embodiment, such an electrochromic element may comprise an entire region of the optical element or be present only in some portion thereof. It is however appreciated that such specific configurations are merely illustrative and not intended to be limiting.
Accordingly, the electrochromic element 590 may be electrically actuated, which results in an increase in the opacity of the HMD 500. Here, the degree or level of opacity may be determined based on a plurality of parameters such as the duration and/or amplitude and/or form and/or frequency of the applied electrical signal. The change in opacity refers to changing a colour, shade, hue, gamma, clarity, transmittance, light scattering, polarization, other optical characteristics, attack time, decay time, shape, outline, pattern, or size of said at least one region. In another event, the HMD 500 may be quickly returned to its non-opaque condition in seconds.
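A minimal control sketch, assuming a lux-reading ambient light sensor and a linear voltage-to-opacity response (both assumptions; the parameter names are hypothetical):

    def drive_electrochromic(ambient_lux, v_max=5.0, lux_full_tint=50000.0):
        """Map the ambient light reading to a drive voltage for the
        electrochromic element 590: brighter surroundings, higher voltage,
        greater opacity, so virtual overlays stay visible against glare."""
        level = min(ambient_lux / lux_full_tint, 1.0)  # normalize to 0..1
        return v_max * level                            # voltage to apply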
In one other aspect of the present invention, the HMD 500 is configured with a display unit 540 and a microphone unit 550 to provide an operator with a visual or audible warning that is activated based on target range, target speed, target type, target velocity and trajectory, IFF displayed, visual GUIs, target direction and range, lethality of target by operator (thresholds as per weapon type/system), weapon-specific instructions, etc. The audio and video cautions and cues are provided to the operator for taking corrective action, and in one other embodiment, an audio tone may be set with visual changes in symbology from a non-flashing to a flashing bright red alert.
The audio alert enables a uniquely tailored response by the operator to any event or required action by integrating a definable range of alert inputs with the audio alert notification, for the ultimate in situational awareness and response. The audio feature is integrated with pre-recorded, optimized messages (which may be voice messages, tones or any other audible or visual triggers to signal) to allow the operator to trigger the output upon the breach of any target-associated rules.
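A hedged sketch of such rule-based triggering (the data layout and names are assumptions for illustration; here a rule breaches when the value drops to or below its threshold, e.g. a target closing within a range limit):

    def evaluate_alerts(target, rules):
        """`target` maps tactical keys to values, e.g. {"range_m": 1200};
        `rules` maps the same keys to (threshold, message) pairs. Returns
        the pre-recorded messages to play for each breached rule."""
        triggered = []
        for key, (threshold, message) in rules.items():
            if key in target and target[key] <= threshold:
                triggered.append(message)
        return triggered

    # Example: warn when a target closes within 1500 m.
    alerts = evaluate_alerts({"range_m": 1200},
                             {"range_m": (1500, "Target within range limit")})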
EXAMPLES
The present invention is described hereinafter by various embodiments. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.
In accordance with an exemplary embodiment of the present invention, the system comprises a plurality of computing devices and a plurality of dock-able sensors mounted on a military grade AR headset, operated by users using military grade Mixed Reality (MR) glasses. The system comprises one or more image capturing modules, one or more RGB cameras, ToF (time of flight) or depth cameras, and IR stereoscopic cameras. The plurality of computing devices comprises, but is not limited to, a microphone, a speaker, a user interface, and an artificial intelligence module. Further, the computing devices include a plurality of electronic components such as a microprocessor, a memory unit, a power source, and a user interface. The user interface may be activated or utilized by the user by pressing a button or hovering the hand and/or other body parts or providing audio input and/or tactile input through one or more fingers. The plurality of computing devices may be one or more of, but not limited to, a wearable device such as a Head Mounted Device (HMD) or smart eyewear glasses. Further, the one or more dock-able sensors include, but are not limited to, threat detection sensors, infrared sensors, night-vision sensors, a thermal sensor, IFF (identification friend or foe), Lidar (Light Detection and Ranging), Blue Force Tracking, SONAR and GPS.
In accordance with an exemplary embodiment of the present invention, the plurality of computing devices may include, but is not limited to, a wearable device such as a Head Mounted Device (HMD) or smart eyewear glasses. The plurality of computing devices is envisaged to include computing capabilities such as a memory unit configured to store machine readable instructions. The machine-readable instructions may be loaded into the memory unit from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs, and Flash Drives. Alternatively, the machine-readable instructions may be loaded in the form of a computer software program into the memory unit. The memory unit in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory.
In accordance with an exemplary embodiment of the present invention, the military-grade headset includes, but is not limited to, one or more glasses, one or more image capturing modules, one or more IR stereoscopic cameras, one or more RGB cameras, ToF or depth cameras, one or more microphones, and one or more speakers. The one or more glasses, image capturing module, RGB cameras, ToF or depth cameras, IR stereoscopic cameras, Inertial Measurement Unit (IMU), microphone, and speaker are operatively connected. The one or more glasses are configured to provide 60 degrees of field of vision, which provides a wider field of vision. In another aspect, the system coupled with the glasses provides the user images and videos of targets and locations beyond the line of sight.
In accordance with an embodiment of the present invention, the MR glasses are military grade and made of a material selected from a group comprising polycarbonate, aluminium alloy and rubber polymer.
In accordance with an embodiment of the present invention, the MR glasses are provided with UV protection and shock-proof capability with anti-scratch, anti-fog coating and electrochromic coating, with which the transparency of the MR glasses is changed from dark shades to no tint, automatically or manually, based on surrounding lights and to adjust the clarity of holograms, in order to withstand different conditions.
In accordance with an embodiment of the present invention, the one or more dock-able sensors include threat detection sensors, infrared sensors, night vision sensors, a thermal sensor, IFF (identification friend or foe), Lidar (Light Detection and Ranging), Blue Force Tracking, SONAR and GPS.
In accordance with an embodiment of the present invention, the HMD is operated using one or more of physical buttons, hand-gestures, voice commands and gaze-tracking for interaction.
In accordance with an embodiment of the present invention, wireless communication is enabled between the HMD and one or more external devices selected from external cameras, a weapon firing system and an aiming device installed on a handheld gun, using the communication module; and sufficient information is received for target sighting, locking and engagement, which does not require information from the radar.
In accordance with an embodiment of the present invention, the HMD MR glass can connect to the existing physical sight of the weapon firing system, so the user can switch from the virtual sight to the actual view of the physical sight in the glass user interface itself. This gives an additional benefit of using the physical sight as a handheld camera to look beyond corners without endangering the user himself.
In accordance with an embodiment of the present invention, the information received from the one or more external devices includes one or more of live feed from external cameras and UAVs, live and enhanced satellite images, information from the radar, weapon information, locations and audio-visual data from other HMDs, and audio or video information from external computing devices.
In accordance with an embodiment of the present invention, the processing module is configured to project information received from the radar directly to the MR glasses and enable a user to lock and engage the target without any additional human intervention.
In accordance with an exemplary embodiment of the present invention, the user interface is provided to enable the user to navigate between various mixed reality information overlays and use the sensor data and information in the most efficient manner as per the user’s requirement, without it being a hassle to the user. The exemplary user interface includes, but is not limited to, one or more buttons, a gesture interface, an audio interface, a touch-based interface, an eye-tracking interface that tracks gaze and focus, an EEG-based Brain-Computer Interface, and the like.
In accordance with an exemplary embodiment of the present invention, the system provides information visualization, an intuitive interface, and non-intrusive and adjustable overlays.
The exemplary method of working of the system is discussed below. The method starts when the one or more IR stereoscopic cameras of the system described above, along with the other dock-able sensors and the microphone, capture the audio, visual, and situational data. The dock-able sensors are used to sense the situation around the user. The information read by the dock-able sensors alerts the user about the threat. The data captured by the camera and the dock-able sensors is sent to the computing system for intelligently processing the data and giving an assessment of the condition around the user.
It should be understood that the techniques of the present disclosure might be implemented using a variety of technologies. For example, the methods described herein may be implemented by a series of computer-executable instructions residing on a suitable computer-readable medium. Suitable computer-readable media may include volatile (e.g. RAM) and/or non-volatile (e.g. ROM, disk) memory, carrier waves, and transmission media. Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
It should also be understood that, unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "controlling" or "obtaining" or "computing" or "storing" or "receiving" or "determining" or the like refer to the action and processes of a computer system, or similar electronic computing device, that processes and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Various modifications to these embodiments are apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings, but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is anticipated to hold on to all other such alternatives, modifications, and variations that fall within the scope of the present invention.

CLAIMS:We Claim:
1) A system 1000 for tracking and locking one or more targets 50, characterized in utilizing a wearable mixed reality based head mounted display (HMD) 500, the system 1000 comprising:
a peripheral target observation device 100 configured to obtain tactical data of the one or more targets 50;
a target data receiver (TDR) 200 configured to:
receive and process the tactical data of the one or more targets 50 received from the peripheral target observation device 100; and
select at least one of the one or more targets 50 based on a plurality of predetermined factors;
a computing module 250 communicatively coupled with the TDR 200 configured to receive and process trajectory data of the selected target 50 and determine correction data from TDR’s 200 frame of reference to HMD’s 500 frame of reference for transmission to a processing module 560;
the processing module 560 configured to perform coordinate transformation of the tactical data such that trajectory of the selected target 50 is transformed from the TDR’s 200 frame of reference to that of HMD’s 500 frame of reference;
the wearable mixed reality based HMD 500 configured to render a virtual target based on the transformed trajectory data, and overlay the virtual target 50’ on the selected target 50 in a three dimensional space; and
a weapon system 600 having a frame of reference aligned with that of the HMD’s 500 frame of reference, wherein the weapon system 600 is manoeuvred such that a virtual reticle emanating from the weapon system 600 and viewed from the HMD 500 coincides with the virtual target 50’ for accurate aiming at the selected target 50 so overlaid with the virtual target 50’.
2) The system 1000, as claimed in claim 1, wherein the peripheral object observation device 100 is configured to receive a reflected wave of an irradiated wave from the one or more targets 50 existing at an irradiation destination.
3) The system 1000, as claimed in claim 1, wherein the tactical data of the one or more targets 50 received by the target data receiver (TDR) 200 is in a spherical coordinate system in the peripheral object observation device’s 100 frame of reference.
4) The system 1000, as claimed in claim 1, wherein the TDR 200 is configured to select the one or more targets 50 based on the plurality of predetermined factors comprising priority and estimation (PSQR) of target, target type, speed and distance of the target, lethality levels and the like.
5) The system 1000, as claimed in claim 1, wherein the target data receiver 200 is configured to process, decode and extract the tactical data to obtain a unique identifier information of the one or more targets 50 assigned thereto by the peripheral object observation device 100, radial distance and azimuthal angle of the target 50 from the peripheral object observation device 100, perpendicular height of the target 50 from the ground plane or base, heading angle of the target 50, speed of the target 50, IFF (identification of friend or foe) code of target 50, WCO (weapon control orders) code of the target determined or computed by peripheral object observation device 100, advanced target position, orientation and relative velocity of the target 50.
6) The system 1000, as claimed in claim 1, wherein the wearable mixed reality HMD 500 further comprises of:
at least mixed reality glasses 510;
a communication module 520;
one or more cameras 530;
a display unit 540;
an audio unit 550;
a plurality of dockable sensors 570 configured to gather data related to the target, comprising target’s surrounding environment, situational awareness, navigation, binocular vision, weather conditions and presence of objects and humans; and
a processing module 560 configured to receive target data from the one or more cameras 530 and the plurality of dockable sensors 570; and process the received target data to determine target location, IFF (Identification of Friend or Foe), velocity & distance estimation, weapon information.
7) The system 1000, as claimed in claim 1, wherein the computing module is configured to receive and process the trajectory data of the selected target 50 in real time for interpolating the target trajectory path and prediction of future path, wherein the interpolation and prediction of the target trajectory is based on historical data of track traced by the selected target 50 for a predetermined time period from instance of its detection by the peripheral target observation device 100.
8) The system 1000, as claimed in claim 1, wherein the computing module 250 is configured to utilize a combination of tracking filters selected from a group comprising Extended Kalman Filter (EKF), Kalman filter (KF), Particle filter or Bayesian filter and neural network model to predict position, target velocity, target trajectory and direction estimates of the selected target 50.
9) The system 1000, as claimed in claim 1, wherein the computing module is configured to compute 6dof pose correction data in real time consisting of translation and orientation offset between the TDR 200 and the HMD 500.
10) The system 1000, as claimed in claim 9, wherein the computing module 250 is configured to estimate translation position of the HMD 500 with respect to the TDR 200 from readings obtained from global positioning system (GPS), inertial measurement units (IMUs) and real time kinematics (RTK) GPS.
11) The system 1000, as claimed in claim 10, wherein the computing module is configured to compute orientation offset between HMD 500 and TDR 200 using true north bearing computed from IMU and GPS readings.
12) The system 1000, as claimed in claim 1, wherein the processing module 560 is configured to:
calculate and estimate the 6 dof pose of the HMD 500;
receive the trajectory data and correction data in the TDR’s 200 frame of reference from the computing module 250;
perform coordinate transformation of the tactical data utilizing the 6dof correction data received from the computing module 250, the HMD’s estimated 6 dof pose and the trajectory data for parallax correction, transforming the target trajectory from the TDR’s frame of reference to that of the HMD’s 500 frame of reference.
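The frame transformation recited in claim 12 is, at its core, a rigid-body change of coordinates; the sketch below assumes the 6dof correction has been expressed as a 3x3 rotation matrix and a translation vector, which is an editorial choice of representation.

    # Map a target point from the TDR frame into the HMD frame (claim 12).
    import numpy as np

    def tdr_to_hmd(p_tdr, R_hmd_tdr, t_hmd_tdr):
        """p_tdr: (3,) point in the TDR frame; R, t: pose of the TDR frame
        expressed in the HMD frame."""
        return R_hmd_tdr @ np.asarray(p_tdr) + t_hmd_tdr

    # A trajectory transforms point by point:
    # traj_hmd = [tdr_to_hmd(p, R, t) for p in traj_tdr]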
13) The system 1000, as claimed in claim 12, further comprising:
a GNSS RTK base station hosted at the end of the computing module 250 and configured to transmit absolute position thereof and 6dof pose partial correction data associated with a GNSS RTK rover to the processing module 560; and
the GNSS RTK rover hosted at the end of the processing module 560 and configured to compute the position of the TDR 200 with respect to the HMD 500 for parallax correction based on the received absolute position of the base station, the 6dof pose partial correction data of the rover and GPS readings of the rover.
14) The system 1000, as claimed in claim 1, wherein the HMD 500 is configured to detect and track the weapon system 600 to compute the 6dof pose of the weapon system 600 and view the virtual reticle emanated from the centre of said weapon system 600, wherein the viewing of the emanated virtual reticle serves as a virtual aid in manoeuvring and target aiming of the weapon system 600.
15) The system 1000, as claimed in claim 14, further comprising:
one or more sensors selected from an inertial measurement unit (IMU) and proximity sensors placed at the HMD 500 and on the weapon system 600; or
one or more cameras of the HMD 500 to perform visual 3-dimensional tracking;
or a combination thereof to track the 6dof pose of the weapon system 600 in real time and align the frame of reference of the weapon system 600 with that of the frame of reference of the HMD 500.
16) The system 1000, as claimed in claim 1, wherein the HMD 500 is configured to compute a ballistic parametric curve of the emanated virtual reticle based on weapon type, initial thrust, mass of ammunition, maximum range of the weapon system, air drag, gravity and the like, and in accordance therewith manoeuvre the weapon system 600 for target engagement.
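A non-limiting sketch of the ballistic parametric curve in claim 16: forward Euler integration of a point mass under gravity and quadratic air drag. The drag model, time step and the derivation of muzzle velocity from initial thrust are editorial assumptions.

    # Point-mass ballistic curve with quadratic drag (claim 16).
    import numpy as np

    G = np.array([0.0, 0.0, -9.81])  # gravity, m/s^2, z-axis up

    def ballistic_curve(v0, mass, drag_k, max_range, dt=0.01):
        """v0: (3,) initial velocity in m/s; drag_k: drag constant in kg/m."""
        p, v = np.zeros(3), np.asarray(v0, dtype=float)
        points = [p.copy()]
        while p[2] >= 0.0 and np.linalg.norm(p[:2]) <= max_range:
            a = G - (drag_k / mass) * np.linalg.norm(v) * v  # drag opposes motion
            v = v + a * dt
            p = p + v * dt
            points.append(p.copy())
        return np.array(points)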
17) The system 1000, as claimed in claim 1, wherein the HMD 500 is configured to display the virtual target spawned as a hollow blip, the virtual reticle providing a visual directional cue and situational cues for target detection and locking.
18) The system 1000, as claimed in claim 1, wherein the HMD 500 is configured to display the virtual target 50’ overlaid over the real target 50 under all weather conditions.
19) The system 1000, as claimed in claim 1, wherein the HMD 500 further comprises an electrochromic optical element 590 configured to modulate the opacity of the HMD in response to input received from one or more ambient light sensors to enable the HMD to function under all weather conditions.
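One plausible mapping from an ambient-light reading to an opacity command for the electrochromic element 590 of claim 19 is sketched below; the lux thresholds and the linear ramp are editorial assumptions.

    # Ambient-light-driven opacity for the electrochromic element (claim 19).
    def opacity_from_ambient(lux, lux_dark=10.0, lux_bright=10000.0):
        """Returns opacity in [0, 1]: clear in the dark, dark in bright sun."""
        if lux <= lux_dark:
            return 0.0
        if lux >= lux_bright:
            return 1.0
        # Linear ramp between the two thresholds.
        return (lux - lux_dark) / (lux_bright - lux_dark)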
20) A method 2000 for tracking and locking one or more targets 50, characterized in utilizing a wearable mixed reality based head mounted display (HMD) 500, the method 2000 comprising:
obtaining tactical data of the one or more targets 50 from a peripheral target observation device 100 and transmitting the tactical data to a target data receiver (TDR) 200;
receiving and processing, at the TDR 200, the tactical data of the one or more targets 50, and selecting at least one of the one or more targets 50 based on a plurality of predetermined factors;
receiving and processing, at a computing module 250 communicatively coupled with the TDR 200, trajectory data of the selected target 50 and determining correction data from the TDR’s 200 frame of reference to the HMD’s 500 frame of reference for transmission to a processing module 560;
performing, at the processing module 560, coordinate transformation of the tactical data such that the trajectory of the selected target 50 is transformed from the TDR’s 200 frame of reference to that of the HMD’s 500 frame of reference;
rendering, over the wearable mixed reality based HMD 500, a virtual target based on the transformed trajectory data, and overlaying the virtual target 50’ on the selected target 50 in a three dimensional space; and
aligning the frame of reference of a weapon system 600 with that of the HMD’s 500 frame of reference, and manoeuvring the weapon system 600 such that a virtual reticle emanating from the weapon system 600 and viewed from the HMD 500 coincides with the virtual target 50’ for accurate aiming at the selected target 50 so overlaid with the virtual target 50’.
21) The method, as claimed in claim 20, wherein the tactical data is obtained from a reflected wave of an irradiated wave from the one or more targets 50 existing at an irradiation destination.
22) The method, as claimed in claim 20, wherein the tactical data of the one or more targets 50 received by the target data receiver (TDR) 200 is in a spherical coordinate system in the peripheral object observation device’s 100 frame of reference.
23) The method, as claimed in claim 20, wherein the one or more targets 50 are selected based on the plurality of predetermined factors comprising priority and estimation (PSQR) of target, target type, speed and distance of the target, lethality levels and the like.
24) The method, as claimed in claim 20, wherein processing of the tactical data further comprises decoding and extracting the tactical data to obtain unique identifier information of the one or more targets 50 assigned thereto by the peripheral object observation device 100, radial distance and azimuthal angle of the target 50 from the peripheral object observation device 100, perpendicular height of the target 50 from the ground plane or base, heading angle of the target 50, speed of the target 50, IFF (identification of friend or foe) code of the target 50, WCO (weapon control orders) code of the target determined or computed by the peripheral object observation device 100, advanced target position, orientation and relative velocity of the target 50.
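The decoded fields enumerated in claim 24 map naturally onto a record type; the following dataclass is an editorial sketch whose field names merely mirror the claim text.

    # One decoded tactical-data record (claim 24); field names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class TacticalRecord:
        target_id: str            # unique identifier assigned by device 100
        radial_distance_m: float  # from the peripheral observation device
        azimuth_deg: float
        height_m: float           # perpendicular height above ground plane/base
        heading_deg: float
        speed_mps: float
        iff_code: int             # identification of friend or foe
        wco_code: int             # weapon control orders
        rel_velocity_mps: float   # relative velocity of the target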
25) The method, as claimed in claim 20, wherein receiving and processing of the trajectory data of the selected target 50 is performed in real time for interpolating the target trajectory path and predicting the future path, wherein the interpolation and prediction of the target trajectory is based on historical data of the track traced by the selected target 50 for a predetermined time period from the instance of its detection by the peripheral target observation device 100.
26) The method, as claimed in claim 20, wherein a combination of tracking filters is selected from a group comprising Extended Kalman Filter (EKF), Kalman Filter (KF), Particle filter or Bayesian filter and a neural network model to predict position, target velocity, target trajectory and direction estimates of the selected target 50.
27) The method, as claimed in claim 20, wherein determination of the 6dof pose correction data in real time comprises computing the translation and orientation offset between the TDR 200 and the HMD 500.
28) The method, as claimed in claim 20, wherein the translation position of the HMD 500 with respect to the TDR 200 is estimated from readings obtained from a global positioning system (GPS), inertial measurement units (IMUs) and real time kinematics (RTK) GPS.
29) The method, as claimed in claim 20, further comprising:
calculating and estimating 6 dof pose of the HMD 500;
receiving the trajectory data and correction data in TDR’s 200 frame of reference from the computing module 250;
performing coordinate transformation of the tactical data, at the processing module 560, by utilizing the 6dof pose correction data received from the computing module 250, the HMD’s estimated 6 dof pose and the trajectory data for parallax correction and transforming the target trajectory from the TDR’s frame of reference to that of the HMD’s 500 frame of reference.
30) The method, as claimed in claim 20, further comprising:
transmitting absolute position and 6dof pose partial correction data associated with a GNSS RTK rover from a GNSS RTK base station hosted at the end of the computing module 250 to the processing module 560; and
computing the position of the TDR 200 with respect to the HMD 500 for parallax correction at the GNSS RTK rover hosted at the end of the processing module 560, based on the received absolute position of the base station, the 6dof pose partial correction data of the rover and GPS readings of the rover.
31) The method, as claimed in claim 20, wherein aligning the frame of reference of the weapon system 600 with that of the HMD’s 500 frame of reference comprises computing the 6dof pose of the weapon system 600 and viewing the emanated virtual reticle from the centre of said weapon system 600, wherein the viewing of the emanated virtual reticle serves as a virtual aid in manoeuvring and target aiming of the weapon system 600.
32) The method, as claimed in claim 20, wherein a ballistic parametric curve of the emanated virtual reticle is computed based on weapon type, initial thrust, mass of ammunition, maximum range of the weapon system, air drag, gravity and the like, and the weapon system 600 is manoeuvred in accordance therewith for target engagement.
33) The method, as claimed in claim 20, wherein the virtual target is displayed as a hollow blip, and wherein the virtual reticle provides a visual directional cue and situational cues for target detection and locking.
34) The method, as claimed in claim 20, wherein the virtual target 50’, overlaid over the real target 50, is displayed over the HMD 500 under all weather conditions.
35) The method, as claimed in claim 20, further comprising modulating the opacity of the HMD 500 by an electrochromic optical element 590 in response to input received from one or more ambient light sensors to enable the HMD to function under all weather conditions.

Documents

Application Documents

# Name Date
1 202121025002-STATEMENT OF UNDERTAKING (FORM 3) [04-06-2021(online)].pdf 2021-06-04
2 202121025002-PROVISIONAL SPECIFICATION [04-06-2021(online)].pdf 2021-06-04
3 202121025002-FORM FOR STARTUP [04-06-2021(online)].pdf 2021-06-04
4 202121025002-FORM FOR SMALL ENTITY(FORM-28) [04-06-2021(online)].pdf 2021-06-04
5 202121025002-FORM 1 [04-06-2021(online)].pdf 2021-06-04
6 202121025002-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [04-06-2021(online)].pdf 2021-06-04
7 202121025002-EVIDENCE FOR REGISTRATION UNDER SSI [04-06-2021(online)].pdf 2021-06-04
8 202121025002-DECLARATION OF INVENTORSHIP (FORM 5) [04-06-2021(online)].pdf 2021-06-04
9 202121025002-ENDORSEMENT BY INVENTORS [18-04-2022(online)].pdf 2022-04-18
10 202121025002-DRAWING [19-04-2022(online)].pdf 2022-04-19
11 202121025002-CORRESPONDENCE-OTHERS [19-04-2022(online)].pdf 2022-04-19
12 202121025002-COMPLETE SPECIFICATION [19-04-2022(online)].pdf 2022-04-19
13 202121025002-FORM FOR STARTUP [21-04-2022(online)].pdf 2022-04-21
14 202121025002-FORM 3 [21-04-2022(online)].pdf 2022-04-21
15 202121025002-RELEVANT DOCUMENTS [18-05-2022(online)].pdf 2022-05-18
16 202121025002-FORM 13 [18-05-2022(online)].pdf 2022-05-18
17 202121025002-Form 1 (Submitted on date of filing) [24-05-2022(online)].pdf 2022-05-24
18 202121025002-Covering Letter [24-05-2022(online)].pdf 2022-05-24
19 202121025002-CERTIFIED COPIES TRANSMISSION TO IB [24-05-2022(online)].pdf 2022-05-24
20 Abstract1.jpg 2022-05-30
21 202121025002-Defence-09-11-2022.pdf 2022-11-09
22 202121025002-DEFENCE REPLY-28-04-2023.pdf 2023-04-28
23 202121025002-FORM 13 [17-05-2024(online)].pdf 2024-05-17
24 202121025002-STARTUP [20-09-2024(online)].pdf 2024-09-20
25 202121025002-FORM28 [20-09-2024(online)].pdf 2024-09-20
26 202121025002-FORM 18A [20-09-2024(online)].pdf 2024-09-20
27 202121025002-FER.pdf 2024-11-05
28 202121025002-FORM 13 [18-04-2025(online)].pdf 2025-04-18
29 202121025002-FER_SER_REPLY [18-04-2025(online)].pdf 2025-04-18
30 202121025002-CORRESPONDENCE [18-04-2025(online)].pdf 2025-04-18
31 202121025002-US(14)-HearingNotice-(HearingDate-28-07-2025).pdf 2025-06-26
32 202121025002-US(14)-ExtendedHearingNotice-(HearingDate-11-08-2025)-1100.pdf 2025-07-28
33 202121025002-Correspondence to notify the Controller [28-07-2025(online)].pdf 2025-07-28
34 202121025002-Annexure [28-07-2025(online)].pdf 2025-07-28
35 202121025002-FORM 13 [29-07-2025(online)].pdf 2025-07-29
36 202121025002-FORM-26 [05-08-2025(online)].pdf 2025-08-05
37 202121025002-Written submissions and relevant documents [14-08-2025(online)].pdf 2025-08-14
38 202121025002-Annexure [14-08-2025(online)].pdf 2025-08-14
39 202121025002-Response to office action [13-10-2025(online)].pdf 2025-10-13

Search Strategy

1 SearchHistoryE_01-11-2024.pdf