Abstract: A system and method for providing a virtual sight as an aiming aid to engage targets using a head mounted device (HMD) is disclosed. Accordingly, a target observation device transmits tactical data to a target data receiver (TDR). The TDR processes the tactical data and extrapolates the trajectory of the selected target. A thermal camera is provisioned at the weapon end to enhance target detection with high accuracy. The weapon system is configured with a modular tracker that provides the 6DoF pose of the weapon with respect to the HMD. Further, the gunner manoeuvring the weapon wears the HMD and stands on a platform embedded with a cluster of infrared markers. This helps in computing the 6DoF pose of the HMD and the weapon in the global reference frame, thereby aligning the virtual target blip with the real target over the HMD for accurate aiming and firing at the intended target.
FIELD OF THE INVENTION
Embodiments of the present invention relate to real-time tracking and engagement of aerial targets, and more particularly to a system and a method for engaging and locking targets using a head mounted device, with fused inputs from a peripheral observation device and electro-optical/infrared sensors for accurate weapon alignment under all weather conditions.
BACKGROUND OF THE INVENTION
Strong armed forces are one of the important aspects of the development and growth of any country. The armed forces participate in peacekeeping operations, military exercises and humanitarian relief missions. They also carry out more traditional tasks such as securing the borders. Weapons are used for the protection of civilians by law enforcement officers and members of the Army, Navy, Air Force, and Marines. In order to use a weapon effectively, a person must be able to accurately aim at a target. To accurately engage targets, the strike of a bullet must coincide with the aiming point (Point of Aim/Point of Impact) on the target. Over the years, various techniques and devices have been developed to help a person accurately aim a weapon.
One common approach is to mount a sight on the weapon to view the intended target. The sight must be zeroed before being used on the weapon. Zeroing a firearm, such as a rifle, is the process of aligning and calibrating the optical sight with the weapon so the user can accurately aim at a target from a set distance. It is based on the rationale that any individual differences in sight alignment will be eliminated by correcting the sight for the individual firing the weapon. Precisely, the user must calibrate the optical sight to the weapon to eliminate all eccentricities and ensure the user will hit the intended target. Typically, this is accomplished by adjusting the sights on the weapon such that when a round is fired from the weapon, it hits the aiming point within the margin of error of the weapon. This zeroing process is one of the most critical elements of accurate target engagement. In order to aim accurately at a target, it is imperative for the sighting mechanism to be properly installed and adjusted on the gun. Incorrect sight alignment leads to inaccurate firing that may eventually have negative training impacts besides incurring huge time, cost and ammunition loss.
One of the existing challenges in sighting in or zeroing a firearm is parallax correction in the sighting system. The goal is to reduce the parallax and adjust the sights so the projectile (e.g. bullet or shell) may be placed at a predictable impact position within the sight picture. The principle is to shift the line of aim so that it intersects the parabolic projectile trajectory at a designated point of reference, known as a zero, so the gun will repeatably hit where it aims at the distance of that "zero" point.
Conventional methods of zeroing a firearm are often expensive and time consuming, where multiple rounds of live ammunition may be fired to achieve the desired accuracy, besides requiring detailed instruction from skilled professionals. This is time and cost intensive training, especially when a large group of armed personnel or defence forces needs to be trained. Additionally, there is always the burden of employing, at all times, a skilled professional adept at handling such sophisticated weapon systems. More importantly, the real challenge persists in situations of unfavourable weather conditions where visibility of an aerial target is impaired for various reasons: fog, smog, cloud, rain, snow, etc. Locating and engaging threatening targets under these disadvantageous weather conditions is critical and plays an instrumental role in mission success.
Long-range radar systems provide early detection of aerial targets but are prone to positional error due to measurement noise, clutter, multipath, and mechanical drift. Visual/infrared tracking offers high angular accuracy but limited range. For accurate weapon aiming, the absolute 6-DoF (degrees-of-freedom) pose of both gunner and weapon in a globally-referenced frame is essential. Without precise synchronization and parallax correction between disparate sensor frames, the weapon may not be able to achieve bang on accuracy while aiming at the target.
In the background of the foregoing limitations, there exists a need in the art for a system and method that can effectively align the weapon system with the line of sight of the operator and engage targets with acute precision in real time under all weather conditions. Preferably, some kind of virtual aid in the form of cues for locating and actually engaging the intended aerial target will be extremely vital for operators in dynamic situations like that of a battlefield. Therefore, an effective and viable system and method for displaying situational and directional pointers for target sighting, tracking, locking and engagement is desired that compensates for the limitations witnessed in the present state of the art and does not suffer from the above-mentioned deficiencies.
OBJECT OF THE INVENTION
An object of the present invention is to provide a virtual sight as aiming and firing aid while using a weaponry system that can assist in engaging targets in real time under all weather conditions.
Another object of the present invention is to provide a head mounted display that assists the operator by providing a virtual sight along with situational and directional cues for aiming and locking the intended aerial target with utmost accuracy.
Yet another object of the present invention is to provide a system that provides virtual aid for finding the aerial target and direct aid for actual engagement with the target by displaying virtual lasers and virtual crosshair pointers for target sighting, tracking, locking and engagement.
Yet another object of the present invention is to reduce aiming errors and provide bang on striking of aerial targets with enhanced visual and audio aid provided to the operator while using the weaponry system to aim and strike at the target.
In another object of the present invention, a significant reduction in training time and ammunition cost is achieved, as the operator of the weaponry system is provisioned with visual/audio cues for direct locking and striking of the target.
In yet another object of the present invention, the mixed reality based head mounted display facilitates depiction of the entire surveillance picture along with enhanced analytics to the operator for right selection and neutralization of the probable aerial targets.
In still another object, the advanced system and method enables accurate locating, tracking and engagement of the target with a combination of an advanced radar system, a head mounted display, thermal sensors and a modular tracker configured weaponry system.
In one other object of the present disclosure, an effective and sturdy solution is presented that assists in direct firing at the sighted target with utmost precision by way of advanced cueing displayed on the operator's head mounted display along with a self-directed weaponry system.
SUMMARY OF THE INVENTION
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Accordingly, in accordance with a first aspect of the present disclosure, a system for tracking and locking at least one aerial target, characterized in using a head mounted display (HMD) to provide a virtual cue, is disclosed. The system herein comprises of a target observation device (TOD) configured to compute initial coordinates of the at least one aerial target. The system further comprises a target data receiver (TDR) configured to receive and process the initial coordinates to obtain timestamped tactical data of the aerial target. A thermal camera mounted over a stabilized gimbal is provided at a distance from the TDR to perform cue-to-slew commands and predict the future position of the aerial target such that the aerial target is centered in the frame of the thermal camera and the gimbal is rotated towards the predicted aerial target position in real-time.
Now, the head mounted display (HMD) is worn by a gunner standing on a platform, wherein the platform is embedded with an array of infrared markers to measure the 6DoF position of the HMD and of a weapon operated by the gunner, and wherein the local frame of the platform is aligned with a global frame. Finally, the weapon is configured with an infrared tracker with the 6DoF position of the weapon aligned with the HMD in the global reference frame. Importantly, this enables the virtual cue of the aerial target to be rendered on the HMD in real-time for direct aiming at the aerial target by the weapon.
In another aspect of the disclosure, a method for tracking and locking at least one aerial target, characterized in using a head mounted display (HMD) to provide a virtual cue is disclosed. The method herein comprises steps of: computing initial coordinates of the at least one aerial target using a target observation device (TOD); processing the initial coordinates to obtain timestamped tactical data of the aerial target at a target data receiver (TDR); performing cue-to-slew commands at a thermal camera mounted over a stabilized gimbal, predicting future position of the aerial target such that the aerial target is centered in frame of the thermal camera and rotating the gimbal towards the predicted aerial target position in real-time. This is followed by measuring 6DoF position of the HMD worn by a gunner and a weapon by an array of infrared markers that are embedded in a platform, wherein local frame of the platform is aligned with a global frame. Now, 6DoF position of the weapon aligned with the HMD in the global reference frame is measured, and finally the virtual cue of the aerial target is rendered over the HMD in the real-time for direct aiming of the aerial target by the weapon.
BRIEF DESCRIPTION OF THE DRAWINGS
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
These and other features, benefits and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein:
Fig. 1 illustrates a block diagram of the system providing virtual sight as aiming aid to a weapon using head mounted device, in accordance with one exemplary embodiment of the present invention.
DETAILED DESCRIPTION
The following description is provided to enable a person skilled in the art to make and use the invention. Various modifications to the embodiments shown will be readily apparent to those skilled in the art, and the generic principles described herein may be applied to other embodiments and applications without departing from the scope of the present invention.
While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments of the drawing or drawings described, which are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims. As used throughout this description, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Further, the words "a" or "an" mean "at least one" and the word "plurality" means "one or more" unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely used for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps. Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles, and the like is included in the specification solely for the purpose of providing a context for the present invention. It is not suggested or represented that any or all of these matters formed part of the prior art base or were common general knowledge in the field relevant to the present invention.
In this disclosure, whenever a composition or an element or a group of elements is preceded with the transitional phrase "comprising", it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements and vice versa.
The present invention is described hereinafter by various embodiments with reference to the accompanying drawings, wherein reference numerals used in the accompanying drawing correspond to the like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary and are not intended to limit the scope of the invention.
In accordance with one general embodiment of the present disclosure, a system and method for tracking and locking one or more aerial targets is disclosed, characterized in utilizing a wearable mixed reality-based head mounted display (HMD) that provides a virtual sight as an aiming aid to a weaponry system. In one of the exemplary embodiments of the present disclosure, a system and method that provides a virtual sight as an aiming aid to a weaponry system is proposed that employs an external, smart wearable head mounted display to enable the operator of the weaponry system or wearer of the HMD to achieve the intended purpose of accurate shot placement. Accordingly, a mixed reality-based HMD is built with capabilities of providing virtual aid in the form of vital cues and enabling aerial target engagement and bang on firing with acute exactness.
Here, bang on firing of the target refers to the precision and accuracy of the proposed weaponry system, where it successfully hits or locks on to the aerial target at the precise moment intended, ensuring exact accuracy in engaging the target and achieving a strike on point. For achieving such a bang on effect, acquiring the timing and spatial location of the target is of paramount importance. The disclosed system operates as a multi-sensor, cue-and-track engagement mechanism for aerial targets, combining long-range target observation device (TOD) cueing with weapon-end passive thermal imaging to achieve high-accuracy 3D target localization and engagement, even in degraded visual environments. While the long-range TOD provides the initial detection and "cueing" of potential aerial targets, supplying location data and tracking objects over large distances, a weapon-end passive thermal imaging sensor takes over for closer accurate tracking.
Referring now to Fig. 1, a block diagram of proposed system 1000 is presented, wherein the system 1000 comprises of a peripheral target observation device (TOD) 100, at least one aerial target (to be aimed at) 50, a target data receiver 125, a thermal camera 200 fixedly positioned on a gimbal stand 250, a smart mixed reality-based head mounted display (HMD) 500 worn by the gunner standing on a platform 600 embedded with an array of infrared markers 650 and a weapon system 700 configured with an infrared tracker 750 that is operable by the gunner who is aiming the weapon 700 towards the stationary or moving aerial target 50 for final accurate shot.
In accordance with one exemplary embodiment, the surveillance grade external peripheral target observation device, such as a radar/lidar 100, computes the initial target coordinates in a georeferenced frame along with bearing (indicative of direction to the aerial target from the radar site) and elevation (used to calculate target altitude when combined with distance and range). Now, the initial coordinates are received from the TOD 100 by the Target Data Receiver (TDR) 125. At the TDR 125, the initial coordinates are processed along with range (R), azimuth (Az), elevation (El), and track IDs of the aerial target 50 to obtain timestamped tactical data. In one exemplary embodiment, tactical data is timestamped at source using a GPS Pulse-Per-Second (PPS) synchronized clock via Precision Time Protocol (PTP). While the PPS gives a very accurate pulse aligned to GPS time at the TDR 125, PTP distributes a higher precision clock over the network such that all devices/sensors can share a common global time-base with sub-microsecond to sub-millisecond accuracy.
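By way of illustration only, a timestamped track record at the TDR 125 may be organized as in the minimal Python sketch below; the field names, types and units are assumptions made for readability, not the actual TDR data format.

```python
from dataclasses import dataclass

@dataclass
class TacticalTrack:
    """Hypothetical timestamped tactical track record held at the TDR
    after PPS/PTP synchronization (illustrative layout only)."""
    track_id: int        # unique track ID assigned by the TOD
    timestamp_s: float   # common PTP time-base, seconds
    range_m: float       # slant range R from the TOD
    azimuth_rad: float   # Az, measured from the TOD's local north
    elevation_rad: float # El, above the local horizontal
```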
The external peripheral TOD 100 typically produces measurement noise characterized by random fluctuations from electronic components; it is susceptible to jamming and has limited low-level detection capability, which makes it difficult to detect low-flying targets, especially in cluttered environments such as mountains or urban areas, and it is weather sensitive. Further, issues related to cost and complexity, multiplicative noise such as speckle in imaging radar, and non-Gaussian artifacts due to environmental factors or signal processing (including returns and interference from background terrain, atmospheric conditions, or other external sources) require cleaning of the contaminated output. Accordingly, a computing module 150 is provided at the TDR 125 that applies particle filtering, which is particularly useful for tracking in radar when noise is non-linear or non-Gaussian, such as in manoeuvring target scenarios or when the signal statistics are unknown.
In one exemplary embodiment, the target trajectory is estimated using a particle filter approach, which extrapolates and propagates target states (position, velocity, acceleration) using a motion model. It achieves this by representing possible future states as a cloud of weighted particles that collectively approximate the probability distribution of the target's potential positions over time. Each particle represents a hypothesized state (e.g., position, velocity) of the target 50, and is propagated forward using a motion model (e.g., constant velocity, manoeuvring). New measurements or readings from the peripheral target observation device 100 are used to update the weights of the particles according to how well each one explains the observed data, focusing the particle cloud around the most likely trajectories.
Upon receiving new radar measurements, the particle filter (PF) updates particle weights based on likelihood functions derived from the radar measurement covariance. If the measurement deviation is low, the PF blends the TOD data with the predicted state and updates it. On the other hand, if the measurement deviation is large, but statistically plausible, it relies on further position readings from the thermal sensor 200, and backpropagates the feedback to the computing module 150 for radar update. By iteratively predicting and updating, the filter provides a probabilistic estimate (mean or mode) of the trajectory and can extrapolate the target's future path based on recent observations and the underlying dynamics.
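A minimal sketch of this predict-weight-resample cycle is given below in Python/NumPy, assuming a constant-velocity motion model and a Gaussian position likelihood; both are simplifications standing in for the manoeuvring models and covariance-derived likelihoods discussed above.

```python
import numpy as np

def pf_step(particles, weights, z, dt, meas_std=25.0, proc_std=5.0):
    """One predict/update/resample cycle of the particle filter.

    particles: (N, 6) hypotheses [x, y, z, vx, vy, vz] (m, m/s)
    z:         (3,) position measurement from the TOD, same frame
    """
    n = len(particles)
    # Predict: propagate each hypothesis with a constant-velocity model
    # plus process noise (stands in for richer manoeuvring models).
    particles[:, :3] += particles[:, 3:] * dt
    particles += np.random.normal(0.0, proc_std, particles.shape)
    # Update: re-weight each hypothesis by how well it explains the new
    # measurement (Gaussian likelihood assumed here).
    d2 = np.sum((particles[:, :3] - z) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2) + 1e-300
    weights /= weights.sum()
    # Resample when the effective sample size collapses, focusing the
    # cloud around the most likely trajectories.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = np.random.choice(n, n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    estimate = np.average(particles[:, :3], axis=0, weights=weights)
    return particles, weights, estimate
```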
In specific embodiments, e.g. military or warfare applications, such unpredictable target displacement by a mile or more may prove disastrous if not effectively monitored and tracked. Hence, continuous tracking and tracing of these targets 50 within a relatively short time frame (e.g. seconds) becomes essential in such war-like scenarios. As the aerial target 50 is searched by a search and surveillance TOD 100, the range of detection is limited to between 90-120 km. It provides a 3D view (range, azimuth and elevation) of the target 50, making it a critical initial tool for tracking aerial threats and their effective engagement.
Thus, the external peripheral target observation device 100 receives a reflected wave of the irradiated radar wave from the aerial target 50. In one preferred embodiment, the peripheral target observation device (radar) 100 repeatedly observes the relative position of an aerial target 50 within an enhanced operational picture and transmits the tactical data to the TDR 125, which further communicates the data to the thermal camera 200 over a wired or wireless connection. In one illustration, the peripheral target observation device 100 provides the coordinates of the aerial target 50 in the spherical coordinate system in the peripheral target observation device's 100 frame of reference.
In one working embodiment, the tactical data receiver (TDR) 125 is a fixed, surveyed station co-located with or networked to the peripheral target observation device 100. It serves as the GNSS/GPS base station in an RTK (Real-Time Kinematic) configuration, providing high-accuracy, globally-referenced coordinates for itself. Since the peripheral target observation device (TOD) 100 and the TDR 125 are spatially separated, they observe the same target 50 from different viewpoints, causing geometric parallax. Here, the target position error is generally given as:
Target Position Error = Baseline × (Target_Range / Sensor_Separation), where
- Baseline = Distance between peripheral target observation device 100 and TDR 125
- Target_Range = Distance from sensors to target 50
- Sensor_Separation = Physical separation between peripheral target observation device 100 and TDR 125.
At the TDR 125, a GNSS antenna is placed at a precisely surveyed location in ENU (East, North, Up) coordinates. The TDR 125 generates RTK corrections and transmits them to all rover GNSS units in the system 1000 over a low-latency link. In one working embodiment, the solution requires the following steps (a compact code sketch follows the steps):
a) Establishing a precise baseline vector, where
Baseline_Vector = TDR_Position_RTK - TOD_Position_RTK
b) Performing target observation device target vector transformation, where
Target_Vector_TOD = [Range × cos(El) × sin(Az),
                    Range × cos(El) × cos(Az),
                    Range × sin(El)]
Where:
- Range = TOD measured slant range
- Az = TOD azimuth (from radar's local north)
- El = TOD elevation angle
c) Performing parallax correction, where
Target_Position_Corrected = TOD_Position_RTK + Target_Vector_TOD + Parallax_Offset; where
Parallax_Offset = f(Baseline_Vector, Target_Vector_TOD, Target_Range)
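The compact sketch below mirrors steps (a)-(c). The parallax-offset function f(...) is deliberately kept abstract (passed in as a callable), since its exact form is system-specific and not fixed by the description above.

```python
import numpy as np

def tod_target_vector(rng, az, el):
    """Step (b): target vector in the TOD's local East-North-Up frame
    from slant range (m), azimuth and elevation (rad)."""
    return np.array([
        rng * np.cos(el) * np.sin(az),   # East
        rng * np.cos(el) * np.cos(az),   # North
        rng * np.sin(el),                # Up
    ])

def corrected_target_position(tod_pos_rtk, tdr_pos_rtk,
                              rng, az, el, parallax_offset_fn):
    """Steps (a)-(c); positions are RTK-derived ENU coordinates."""
    baseline = tdr_pos_rtk - tod_pos_rtk                 # step (a)
    v_tod = tod_target_vector(rng, az, el)               # step (b)
    offset = parallax_offset_fn(baseline, v_tod, rng)    # step (c)
    return tod_pos_rtk + v_tod + offset
```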
In accordance with one example embodiment, at the computing module 150, at least one target 50 is selected from the detected targets based on a plurality of factors, including, but not limited to, priority and estimation (PSQR) of the target, target launch direction, target type, speed and distance of the target, target trajectory, lethality levels and the like. The tactical data is processed and decoded to extract a unique identifier assigned to the selected target 50 by the peripheral target observation device 100, the radial distance and azimuthal angle of the target 50 from the peripheral target observation device 100, the perpendicular height of the target 50 from the ground plane or base, the heading angle of the target 50, the speed of the target 50, the IFF (identification of friend or foe) code of the target 50, the WCO (weapon control orders) code of the target determined/computed by the peripheral target observation device 100, and the advanced target position, orientation and relative velocity of the target 50 approaching the firing zone. In one example embodiment, the tactical data is decoded from a hexadecimal byte string to extract the critical parametric values (as stated above) with respect to the (selected) target 50, as sketched below.
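For illustration, such a decode might look like the following sketch. The byte layout, field order and scale factors are hypothetical; the actual tactical message format is system-specific and not disclosed here.

```python
import struct

def decode_tactical(hex_string: str) -> dict:
    """Decode a hexadecimal tactical byte string (hypothetical layout:
    big-endian track ID, range, azimuth/elevation in centidegrees,
    speed in dm/s, IFF and WCO codes)."""
    raw = bytes.fromhex(hex_string)
    track_id, rng_m, az_cdeg, el_cdeg, spd_dms, iff, wco = \
        struct.unpack(">HIhhHBB", raw[:14])
    return {
        "track_id": track_id,
        "range_m": rng_m,
        "azimuth_deg": az_cdeg / 100.0,   # centidegrees -> degrees
        "elevation_deg": el_cdeg / 100.0,
        "speed_mps": spd_dms / 10.0,      # decimetres/s -> m/s
        "iff_code": iff,
        "wco_code": wco,
    }
```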
In one working embodiment, the tactical data of the target 50, along with other critical data received at the computing module 150, is processed, decoded and wirelessly transmitted to the thermal camera 200, from where it is further transmitted to the smart mixed reality (MR) based head mounted display (HMD) 500, from where the gunner designates the virtual target. In an embodiment, the tracking information with respect to the one or more intended or locked targets 50 is continuously received at the computing module 150 in real time at a frequency of 30-60 Hz with intermittent gaps of approx. 1-3 seconds.
In the next working embodiment, the parallax corrected 3D tactical data is transmitted from the TDR 125 to a thermal camera 200 mounted on a two-axis stabilized gimbal 250 that receives cue-to-slew commands [azimuth (horizontal) / elevation (vertical)] representing the predicted direction of the target 50. The corrected tactical data is received at the thermal camera 200 with covariance and timing. The two-axis stabilization of the gimbal 250 ensures that the camera 200 remains steady and vibration-free during movement and operation, maintaining accurate tracking and imaging even on moving platforms or in turbulent conditions. The below two equations describe how the gimbal's 250 horizontal (azimuth) and vertical (elevation) pointing angles are calculated based on the TOD's solution and corrections or offsets:
Az_gimbal = Az_TOD_to_weapon + ΔAz (1)
El_gimbal = El_TOD_to_weapon + ΔEl (2)
(Here, ΔAz and ΔEl are correction factors or offsets accounting for mechanical misalignments, stabilization errors, or calibration adjustments in the gimbal mechanism). This ensures that even when the target 50 is beyond the thermal camera's 200 initial visual detection range, the camera 200 is already aimed at the predicted location, minimizing search time. As the target 50 approaches and enters the thermal detection envelope (often >10 km for large aerial objects with strong IR signatures), the thermal camera's 200 infrared focal plane array (FPA) begins receiving a resolvable signal. Once locked, the gimbal's 250 high-resolution encoders provide precise line-of-sight (LOS) angles to the target 50.
In one exemplary embodiment, the PTZ (Pan-Tilt-Zoom) camera 200 interpolates its own RTK pose to the TOD timestamp such that they operate synchronously. It then computes its optical centre to determine the exact 3D location of the thermal camera's 200 imaging origin (the point from which its pixels' rays emanate) in the global coordinate frame at a given moment, and refines the target position by correcting for parallax between the TDR's GPS and its own GPS, using ray-ray intersection or point fusion if additional ranging is available.
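An illustrative sketch of that pose interpolation follows, assuming bracketing pose samples are available from the camera's RTK pose buffer; SciPy is used here only for the quaternion slerp.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interp_pose(t_query, t0, p0, q0, t1, p1, q1):
    """Interpolate the camera's RTK pose to the TOD timestamp t_query.

    p0/p1: ENU positions (m) and q0/q1: [x, y, z, w] quaternions of
    the pose samples bracketing t_query (t0 <= t_query <= t1).
    """
    a = (t_query - t0) / (t1 - t0)
    pos = (1.0 - a) * np.asarray(p0) + a * np.asarray(p1)  # linear in position
    slerp = Slerp([t0, t1], Rotation.from_quat([q0, q1]))  # slerp in attitude
    return pos, slerp(t_query).as_quat()
```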
Since the TDR 125 and the thermal camera 200 are often mounted at physically separated locations, and because their optical/radar boresights and GNSS antenna phase centres are not co-located, their extrinsic calibration parameters must be accounted for to ensure accurate target geolocation and cue-to-slew alignment. These extrinsic parameters include: a) lever-arm offset and b) rotation offsets. While the lever-arm offset refers to the 3D translation vector from the reference point of the TDR 125 to the reference point of the thermal camera 200, the rotation offset is the orientation difference between each sensor's local coordinate frame and the platform body frame. Using the lever-arm and rotation offsets, the data is transformed from the TDR's 125 frame of reference into the thermal camera's 200 global reference frame (via the platform's global pose from GNSS/RTK), compensating for the physical displacement between the sensors. This ensures that when cue-to-slew commands are issued to the thermal camera 200, the gimbal 250 points toward the true target location, not an uncorrected, offset estimate.
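As a minimal sketch (the extrinsics are assumed to have been surveyed or calibrated beforehand), applying the lever-arm and rotation offsets reduces to a rigid transform:

```python
import numpy as np

def tdr_to_camera_frame(p_tdr, R_offset, lever_arm):
    """Re-express a point given in the TDR's 125 frame in the thermal
    camera's 200 frame. R_offset is the 3x3 rotation offset between the
    two sensor frames; lever_arm is the 3D translation from the TDR
    reference point to the camera reference point, in the TDR frame."""
    return R_offset @ (np.asarray(p_tdr) - np.asarray(lever_arm))
```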
From the moment the TOD 100 detects the target 50 to the moment the PTZ 200 actually moves, there is end-to-end latency, and if the gimbal 250 operates using stale target coordinates, a fast-moving aerial target 50 will no longer be at that position when the gimbal 250 finishes slewing. In the next noteworthy embodiment, future prediction is performed, which can help in firing the weapon at a future position of the target 50 for high firing accuracy. It is well understood that an aerial target 50 may move at high speed and change direction, and it is important that projectile flight time is computed for accurate firing. Hence, the refined 3D target position is predicted forward to the gimbal actuation time to compensate for latency. Now, the predicted 3D position is converted into desired pan/tilt angles in the PTZ's 200 local frame, which is then used to drive the gimbal 250, with feedforward and closed-loop image corrections to the predicted pan/tilt angles, until the target 50 is centered. Once the PTZ's 200 onboard thermal image processing detects the target 50 in the frame, it applies fine motor adjustments to keep the target centered, compensating for small prediction errors, wind-up, or target manoeuvres.
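The latency compensation and angle conversion can be sketched as below, assuming constant-velocity forward prediction (the filter's full motion model could be substituted) and a camera whose pan angle is measured from local north:

```python
import numpy as np

def cue_to_slew(target_pos, target_vel, latency_s, cam_pos):
    """Predict the target forward over the measured end-to-end latency,
    then convert the predicted ENU position into pan/tilt angles in the
    PTZ's local frame."""
    p = np.asarray(target_pos) + np.asarray(target_vel) * latency_s
    d = p - np.asarray(cam_pos)                  # line of sight in ENU
    pan = np.arctan2(d[0], d[1])                 # azimuth from north
    tilt = np.arctan2(d[2], np.hypot(d[0], d[1]))
    return pan, tilt
```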
Thus, upon cueing, the gimbal 250 slews rapidly and dynamically toward the predicted target direction and locks to keep the target 50 centered in the image to compensate for both platform motion and target manoeuvring. High-speed, precision motors in the gimbal 250 minimize response time and overshoot, quickly centering the field of view close to the predicted location. Once in the general vicinity, the thermal camera 200 starts acquiring imagery, using any of the known object tracking algorithms (including AI and vision-based methods) to find and centre the target 50 in real time. In accordance with one working embodiment, the thermal camera 200 searches for thermal signature of the target 50, matching it against predicted kinematic properties (velocity, heading). This enables a “lock-on” feature, where the gimbal 250 maintains constant framing of the target 50, even as it moves unpredictably or rapidly.
In one exemplary embodiment, a co-boresighted laser rangefinder (LRF) 275, which is optically aligned with the thermal camera 200, provides instantaneous range measurements directly to the centre of the camera image. Accurate range determination enables real-time target geo-location (latitude, longitude, altitude) and supports advanced fire control, tracking, or guidance functions. The LRF 275 can typically measure over distances of 1–3 km or beyond, depending on the model and target reflectivity. Range measurement is synchronized with video, facilitating precise distance tagging in captured imagery and aiding in multi-sensor data fusion. Advantageously, the thermal cameras 200 detect infrared radiation (heat) emitted by target 50, which allows them to work in complete darkness or adverse weather conditions, thus compensating for limitations of search radar/TOD 100 towards changing weather conditions.
Once the thermal camera's 200 GPS-enabled gimbal 250 completes its parallax correction with respect to the TDR 125, a refined 3D target position is obtained in the global world frame (e.g., ENU or NED coordinates). This georeferenced target position is then transmitted to the gunner's end. However, the gunner does not see the world in global coordinates; he sees from his own eyes, i.e., from the HMD's 500 local frame of reference. So, in order to make the target 50 appear in the right place in the HMD 500, a transformation is performed. At the gunner's location, the target 50 must be rendered from his specific vantage point with respect to the gunner's local frame, which requires an indirect parallax correction, i.e., transforming the global target coordinates into the gunner's local frame of reference using his own GPS-derived position and orientation that is corrected by the east-aligned platform (as discussed later). This is called indirect parallax correction because the global target location is taken and "re-projected" into the gunner's personal view frame using his own GPS and orientation.
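Reduced to a sketch, the indirect parallax correction is a single rigid transform; here R_world_to_hmd (the gunner's east-alignment-corrected orientation as a 3x3 rotation matrix) and hmd_pos_enu (his GPS-derived position) are assumed inputs named for this illustration.

```python
import numpy as np

def world_to_hmd(target_enu, hmd_pos_enu, R_world_to_hmd):
    """Re-express the refined global target position in the gunner's
    HMD frame; the virtual blip is then rendered along this vector."""
    return R_world_to_hmd @ (np.asarray(target_enu) - np.asarray(hmd_pos_enu))
```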
In accordance with the next practical embodiment, the system 1000 offers a solution for obtaining accurate weapon position and orientation detection with reference to the HMD 500, as contemporary solutions for relative tracking between the weapon 700 and the HMD 500 often fail in outdoor combat environments. As will be generally agreed, the targeting overlay in the HMD 500 must perfectly align with the actual trajectory of the weapon barrel or optical sight. This requires knowing the weapon's 700 6DoF pose relative to the HMD 500 with very high accuracy; errors as small as 1-2 milliradians (approx. 10-20 cm at 100 m) can cause significant aiming deviations. The weapon 700 contains ferromagnetic parts (steel barrels, mounts etc.), which can cause local magnetic field distortions, thereby leading to significant yaw heading drift when using magnetometers in IMU-based trackers and corrupting IMU readings. Thus, inertial orientation estimation without magnetic correction suffers from accumulated drift, making it unusable for long-term aiming.
In order to achieve improved accuracy in target 50 engagement and bang on demolition, two important aspects need to be considered:
a) Target 50 absolute position detection
b) 6DoF Weapon 700 and HMD 500 Position and Orientation detection with reference to the global frame
To precisely establish the gunner’s 6DoF pose (position + orientation) and to track weapon 700 movement in real time, the system 1000 employs an infrared (IR) optical tracking system. Accordingly, in next working embodiment, the gunner stands on a fixed platform 600 embedded with precisely surveyed infrared (IR) markers 650 arranged in a known geometric pattern. This marker array 650 acts as a local tracking reference for optical tracking cameras or sensors, ensuring high-fidelity 6DoF measurements. In one exemplary embodiment, the ground platform 600 is aligned with the east direction (true east) so that its local frame can be easily mapped to the global ENU frame.
Precisely, the ground platform 600 on which the gunner stands is not just a passive surface — it is part of the local coordinate system used for optical tracking of the HMD 500 and weapon-mounted IR markers 750. In order for the optical tracking data to be directly usable in the global world frame (e.g., ENU — East, North, Up), the local coordinate axes of the platform 600 must be precisely aligned with the global axes. Basically, the IR tracking system 650 measures positions of the HMD 500 and weapon 700 in the platform’s 600 local frame. The global ENU frame is defined as:
X-axis → East
Y-axis → North
Z-axis → Up (aligned with the local gravity vector)
So, if the platform’s 600 local X-axis is not aligned with true east, any measurement from the optical tracker will need a rotation correction before it can be compared with or fused into global coordinates. Thus, the platform 600 is physically oriented during installation so that its local X-axis points toward true east. Once aligned, the platform’s 600 local frame is parallel to the global ENU frame, which simplifies the mapping of all HMD 500 and weapon IR tracker 750 positions into the global world frame and ensures seamless integration with other geo-referenced sensor data.
Further, the head-mounted display (HMD) 500 worn by the gunner has an IR tracker module 550 that detects and/or is detected by the ground marker array 650. This allows determination of the HMD's 500 precise 6DoF pose within the platform 600 reference frame. In one working embodiment, using the known 3D layout of the IR marker cluster 650 and the 2D pixel coordinates of the observed IR points, a perspective-n-point (PnP) approach may be utilized to recover the 6DoF pose of the HMD 500. Now that the platform's 600 local reference frame is already aligned with the global ENU (east-north-up) frame using GPS/RTK, the computed HMD 500 pose is immediately expressed in world coordinates. This means the gunner's head position and orientation are globally referenced without drift.
In the same vein, the weapon 700 is equipped with its own IR tracker 750, also referenced to the same platform's infrared marker array 650. Since both the HMD 500 and weapon 700 are tracked in a common local coordinate frame, the relative pose between the gunner's sight line and the weapon barrel can be continuously calculated. Thus, using the known 3D coordinates of the platform markers 650 and the 2D detections in the weapon IR tracker's 750 camera, a Perspective-n-Point (PnP) solver computes the weapon's 700 full 6DoF pose relative to the platform 600. That local pose is then transformed to the global ENU world frame using the platform survey (translation) and east-alignment (rotation), as sketched below.
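A sketch of this PnP-and-lift computation, common to the HMD tracker 550 and the weapon tracker 750, is given below using OpenCV's solver; the camera intrinsics (K, dist) and the surveyed marker layout are assumed to be available from prior calibration.

```python
import cv2
import numpy as np

def pose_from_ir_markers(marker_pts_3d, marker_pts_2d, K, dist,
                         R_platform_to_enu, platform_origin_enu):
    """Recover a tracker's 6DoF pose from the surveyed 3D marker layout
    (platform frame) and its 2D detections, then lift it to the global
    ENU frame using the platform survey (translation) and east
    alignment (rotation)."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(marker_pts_3d, np.float32),
        np.asarray(marker_pts_2d, np.float32),
        K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed; need >= 4 well-spread markers")
    R_cam, _ = cv2.Rodrigues(rvec)      # platform -> camera rotation
    R_local = R_cam.T                   # tracker pose in platform frame
    t_local = -R_cam.T @ tvec.ravel()
    # East-aligned platform: R_platform_to_enu is near identity, but it
    # is applied in general form here.
    return (R_platform_to_enu @ R_local,
            R_platform_to_enu @ t_local + platform_origin_enu)
```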
In accordance with one noteworthy embodiment, the local ground reference frame (from the IR marker array 650) is rotated and translated into the global world frame by using the surveyed platform location (from GPS/RTK). The gunner's GPS location provides the translation into global coordinates, which ensures the system 1000 knows where the gunner is standing in global space, allowing the refined 3D target position to be rendered correctly from the gunner's vantage point in the HMD 500. Further, as explained, the east-aligned platform 600 provides the heading reference. Thus, after applying both translation (surveyed GPS) and rotation (east alignment), the HMD 500 and weapon 700 poses are fully expressed in the global ENU frame. Also, the gunner's head orientation and weapon aim vector, originally tracked in the local IR marker frame, are now directly comparable to the global target coordinates, thereby completing the 6DoF transformation into the world frame, and enabling accurate overlay of the virtual target blip in the HMD 500 and precise alignment of the weapon barrel with the georeferenced target.
In the next exemplary embodiment, tactical information such as the 3DoF position of the aerial target 50 is rendered over the mixed reality-based HMD 500 in the form of a hollow blip at all times. Thus, a virtual target is spawned in the HMD 500 along with visual directional cues to identify and locate the spawned target 50 in the airspace. However, in one exemplary embodiment, for the wearer of the HMD 500, who may also be the operator of the weapon system 700, the virtual target viewable through the HMD 500 may have to be aligned for accurate aiming and shooting.
With the gunner’s 6DoF pose and the refined 3D target position both in the same global frame (as obtained above), the system 1000 transforms the target 50 coordinates into the gunner’s HMD 500 frame of reference. This enables accurate rendering of a virtual target blip in the HMD display 500 so that it appears in the correct position relative to the gunner’s view. This configuration ensures that even though the target 50 was initially detected and refined at a remote sensor site, the gunner sees it from his exact vantage point with no perceptible spatial error, and both the weapon 700 and HMD 500 tracking are inherently aligned for precise aiming.
In another significant embodiment, in order to achieve stable and accurate 6DoF positioning of the HMD 500 and weapon 700 in the global reference frame, an Extended Kalman Filter (EKF) is deployed. The EKF fuses multiple asynchronous, noisy sensor measurements into a single, smooth state estimate. The state vector for each entity (gunner and weapon) is modeled to capture both pose and dynamics. This formulation allows the EKF to model both absolute pose and sensor error dynamics. The IMU (in the HMD 500 and weapon 700) gives fast updates about movement and rotation, but drifts over time. The GPS/RTK provides global position fixes, but these can be jumpy and noisy. The IR marker array 650 on the ground platform 600 provides very precise local tracking of the gunner's head and weapon in 6DoF, but only relative to the platform 600. By combining all of these, the EKF smooths out noise, removes GPS jitter, and corrects for IMU drift.
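A deliberately reduced sketch of that fusion loop is shown below: the state carries position and velocity only (a full implementation would also carry attitude and IMU bias states, as noted above), and both GPS/RTK and IR-marker fixes enter through the same position update with different measurement variances.

```python
import numpy as np

class PoseEKF:
    """Minimal position/velocity filter: IMU-driven predict, position
    updates from GPS/RTK (global, noisy) and IR markers (local, precise)."""
    def __init__(self):
        self.x = np.zeros(6)              # [pos(3), vel(3)] in ENU
        self.P = np.eye(6)
        self.Q = np.eye(6) * 0.01         # assumed process noise density

    def predict(self, accel_enu, dt):
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt        # position integrates velocity
        self.x = F @ self.x
        self.x[3:] += accel_enu * dt      # IMU acceleration input
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_position(self, z, meas_var):
        """z: 3D position fix. Metre-level variance for GPS/RTK versus
        millimetre-level for the IR marker solution means the filter
        naturally trusts the IR array locally while staying globally
        anchored by GNSS."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        S = H @ self.P @ H.T + np.eye(3) * meas_var
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(6) - K @ H) @ self.P
```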
The result is a stable, globally aligned 6DoF position and orientation of both the HMD 500 and the weapon 700. This ensures that (a) the virtual blip in the HMD 500 appears exactly where the real target 50 is; and (b) the weapon's 700 aim is correctly aligned with the global target trajectory. In one general embodiment, aligning the target 50 with the weapon 700 is important for effective locking of the target, and to achieve the same, zeroing in of the weapon 700 is important. Zeroing in of the weapon is one of the essential principles underpinning effective locking and involves setting and calibrating the sights to enhance firing accuracy. This zeroing process is a critical element for accurate target engagement. To accurately engage targets, the strike of a bullet must coincide with the aiming point (Point of Aim/Point of Impact) on the target 50. In general, in order to effectively use a weapon 700, it is imperative for the weapon 700 to be properly calibrated and zeroed in with the sight for securing enhanced functional and operational survivability in a dynamic, hostile situation such as a battlefield.
Finally, the HMD 500 combines these data sources, updating the user’s display so the virtual blip (symbol or marker) is rendered at exactly the same position as the real detected target 50—even as the gunner moves or the target 50 shifts in space. This overlay remains locked in place visually, with no drift or lag, giving perfect alignment for observation, engagement, and cueing of weapons or sensors, and bang on overlapping of the aerial target 50 and virtual target blip.
In one final embodiment, a method (2000) for tracking and locking at least one aerial target (50), characterized in using a head mounted display (HMD) (500) to provide a virtual cue is disclosed. The method (2000) comprises steps of: computing initial coordinates of the at least one aerial target (50) using a target observation device (TOD) (100); processing the initial coordinates to obtain timestamped tactical data of the aerial target (50) at a target data receiver (TDR) (125); performing cue-to-slew commands at a thermal camera (200) mounted over a stabilized gimbal (250), predicting future position of the aerial target (50) such that the aerial target (50) is centered in frame of the thermal camera (200) and rotating the gimbal (250) towards the predicted aerial target position (50) in real-time; measuring 6DoF position of the HMD (500) worn by a gunner and a weapon (700) by an array of infrared markers (650) that are embedded in a platform (600), wherein local frame of the platform (600) is aligned with a global frame; measuring 6DoF position of the weapon (700) aligned with the HMD (500) in the global reference frame; and rendering the virtual cue of the aerial target (50) over the HMD (500) in the real-time for direct aiming of the aerial target (50) by the weapon (700).
In one of the concluding remarks, the attempt is to render a complete surveillance picture and the precise position of the aerial target 50 as virtual targets (as explained above) over the HMD 500 for absolute aiming and firing. Hence, the weapon system 700 configured at the gunner's end is manoeuvred such that a virtual reticle emanates from the weapon system 700 and is made viewable via the HMD 500 to the gunner. This virtual reticle is eventually made to coincide with a virtual target rendered over the HMD 500 for accurate aiming and locking of the aerial target 50, in a manner discussed above.
In a preferred embodiment, the present disclosure attempts to provide a solution by way of leveraging the capabilities of the 3D TOD 100 and TDR 125 with a thermal camera 200 and a head mounted display 500, whereby the distantly located aerial target 50 can be relayed as a virtual blip in 3D space over the HMD display 500. As a human can clearly see an aerial target only when it comes within about 7-8 km of the observer's eye, a virtual blip displayed on the HMD 500 at the earliest observation by the 3D TOD 100 gives enough time for preparedness of the strike and directing the weapon head towards the incoming aerial target 50. As discussed above, the weapon barrel is fitted with an IR marker cluster, also calibrated to the global frame, that enables viewing of the virtual weapon axis over the HMD 500 for precise aiming. Thus, even if the target 50 is beyond the visual range of the human eye, then also, based on the virtual target blip and virtual weapon axis provided by the novel configuration of the 3D TOD 100, TDR 125, thermal imaging 200, HMD 500 detected by ground IR markers 650 and the IR marker 750 configured weapon 700, exact aiming can be successfully accomplished.
Further, as will be acknowledged by those operating at field level, the hit rate in such dynamic war-like situations is often suboptimal. However, with the virtual aid rendered over the HMD 500, using the peripheral target observation device 100 to transmit the extrapolated tactical data, the thermal camera 200 to execute the cue-to-slew command, and the cluster of IR markers 650 configured to capture the 6DoF position of the HMD 500 and the weapon end 700, the target's 50 probable position can be predicted a few milliseconds ahead of time, and the virtual blip can be made to overlay with the reticle emanating from the weapon system 700, ensuring directed aiming and hitting of the target 50.
In accordance with an embodiment, a memory unit is configured to store machine-readable instructions. The machine-readable instructions may be loaded into the memory unit from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs and Flash Drives. Alternately, the machine-readable instructions may be loaded in the form of a computer software program into the memory unit. The memory unit in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory. Further, the computing module 150 includes a processor operably connected with the memory unit. In various embodiments, the processor is one of, but not limited to, a general-purpose processor, an application specific integrated circuit (ASIC) and a field-programmable gate array (FPGA).
In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, written in a programming language, such as, for example, Java, C, or assembly. One or more software instructions in the modules may be embedded in firmware, such as an EPROM. It will be appreciated that modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors. The modules described herein may be implemented as either software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
Further, while one or more operations have been described as being performed by or otherwise related to certain modules, devices or entities, the operations may be performed by or otherwise related to any module, device or entity. As such, any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof. It should be understood that the techniques of the present disclosure might be implemented using a variety of technologies. For example, the methods described herein may be implemented by a series of computer executable instructions residing on a suitable computer readable medium. Suitable computer readable media may include volatile (e.g., RAM) and/or non-volatile (e.g., ROM, disk) memory, carrier waves and transmission media. Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
It should also be understood that, unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "controlling" or "obtaining" or "computing" or "storing" or "receiving" or "determining" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that processes and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Various modifications to these embodiments will be apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is anticipated to hold on to all other such alternatives, modifications, and variations that fall within the scope of the present invention.
CLAIMS:
We Claim:
1) A system (1000) for tracking and locking at least one aerial target (50), characterized in using a head mounted display (HMD) (500) to provide a virtual cue, wherein the system (1000) comprises of:
a target observation device (TOD) (100) configured to compute initial coordinates of the at least one aerial target (50);
a target data receiver (TDR) (125) configured to receive and process the initial coordinates to obtain timestamped tactical data of the aerial target (50);
a thermal camera (200) mounted over a stabilized gimbal (250) and configured to perform cue-to-slew commands, predict future position of the aerial target (50) such that the aerial target (50) is centered in frame of the thermal camera (200) and the gimbal (250) is rotated towards the predicted aerial target position (50) in real-time;
the head mounted display (HMD) (500) worn by a gunner standing on a platform (600), wherein the platform (600) is embedded with an array of infrared markers (650) to measure 6DoF position of the HMD (500) and a weapon (700) operated by the gunner, and wherein local frame of the platform (600) is aligned with a global frame;
the weapon (700) configured with an infrared tracker (750) to measure a 6DoF position of the weapon (700) aligned with the HMD (500) in the global reference frame; and
wherein the virtual cue of the aerial target (50) is rendered on the HMD (500) in the real-time for direct aiming of the aerial target (50) by the weapon (700).
2) The system (1000), as claimed in claim 1, wherein the target observation device (TOD) (100) is a radar or lidar that is configured to compute the initial coordinates in a georeferenced frame along with bearing and elevation readings of the aerial target (50).
3) The system (1000), as claimed in claim 1, wherein the TDR (125) is configured to obtain tactical data timestamped at the TOD (100) using GPS Pulse-Per-Second (PPS) synchronized clock via Precision Time Protocol (PTP).
4) The system (1000), as claimed in claim 1, further comprising a computing module (150) configured to eliminate noise in the initial coordinates received from the TOD (100) and extrapolate trajectory of the aerial target (50) using a particle filter approach.
5) The system (1000), as claimed in claim 4, wherein the computing module (150) is configured to update the trajectory of the aerial target (50) based on observations recorded from the thermal camera (200).
6) The system (1000), as claimed in claim 4, wherein the computing module (150) is configured to select the at least one aerial target (50) based on a plurality of factors, including, though not limited to priority and estimation of target, target launch direction, target type, speed and distance of the target, target trajectory, and/or lethality levels.
7) The system (1000), as claimed in claim 1, wherein the TOD (100) and TDR (125) are spatially separated and corrected for geometric parallax in steps of:
establishing a baseline vector from position of the TDR (125) and TOD (100);
performing TOD (100) target vector transformation from range, azimuth and elevation of the TOD (100); and
performing parallax correction using the baseline vector and the target vector transformation.
8) The system (1000), as claimed in claim 1, wherein the stabilized gimbal (250) maintains accurate, stable and vibration free tracking of the aerial target (50) from azimuth and elevation angles of the TOD (100) and correction offsets of the gimbal (250).
9) The system (1000), as claimed in claim 1, wherein the thermal camera (200) is configured to interpolate Real-Time Kinematics (RTK) pose thereof to timestamp of the TOD (100) and compute optical centre to determine the absolute position of the aerial target (50) in the global reference frame.
10) The system (1000), as claimed in claim 1, wherein the TDR (125) and the thermal camera (200) are distantly located and parallax corrected by applying lever-arm offset and rotation offset for accurate performance of the cue-to-slew command.
11) The system (1000), as claimed in claim 1, wherein the thermal camera (200) is optically aligned with a laser rangefinder (LRF) (275), wherein the LRF (275) is configured to provide instantaneous range measurement to enable locating the aerial target (50) in real time.
12) The system (1000), as claimed in claim 1, wherein the platform (600) is aligned with east direction such that local frame of the platform (600) is mapped to the global frame.
13) The system (1000), as claimed in claim 12, wherein the HMD (500) and the weapon (700) are expressed in the global reference frame by applying translation obtained from position of the gunner and rotation from the east aligned platform (600).
14) The system (1000), as claimed in claim 1, wherein the HMD (500) is provided with an IR tracker module (550) and configured to determine 6DoF pose of the HMD (500) from the platform’s (600) reference frame using perspective-n-point (PnP) approach.
15) The system (1000), as claimed in claim 1, wherein the weapon (700) configured with the infrared marker (750) is aligned with the platform’s (600) reference frame and the 6DoF position of the weapon (700) is measured using perspective-n-point (PnP) approach.
16) The system (1000), as claimed in claim 1, wherein the 6DoF positioning of the HMD (500) and the weapon (700) is stabilized using an Extended Kalman Filter (EKF).
17) A method (2000) for tracking and locking at least one aerial target (50), characterized in using a head mounted display (HMD) (500) to provide a virtual cue, wherein the method (2000) comprises of:
computing initial coordinates of the at least one aerial target (50) using a target observation device (TOD) (100);
processing the initial coordinates to obtain timestamped tactical data of the aerial target (50) at a target data receiver (TDR) (125);
performing cue-to-slew commands at a thermal camera (200) mounted over a stabilized gimbal (250), predicting future position of the aerial target (50) such that the aerial target (50) is centered in frame of the thermal camera (200) and rotating the gimbal (250) towards the predicted aerial target position (50) in real-time;
measuring 6DoF position of the HMD (500) worn by a gunner and a weapon (700) by an array of infrared markers (650) that are embedded in a platform (600), wherein local frame of the platform (600) is aligned with a global frame;
measuring 6DoF position of the weapon (700) aligned with the HMD (500) in the global reference frame; and
rendering the virtual cue of the aerial target (50) over the HMD (500) in the real-time for direct aiming of the aerial target (50) by the weapon (700).
18) The method (2000), as claimed in claim 17, wherein the initial coordinates are computed in a georeferenced frame along with bearing and elevation readings of the aerial target (50).
19) The method (2000), as claimed in claim 17, wherein the timestamped tactical data is obtained using GPS Pulse-Per-Second (PPS) synchronized clock via Precision Time Protocol (PTP).
20) The method (2000), as claimed in claim 17, wherein the initial coordinates are processed for noise elimination and for extrapolation of trajectory for the aerial target (50) using a particle filter approach.
21) The method (2000), as claimed in claim 20, wherein the trajectory of the aerial target (50) is updated based on observations recorded from the thermal camera (200).
22) The method (2000), as claimed in claim 20, wherein the at least one aerial target (50) is selected based on a plurality of factors, including, though not limited to priority and estimation of target, target launch direction, target type, speed and distance of the target, target trajectory, and/or lethality levels.
23) The method (2000), as claimed in claim 17, further comprising performing a geometric parallax correction between spatially separated TOD (100) and the TDR (125) in steps of:
establishing a baseline vector from position of the TDR (125) and TOD (100);
performing TOD (100) target vector transformation from range, azimuth and elevation of the TOD (100); and
performing parallax correction using the baseline vector and the target vector transformation.
24) The method (2000), as claimed in claim 17, wherein Real-Time Kinematics (RTK) pose of the thermal camera (200) is interpolated to timestamp that of the TOD (100) and an optical centre of the thermal camera (200) is computed to determine the absolute position of the aerial target (50) in the global reference frame.
25) The method (2000), as claimed in claim 17, further comprising correcting parallax between the TDR (125) and the thermal camera (200) by applying lever-arm offset and rotation offset for accurate performance of the cue-to-slew command.
26) The method (2000), as claimed in claim 17, wherein instantaneous range of the aerial target (50) is measured in real-time using a laser rangefinder (LRF) (275) optically aligned with the thermal camera (200).
27) The method (2000), as claimed in claim 17, further comprising aligning the platform (600) with east direction such that local frame of the platform (600) is mapped to the global frame.
28) The method (2000), as claimed in claim 27, further comprising expressing the HMD (500) and the weapon (700) in the global reference frame by applying translation obtained from position of the gunner and rotation from the east aligned platform (600).
29) The method (2000), as claimed in claim 17, wherein the 6DoF pose of the HMD (500) is measured from the platform’s (600) reference frame using an IR tracker module (550) and employing perspective-n-point (PnP) approach.
30) The method (2000), as claimed in claim 17, wherein the 6DoF pose of the weapon (700) is measured from the platform's (600) reference frame using an infrared marker (750) and employing perspective-n-point (PnP) approach.