Abstract: The present invention relates to a cost-effective method of ground based object acquisition and tracking by a computer module from an aircraft using an on-board inertial navigation sensor (INS), for mission control or reconnaissance operations in the absence or failure of primary tracking systems. The present invention provides an effective solution for identification of a ground based object by an acquisition module and automatic tracking of the object by a track module using inertial sensor data. The method involves display of computer generated symbols on the cockpit display system known as the Head Up Display (HUD). The user designates the ground based static or mobile object by slewing the HUD symbol over the visual target using the control stick. The user designates the object twice at a certain time interval to acquire the angular positions of the object. The acquired positions are used to compute the range components of the object. The acquisition method uses the object range components and the timer data to compute the speed of the target. The timer data provides the time information during which the pilot slews and designates the object. The tracking system uses the object speed components from the acquisition phase together with the inertial speed components, control stick data, roll, pitch and heading to compute the azimuth and elevation of the object. The azimuth and elevation angles are used to display the object on the HUD.
1. Title of the invention
System and Method for In-flight Tracking of a Ground Based Object from an Aircraft.
2. Field of the Invention
The present invention relates to aircraft navigation and, in particular, to in-flight tracking of a ground based object using an onboard computer module and an inertial sensor.
3. Prior art and Drawbacks of prior art
On an aircraft, there is a need to identify ground objects and track them to perform various mission control and navigation operations. The ground objects can be static or mobile. Ground based objects are tracked using various onboard sensors and tracking systems such as laser pods, radars or optic trackers. These sensors are expensive and their integration is a complex activity; moreover, in case of failure of any of these tracking sensors, no back-up tracking is available. Hence, when a primary tracking system such as a laser pod or radar fails, the tracking needs to be carried out by the pilot manually. Manual tracking is a very complex process which increases the pilot load during flight.
4. Aim of the Invention
The main objective of this invention is to provide an effective ground based object tracking solution by providing a system and method capable of acquiring and tracking a ground based static or mobile object using an inertial sensor. This invention can work in standalone mode or can provide redundancy to primary object tracking systems such as laser or RADAR trackers.
5. Summary of the invention:
A system and method is invented to acquire a ground object visually and continuously track it using inertial sensor data. The method is realized in two phases. In the first phase, the acquisition module (refer figure 1) displays the object symbol (square symbol) on the display system [9], known as the HUD (Head Up Display), at a fixed depression angle as shown in figure 4. The user visually identifies the moving object on the HUD [9] and manually slews the object symbol on the HUD over the visual object. The control stick [2] is used to slew the symbol on the HUD, and the control stick signals are converted to angles for display on the HUD. The user designates the object two times at some time interval and gives a command to the acquisition module to memorise the positions of the object. The axis transformation module [8] in figure 1 converts the object azimuth and elevation angles into range distances in body axis and ground axis. The computation module [10] computes the ground speed based on the distance covered by the object in the time interval during which the user designates the visual object.
The second phase of the method involves tracking of the acquired object using the inertial speed and the target speed computed in the acquisition phase. This phase involves computation of the object position using the inertial speed components, pitch, roll and heading from the INS sensor [5] and the object speed components. The INS sensor [5] provides the velocity components in the north and east directions and the vertical speed. The inertial speed components are converted to speed components in ground axis. The ground axis range components are used with the object speed components to find the new range components of the object. The range components are converted to azimuth and elevation angles for display on the HUD [9]. The control stick is used to fine tune the position of the object being displayed on the HUD [9].
6. Detailed description of the invention:
Tracking of a ground based object during flight from an aircraft is done in two phases. In the first phase the object is acquired visually and in the second phase it is tracked. The detailed description of the invention is as follows:
6.1 Phase 1: Object Acquisition and Speed Computation:
In this method, a fixed object box (square box or diamond box) is displayed on the HUD (Head Up Display) system at a fixed depression as shown in figure 4. The initial object position shall be displayed at different fixed depressions based on the aircraft height above the ground. The pilot needs to fly the aircraft in order to bring the object into the visible field of view on the HUD. The pilot needs to move the object box over the visible object using the control stick and give a trigger command to the acquisition system to memorize the angle at which the trigger is commanded. The module 2 converts the control stick signals to the appropriate angular position of the object in terms of azimuth and elevation angles. The module 8 converts the azimuth and elevation angles into body axis distances using the height above the object (3) and pitch, roll. The module 8 also converts the body axis range components in the i, j, k directions into ground axis distances using sine and cosine functions of pitch, roll. Refer to figure 3 for the body axis and ground axis representation. The object is designated two times with a time interval to acquire the positions of the moving object by control stick triggers. The acquisition system in figure 1 computes the ground distances corresponding to the designated object box positions. The timer module 4 keeps track of the time duration between the trigger commands. The module 10 computes the speed components using the ground distances in the X and Y directions and the timer module 4 output as per the equations below:
Object X speed = (Range X2 - Range X1) / (TIMER DIFFERENCE)
Object Y speed = (Range Y2 - Range Y1) / (TIMER DIFFERENCE)
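As an illustrative sketch only (not the claimed implementation), the acquisition-phase computation can be expressed in Python. The level-flight simplification (pitch and roll corrections omitted) and all function names are assumptions introduced here for illustration:

```python
import math

def angles_to_ground_range(azimuth, depression, height):
    """Convert the designated object's azimuth and depression angles
    (radians) into ground-axis X/Y distances.

    Simplified sketch: assumes level flight, so the pitch/roll
    corrections applied by the axis transformation module are omitted.
    """
    # Horizontal distance along the line of sight, from the depression
    # angle and the aircraft height above the object.
    horizontal = height / math.tan(depression)
    ground_x = horizontal * math.cos(azimuth)  # ahead of the aircraft
    ground_y = horizontal * math.sin(azimuth)  # to the side
    return ground_x, ground_y

def object_speed(range_1, range_2, timer_difference):
    """Object ground-speed components from two designations
    (Range X1, Y1) and (Range X2, Y2) made timer_difference apart."""
    vx = (range_2[0] - range_1[0]) / timer_difference
    vy = (range_2[1] - range_1[1]) / timer_difference
    return vx, vy
```

For example, two designations 10 s apart at ground ranges (1000, 0) and (1100, 50) metres would yield object speed components of 10 m/s in X and 5 m/s in Y.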
The acquisition system shown in figure 1 transits from the acquisition phase to the track phase following the second trigger command. The ground distances of the object after the second trigger command are memorized by the acquisition system and are used in the track phase. The acquisition system changes the shape of the object box to an object circle to indicate that the object track phase is entered.
6.2 Phase 2: Object Tracking Using Inertial Sensor in Track Phase
In this method, the track module 12 accepts the inertial speed components, pitch, roll and heading from the INS (5) along with the ground axis range components and the target speed components from the previous phase. The module 12 converts the inertial speed components from geographic axis to ground axis using sine and cosine functions of pitch, roll. The control stick signals are used to adjust the position of the object circle displayed on the HUD. The speed conversion module 11 converts the control stick X, Y signals into speed components in the X and Y directions as per the slant range from the aircraft to the object of interest, the aircraft height above the ground, the slew voltages and the control stick constants Kx, Ky. The control stick constants Kx, Ky are expressed in radians/sec. The slew speed components are used to fine tune the position of the symbol displayed on the HUD. The module 12 combines the inertial speed components, object speed components, control stick speed components and heading to compute the distance components in the X, Y directions as per the equations below:
Temp Distance = Range X ... (1)
Range X new = Range X - (Cycle time * (Inertial X speed - Slew X speed - Object X speed)) + Sin (Heading Difference) * Range Y ... (2)
Range Y new = Range Y - (Cycle time * (Inertial Y speed - Slew Y speed - Object Y speed)) - Sin (Heading Difference) * Temp Distance ... (3)
Note:
All the range components considered are in ground axis. Cycle time refers to the processing cycle of the track module. Heading difference corresponds to the change in heading with respect to the heading in the previous cycle. Slew speeds are the speeds in the X and Y directions corresponding to the control stick movement.
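As an illustrative sketch only, one cycle of the tracking update above can be written in Python. The function name and argument layout are assumptions introduced here; the arithmetic follows the equations as stated:

```python
import math

def track_update(range_x, range_y, cycle_time,
                 inertial_speed, slew_speed, object_speed, heading_diff):
    """One processing cycle of the track module.

    Propagates the ground-axis range to the object using the inertial,
    slew and object speed components (each an (X, Y) pair), then applies
    a small-angle rotation for the heading change since the last cycle.
    """
    # Save the old Range X before it is overwritten, as per equation (1),
    # so the Range Y update uses the pre-update value.
    temp_distance = range_x
    range_x_new = (range_x
                   - cycle_time * (inertial_speed[0] - slew_speed[0]
                                   - object_speed[0])
                   + math.sin(heading_diff) * range_y)
    range_y_new = (range_y
                   - cycle_time * (inertial_speed[1] - slew_speed[1]
                                   - object_speed[1])
                   - math.sin(heading_diff) * temp_distance)
    return range_x_new, range_y_new
```

For example, with the object 1000 m ahead, closing speed of 100 m/s, no slewing, a stationary object and no heading change, a 20 ms cycle reduces the X range by 2 m.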
The module 8 converts the ground axis range components into aircraft body axis range components using the pitch, roll and heading. The module 13 uses the body axis range components in the i, j, k directions to compute the azimuth and elevation angles of the object circle as per the equations below:
tan (azimuth) = Range Component (j direction) / Range Component (i direction) ... (4)
tan (depression) = Range Component (k direction) / Range Component (i direction) ... (5)
Note: Range components in i, j, k directions refer to range components in Aircraft body axis
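As an illustrative sketch only, the line-of-sight angle computation above maps directly onto Python's arctangent functions. The function name and the forward/right/down axis labelling are assumptions introduced here for illustration:

```python
import math

def los_angles(range_i, range_j, range_k):
    """Azimuth and depression of the object circle from its body-axis
    range components in the i (forward), j (lateral) and k (vertical)
    directions; angles are returned in radians.

    atan2 is used so the angles keep the correct sign when the lateral
    or vertical component is negative.
    """
    azimuth = math.atan2(range_j, range_i)     # tan(azimuth) = j / i
    depression = math.atan2(range_k, range_i)  # tan(depression) = k / i
    return azimuth, depression
```

For example, equal range components of 1000 m in all three directions give an azimuth and a depression of 45 degrees each.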
The azimuth and elevation angles are used to display the position of the object circle. The object circle represents the position of the moving object on the ground. The object circle is displayed on the HUD as long as the object is ahead of the aircraft.
7. Brief description of the drawings:
Fig-1: Block Diagram of Acquisition System for Target Speed Computation which consists of acquisition module and other inputs to compute the speed components.
Fig-2: Block Diagram Of Tracking System for Target Position Computation which consists of track and axis transformation module to compute the position of the moving object in terms of azimuth and elevation angles.
Fig-3: Representation of Body Axis and Ground Axis Frame which shows the relation between the ground axis frame and body axis frame.
Fig-4: Object symbol display on HUD, which shows the square box displayed by the computer module. This square box is slewed by the user to acquire the visible target on HUD.
CLAIMS We Claim:
1. A system and method for tracking a ground based object from an aircraft, consisting of an acquisition system and a tracking system.
2. The acquisition system as claimed in claim 1, consisting of a control stick module, angle conversion module, timer module, inertial navigation module, acquisition module, axis transformation module, speed computation module and display generation module. The acquisition system computes the object speed.
3. The tracking system as claimed in claim 1, consisting of a control stick module, speed conversion module, inertial navigation module, track module, axis transformation module, azimuth elevation computation module and display generation module. The tracking system computes the object range and object position.