
Object-Focussed Decision Support

Abstract: The invention relates to a method for decision support of a combat object (1) in a combat situation comprising the steps of: a) detecting (3) an enemy object (2) such that a plurality of characteristic parameters of the enemy object (2) is determined, b) calculating (4) at least one quality factor for at least one combat sensor of the combat object (1), wherein each quality factor is adapted for indicating identification ability of a combat sensor, and calculating (4) at least one signature factor for at least one enemy sensor of the enemy object (2) based on a predetermined model, wherein each signature factor is adapted for indicating identification ability of an enemy sensor, c) allocating (5) each quality factor calculated in the previous step b) to each combat sensor and allocating (5) each signature factor calculated in the previous step b) to each enemy sensor, and d) controlling (6) each combat sensor against the enemy object (2) based on the result of the previous step c). In this way, support for the pilot on a target-oriented basis is provided in order to make a quick and efficient decision in a combat situation.


Patent Information

Application #
4468/DELNP/2014
Filing Date
03 June 2014
Publication Number
06/2015
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

SAAB AB
S-581 88 Linköping

Inventors

1. LUNDQVIST Anders
Edlunda 14, S-185 99 Vaxholm
2. KENSING Vibeke
Löten Solliden, S-590 48 Vikingstad

Specification

OBJECT-FOCUSSED DECISION SUPPORT
Field of the invention
The invention relates to a method for decision support of a combat object in a combat
situation.
Background of the invention
Document US 7,525,448 B1 describes an aircraft optical display system for implementing an
enhanced vision system based on weather conditions. The display system includes a
plurality of imaging sensors configured to receive imaging input data and generate image
data, where each imaging sensor is associated with one or more weather conditions.
Highly developed functions for the human-machine interface, HMI for short, and for decision support already exist as support functions for the pilot environment in combat aircraft. All such solutions are based on combat situations in which HMI and decision support together describe a current position and display the tools and solutions to the pilot.
Existing solutions are based on the aircraft itself and its available resources and tools.
Sensors, such as radar, are operated by the pilot as the tool for close-range scanning or for
scanning objects for identification and continued pursuit. Decision support facilitates the use of multiple sensors by merging objects detected by several different sensors and coordinating and correlating these objects in a situation picture.
However, as the complexity increases because more tools and sensors are supplied, the pilot's ability to control these tools/sensors in time becomes limited. In time-critical situations, for instance in air combat, the pilot therefore risks becoming the underdog. Another limitation is the fact that each tool and/or sensor has its own characteristics and peculiarities. Each sensor and/or tool therefore requires its own interface and control functions, which the pilot needs to be able to understand and use correctly.
Summary of the invention
It is the object of the invention to provide a possibility for assisting a pilot on a target-oriented
basis in decision support in a combat situation.
This object is achieved by the subject matter of independent claim 1. Preferred embodiments are defined in the dependent claims.
According to an aspect of the invention, this object is achieved by a method for decision
support of a combat object in a combat situation comprising the steps of: a) detecting an
enemy object such that a plurality of characteristic parameters of the enemy object is
determined, b) calculating at least one quality factor for at least one combat sensor of the
combat object, wherein each quality factor is adapted for indicating identification ability of a
combat sensor, and calculating at least one signature factor for at least one enemy sensor of
the enemy object based on a predetermined model, wherein each signature factor is adapted
for indicating identification ability of an enemy sensor, c) allocating each quality factor
calculated in the previous step b) to each combat sensor and allocating each signature factor
calculated in the previous step b) to each enemy sensor, and d) controlling each combat
sensor against the enemy object based on the result of the previous step c). Preferably,
identification ability of the combat sensor comprises detection ability of the combat sensor
and identification ability of the enemy sensor comprises detection ability of the enemy sensor,
respectively.
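Purely as an illustration of steps a) to d), the following Python sketch renders the method as a small pipeline. All names (EnemyObject, decision_support, the quality and signature callables) and the trivial active/passive control rule are assumptions made for illustration; the specification does not prescribe any concrete data structures or formulas.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class EnemyObject:
    obj_type: str                          # e.g. "fighter" (hypothetical)
    position: Tuple[float, float, float]
    velocity: Tuple[float, float, float]
    aspect_angle: float                    # degrees
    sensors: List[str] = field(default_factory=list)

def decision_support(
    combat_sensors: List[str],
    enemy: EnemyObject,
    quality: Callable[[str, EnemyObject], float],   # step b) for own sensors
    signature: Callable[[str], float],              # step b) for enemy sensors, via model
) -> Dict[str, str]:
    # Step b): one quality factor per own sensor, one signature factor per enemy sensor.
    q = {s: quality(s, enemy) for s in combat_sensors}
    sig = {e: signature(e) for e in enemy.sensors}
    # Step c): allocation is the association held in the two dictionaries above.
    # Step d): a trivial stand-in for sensor control based on the allocated factors.
    exposure = max(sig.values(), default=0.0)
    return {s: "active" if q[s] > exposure else "passive" for s in combat_sensors}
```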
It is an idea of the invention that the positions of one's own combat object and of the enemy object are determined based on the knowledge of the different, previously calculated and allocated, signature factors of the enemy sensors and the different quality factors of one's own combat sensors. It is not necessary to go for the optimum in the controlling step d), since a trade-off between increasing the quality factor and decreasing the signature factor is already adequate, both factors being independent of each other. According to other preferred embodiments of the invention, the optimum is sought.
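The distinction between an adequate trade-off and a searched optimum can be made concrete with a minimal sketch. Both helpers below are hypothetical; in particular, the scalarization Q - S used in the optimum variant is an assumption, since the specification leaves the optimization criterion open.

```python
def adequate_setting(candidates, q_of, s_of, q_now, s_now):
    # Any candidate raising Q while lowering S is already adequate; since the
    # two factors are independent, both conditions are checked separately.
    for c in candidates:
        if q_of(c) > q_now and s_of(c) < s_now:
            return c
    return None

def optimal_setting(candidates, q_of, s_of):
    # For embodiments that do search the optimum: an assumed scalarization
    # Q - S is maximized (the specification names no concrete criterion).
    return max(candidates, key=lambda c: q_of(c) - s_of(c))
```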
It is a further idea of the invention to use the radar principle with main and side lobes such
that one can determine strong and weak points in the system. In order to serve as a decision
support tool, the aspect angle of the combat sensor and/or the emission control is/are
changed such that the mode in the combat aircraft is adjustable. Preferably, the results are
integrated over time. In this way, a matrix of predefined lists is obtained, wherein the
combinations can be used in order to get discrete decisions and their number corresponds to
a predefined number of possibilities. Hence, the sensors of the combat aircraft are not
controlled by the pilot but on the basis of the expected enemy aircraft. Two parameters,
quality Q and signature S, are introduced for sensor control. Q refers to sensor quality, in particular to one's own sensor quality when detecting an enemy object, and S refers to the signature, in particular to one's own signature, exposed to the same enemy object and its sensors, which may be assumed rather than observed. It is thus an idea of the invention to provide a
decision support system which evaluates detected and assumed objects in the situation
picture and adapts the sensors of a pilot's own aircraft to these objects on the basis of Q and
S. The assumptions are typically based on the current reports for the area or from
expectation based on typical behaviour and doctrine. The purpose is to shift the focus to the
detected and measured objects in order to perform the tasks needed on a target-oriented
basis and not by micro-handling tools and/or sensors.
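The matrix of predefined lists can be pictured as a small lookup table keyed by discretized Q and S values. The binning threshold and the decision labels below are invented for illustration; the specification only states that the combinations yield a predefined number of discrete decisions.

```python
# Hypothetical discretization: Q and S are each binned into "low"/"high"; the
# resulting matrix of predefined combinations maps to one discrete decision.
DECISION_MATRIX = {
    ("high", "low"):  "track actively",    # good detection ability, low exposure
    ("high", "high"): "track passively",   # good detection ability but revealing
    ("low",  "low"):  "search actively",
    ("low",  "high"): "stay silent",
}

def discrete_decision(q, s, threshold=0.5):
    # Thresholding the time-integrated Q and S yields one of a predefined
    # number of possibilities (here: four).
    key = ("high" if q >= threshold else "low",
           "high" if s >= threshold else "low")
    return DECISION_MATRIX[key]
```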
According to a preferred embodiment of the invention, the combat object comprises a
combat aircraft and/or a combat station and the enemy object comprises at least one of an
enemy combat aircraft, an enemy station and an obstacle, such as a mountain or a cloud.
The plurality of characteristic parameters of the enemy object preferably comprise type,
position, velocity and/or aspect angle. Preferably, the predetermined model in step b)
comprises the characteristics of the at least one enemy sensor, an atmospheric model and/or
a condition model. The atmospheric model preferably comprises a plurality of atmospheric
parameters such as wind speed, rain, humidity, fog and/or clouds. The condition model is
preferably frequency dependent and comprises at least one of a visual and an infrared
frequency spectrum.
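As a sketch only, the predetermined model of step b) might be carried by data structures like the following; the field names, units and representations are assumptions derived from the parameters listed above.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class AtmosphericModel:
    wind_speed: float      # m/s (assumed unit)
    rain: float            # mm/h (assumed unit)
    humidity: float        # percent
    fog: bool
    cloud_cover: float     # fraction 0..1 (assumed representation)

@dataclass
class ConditionModel:
    # Frequency dependent: one weight per spectrum band.
    bands: Dict[str, float]   # e.g. {"visual": 0.8, "infrared": 0.6}

@dataclass
class PredeterminedModel:
    enemy_sensor_characteristics: Dict[str, dict]
    atmosphere: AtmosphericModel
    condition: ConditionModel
```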
According to a preferred embodiment of the invention, the method further comprises the step
of storing the at least one quality factor for the at least one combat sensor and storing the at
least one signature factor for the at least one enemy sensor.
According to a preferred embodiment of the invention, the method further comprises the step
of displaying the at least one quality factor for the at least one combat sensor and displaying
the at least one signature factor for the at least one enemy sensor.
According to a preferred embodiment of the invention, the method further comprises the step
of recording each quality factor out of a plurality of quality factors and each signature factor
out of a plurality of signature factors, wherein the recorded data is adapted for generating a
situation picture which is adapted for decision support of the combat object in the combat
situation. The controlling step d) is preferably adapted for decision support of the combat
object such that the combat object adjusts its appearance in the combat situation.
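The storing, displaying and recording steps can be pictured as three small hooks around the factor calculation. The sketch below is hypothetical: the file format, the console display and the log layout are all assumptions.

```python
import json
import time

def store(path, q_factors, s_factors):
    # Storing step: persist the calculated factors (file format is assumed).
    with open(path, "w") as f:
        json.dump({"quality": q_factors, "signature": s_factors}, f)

def display(q_factors, s_factors):
    # Displaying step: a console stand-in for a cockpit display.
    for name, q in q_factors.items():
        print(f"own sensor {name}: Q = {q:.2f}")
    for name, s in s_factors.items():
        print(f"enemy sensor {name}: S = {s:.2f}")

def record(log, q_factors, s_factors):
    # Recording step: each timestamped entry extends the data from which
    # a situation picture can be generated.
    log.append({"t": time.time(),
                "quality": dict(q_factors),
                "signature": dict(s_factors)})
```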
Brief description of the drawings
These and other aspects of the invention will be apparent from and elucidated with reference
to the embodiments described hereinafter.
In the drawings:
Fig. 1 illustrates the steps of a method for decision support of a combat object in a
combat situation according to a preferred embodiment of the invention; and
Fig. 2 schematically shows the decision support system controlling the sensors on
the basis of an analysis of the objects which are to be detected according to
another preferred embodiment of the invention.
Detailed description of embodiments
Fig. 1 shows the steps of a method for decision support of a combat object 1 according to a
preferred embodiment of the invention. Firstly, an enemy object 2 is detected 3 such that a plurality of characteristic parameters of the enemy object 2 is determined. Secondly, at least one quality factor for at least one combat sensor of the combat object 1 is calculated 4, wherein
each quality factor is adapted for indicating identification ability of a combat sensor, and
further at least one signature factor for at least one enemy sensor of the enemy object 2 is
calculated 4 based on a predetermined model, wherein each signature factor is adapted for
indicating identification ability of an enemy sensor. Thirdly, each quality factor calculated in
the previous step is allocated 5 to each combat sensor and each signature factor calculated
in the previous step is allocated 5 to each enemy sensor, and, finally, each combat sensor is
controlled 6 against the enemy object 2 based on the result of the previous step. In addition,
the method can further comprise the steps of storing 7 the at least one quality factor for the
at least one combat sensor and storing 7 the at least one signature factor for the at least one
enemy sensor, and further displaying 8 the at least one quality factor and the at least one signature factor.
The method can comprise the further step of recording 9 each quality factor and each
signature factor, wherein the recorded data is adapted for generating a situation picture
which is adapted for decision support of the combat object 1 in the combat situation.
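Tying the above together, one pass through steps 3 to 9 of Fig. 1 might look as follows, reusing the hypothetical store, display and record helpers sketched earlier; the active/passive control rule is again an invented stand-in for the controlling step 6.

```python
def decision_cycle(combat_sensors, detect, quality, signature, log):
    # One pass through steps 3 to 9 of Fig. 1; in operation this would repeat.
    enemy = detect()                                     # detecting (3)
    q = {s: quality(s, enemy) for s in combat_sensors}   # calculating (4)
    sig = {e: signature(e) for e in enemy.sensors}       # calculating (4)
    # Allocating (5) is the association held in the two dictionaries above.
    exposure = max(sig.values(), default=0.0)
    orders = {s: "active" if q[s] > exposure else "passive"
              for s in combat_sensors}                   # controlling (6)
    store("factors.json", q, sig)                        # storing (7)
    display(q, sig)                                      # displaying (8)
    record(log, q, sig)                                  # recording (9)
    return orders
```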
Fig. 2 shows a decision support system controlling the sensors on the basis of an analysis of
the enemy objects that have been detected according to another preferred embodiment of
the invention. Two incorporated parameters are assessed for each combat sensor. Q
describes the ability of the combat sensor to detect, pursue or identify an enemy object
based on the quality indication for the current situation. S describes the revealing and/or
identification ability of the combat sensor relative to an indicated enemy object in the current
situation and is indicated as "S" in Fig. 2. According to this preferred embodiment of the
invention, S comprises active emission and passive reflection of active signals, such as
those from radar. A current situation refers to the mutual spatial positions and vectors, such as position, height and speed, the climatic zone or atmospheric conditions for the combat aircraft or enemy object, and any obstacles in the topography, such as intervening mountains and hills that can be displayed in the visual and/or IR frequency spectrum, or weather and/or topography situations, such as clouds or forests. The current
situation is described by models, such as the atmospheric model, the condition model for
one's own aircraft or the condition model for the enemy aircraft. In this way, Q and S are
coordinated in the decision support system which then generates a plurality of control orders
that are sent to the respective sensors. According to other preferred embodiments of the
invention, the model comprises a jamming situation, for instance enemy radar jamming.
Therefore, the pilot's focus is shifted from handling sensors to tactically working with objects
in the situation picture which makes the pilot object-focused instead of tool-focused. In this
way, sensors can be controlled automatically so that objects can be detected to the optimum
degree without revealing the combat aircraft.
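As a final sketch, the coordination of Q and S into control orders shown in Fig. 2 might be rendered as below. The order names and the simple Q-versus-S comparison are assumptions, as is the blanket passive response to jamming.

```python
def generate_control_orders(sensor_states, jamming=False):
    """Map assessed (Q, S) pairs to one control order per sensor.

    sensor_states: {sensor_name: (q, s)} for the current situation.
    Order names are invented for illustration only.
    """
    orders = {}
    for name, (q, s) in sensor_states.items():
        if jamming:
            orders[name] = "passive_listen"   # assumed response to enemy jamming
        elif q > s:
            orders[name] = "active_track"     # detect to the optimum degree
        else:
            orders[name] = "emission_off"     # avoid revealing the combat aircraft
    return orders

# Example: a radar with high exposure is silenced, a low-signature IR sensor tracks.
print(generate_control_orders({"radar": (0.4, 0.7), "irst": (0.6, 0.1)}))
```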
While the invention has been illustrated and described in detail in the drawings and foregoing
description, such illustration and description are to be considered illustrative or exemplary and not restrictive; it is not intended to limit the invention to the disclosed embodiments.
The mere fact that certain measures are recited in mutually different dependent claims does
not indicate that a combination of these measures cannot be used advantageously.
Claims
1. A method for decision support of a combat object (1) in a combat situation comprising
the steps of:
a) detecting (3) an enemy object (2) such that a plurality of characteristic
parameters of the enemy object (2) is determined,
b) calculating (4) at least one quality factor for at least one combat sensor of the
combat object (1), wherein each quality factor is adapted for indicating identification ability of
a combat sensor, and calculating (4) at least one signature factor for at least one enemy
sensor of the enemy object (2) based on a predetermined model, wherein each signature
factor is adapted for indicating identification ability of an enemy sensor,
c) allocating (5) each quality factor calculated in the previous step b) to each
combat sensor and allocating (5) each signature factor calculated in the previous step b) to
each enemy sensor, and
d) controlling (6) each combat sensor against the enemy object (2) based on the
result of the previous step c).
2. The method according to claim 1, wherein the combat object (1) comprises a combat
aircraft and/or a combat station and wherein the enemy object (2) comprises at least one of
an enemy combat aircraft, an enemy station and an obstacle, such as a mountain or a cloud.
3. The method according to one of the preceding claims, wherein the plurality of
characteristic parameters of the enemy object (2) comprise type, position, velocity and/or
aspect angle.
4. The method according to one of the preceding claims, wherein the predetermined
model in step b) comprises the characteristics of the at least one enemy sensor, an
atmospheric model and/or a condition model.
5. The method according to claim 4, wherein the atmospheric model comprises a
plurality of atmospheric parameters such as wind speed, rain, humidity, fog and/or clouds.
6. The method according to claim 4, wherein the condition model is frequency
dependent and comprises at least one of a visual and an infrared frequency spectrum.
7. The method according to one of the preceding claims, further comprising the step of
storing (7) the at least one quality factor for the at least one combat sensor and storing (7)
the at least one signature factor for the at least one enemy sensor.
8. The method according to one of the preceding claims, further comprising the step of
displaying (8) the at least one quality factor for the at least one combat sensor and displaying
(8) the at least one signature factor for the at least one enemy sensor.
9. The method according to one of the preceding claims, further comprising the step of
recording (9) each quality factor out of a plurality of quality factors and each signature factor
out of a plurality of signature factors, wherein the recorded data is adapted for generating a
situation picture which is adapted for decision support of the combat object (1) in the combat
situation.
10. The method according to one of the preceding claims, wherein the controlling step d) is adapted for decision support of the combat object (1) such that the combat object (1) adjusts its appearance in the combat situation.

Documents

Application Documents

# Name Date
1 FORM 5.pdf 2014-06-09
2 FORM 3.pdf 2014-06-09
3 Drawing.pdf 2014-06-09
4 Complete Specification.pdf 2014-06-09
5 Abstract.pdf 2014-06-09
6 4468-DELNP-2014-Correspondence-Others-(27-06-2014).pdf 2014-06-27
7 4468-DELNP-2014-GPA-(09-07-2014).pdf 2014-07-09
8 4468-DELNP-2014-Correspondence-Others-(09-07-2014).pdf 2014-07-09
9 4468-DELNP-2014.pdf 2014-07-10
10 4468-delnp-2014-Correspondence-Others-(08-08-2014).pdf 2014-08-08
11 4468-delnp-2014-Form-3-(08-08-2014).pdf 2014-08-08
12 4468-delnp-2014-Correspondence-Others-(21-08-2014).pdf 2014-08-21
13 4468-delnp-2014-GPA-(21-08-2014).pdf 2014-08-21
14 4468-DELNP-2014-Correspondence Others-(05-12-2014).pdf 2014-12-05
15 4468-DELNP-2014-Form-3-(05-12-2014).pdf 2014-12-05
16 4468-delnp-2014-Correspondance Others-(20-02-2015).pdf 2015-02-20
17 4468-delnp-2014-Form-3-(20-02-2015).pdf 2015-02-20
18 4468-delnp-2014-Correspondence Others-(26-05-2015).pdf 2015-05-26
19 4468-delnp-2014-Form-3-(26-05-2015).pdf 2015-05-26