
Airborne Optoelectronic Equipment For Imaging Monitoring And/Or Designating Targets

Abstract: A piece of airborne optoelectronic equipment (EO, EO1, EO2) comprising: - at least one image sensor (CI1, CI2), designed to acquire a plurality of images (IMR1, IMR2) of a region (RS) flown over by a carrier (PO1, PO2) of said equipment; and - a data processor (PD) configured or programmed to receive at least one so-called acquired image and transmit same to a display device (EA); characterised in that said data processor is also configured or programmed to: - access a database (BD) of images of said overflown region; - extract, from said database, information making it possible to synthesise a virtual image (IMV) of said region that is viewed by an observer located at a predefined observation point and watching, with a predefined field of view, along a predefined line of sight; - synthesise said virtual image; and - transmit same to said or to another display device. The invention also relates to a method for using such a piece of equipment.


Patent Information

Application #
Filing Date
10 January 2017
Publication Number
18/2017
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
remfry-sagar@remfry.com
Parent Application

Applicants

THALES
Tour Carpe Diem Place des Corolles Esplanade Nord F 92400 Courbevoie

Inventors

1. PERRUCHOT Ludovic
Thales Optronique S.A. 2 avenue Gay Lussac F 78995 Elancourt
2. BECHE Arnaud
Thales Optronique S.A. 2 avenue Gay Lussac F 78995 Elancourt
3. DEPRUGNEY Fabien
Thales Optronique S.A. 2 avenue Gay Lussac F 78995 Elancourt
4. RABAULT Denis
Thales Optronique S.A. 2 avenue Gay Lussac F 78995 Elancourt
5. DEPARDON Bruno
Thales Optronique S.A. 2 avenue Gay Lussac F 78995 Elancourt

Specification

The invention relates to an airborne optronic equipment item, that can be called a "permanent vision equipment item", for imaging, monitoring and/or designating targets such as, for example, a laser designation "pod". The invention relates also to a method implemented by means of such an optronic equipment item.
An optronic equipment item for imaging, monitoring and/or designating targets with which a carrier (generally an aircraft) is equipped comprises one or more image sensors which make it possible to acquire images of a region flown over by the carrier. At least one of these images is displayed on a screen to allow the operator to perform various types of missions: reconnaissance and monitoring to analyze an area and seek and identify particular elements; attack for positioning and designating surface or airborne targets; or navigation by facilitating flight in difficult conditions, at night or in poor weather ("FLIR", or "Forward-Looking InfraRed" mode). If said equipment item is a designation "pod", it also comprises a laser source and a beam forming system suitable for directing a laser beam to a target previously identified by an operator on an image acquired by said sensor and displayed by one said screen.
These equipment items, known from the prior art, exhibit a certain number of drawbacks:
• in detection/designation mode, the field of the image is very small, which makes it difficult for the operator to establish the link between the image supplied by the optronic equipment item and what he or she sees on the ground (the term "straw effect" is used, because it is as if the operator were looking through a straw);
• the visibility can be compromised by the weather conditions (clouds, fog, etc.) or the presence of smoke, or even by masking by the carrier or the body of the optronic equipment item itself;
• infrared imaging exhibits a relatively low resolution and monochrome images, whereas color can be an important piece of information, for example for recognizing a target;
• the FLIR and detection modes can be mutually exclusive;
• collaborative missions can demand the display of images acquired by another optronic equipment item, embedded on another carrier; that requires a high bit rate data link for the transmission of the data, which is not always available.
The invention aims to overcome at least some of these drawbacks. For this, it proposes using, in addition to "real" image sensors, what can be qualified as a "virtual sensor". The latter comprises data processing means cooperating with a geolocated terrain database to generate "virtual" images intended to accompany, enrich or replace the "real" images acquired by the sensors.
The data processing means of the "virtual sensor" can be purely software. In this case, it will involve one or more software modules intended to be executed by a data processor which also ensures the other functionalities of the optronic equipment item. They can be purely hardware: in this case, one or more dedicated - preferably digital - circuits will be involved. Finally, they can be hybrid, combining software modules and dedicated circuits. The database can be local, in which case it is stored in a mass memory (for example a solid-state drive) located on the carrier or in the optronic equipment item, or be remotely accessible via a radio link.
A subject of the invention is therefore an airborne optronic equipment item comprising:
at least one image sensor, suitable for acquiring a plurality of images of a region flown over by a carrier of said equipment item; and
a data processor configured or programmed to receive at least one said acquired image and transmit it to a display device;
characterized in that said data processor is also configured or programmed to:
access a database of images of said region flown over;
extract from said database information making it possible to synthesize a virtual image of said region which would be seen by an observer situated at a predefined observation point and looking, with a predefined field of view, along a predefined line of sight;
synthesize said virtual image; and
transmit it to said or to another display device.
According to different embodiments of such an optronic equipment item:
Said database can comprise at least: a numerical model of the terrain of said region; and a plurality of ortho-rectified air or satellite images or SAR images of said region, said images being geolocated; said data processor being configured or programmed to synthesize said virtual image by projection of one or more of said air or satellite images onto said numerical model of the terrain.
Said database can also comprise vector mapping data, said data processor being configured or programmed to incorporate some of said data in said virtual image.
Said data processor can be configured or programmed to enrich said database with images acquired by said or at least one said image sensor.
Said data processor can be configured or programmed to receive, from a geolocation device, information on the position of said carrier of the equipment item or of another carrier, as well as information indicative of a line of sight of an image sensor embedded on this carrier, and to synthesize a virtual image corresponding to said line of sight and to an observation point having the same position as said carrier. More particularly, said data processor can be configured or programmed to display said virtual image in place of an image acquired by said embedded image sensor in case of masking or insufficient visibility. As a variant or in addition, said data processor can be configured or programmed to merge said virtual image and an image acquired by said embedded image sensor with a same line of sight and a same field of view. Also as a variant or in addition, said data processor can be configured or programmed to synthesize one said virtual image, having a same observation point and a same line of sight as an image acquired by said embedded image sensor, but a wider field of view, and to insert said image acquired by said embedded image sensor in said virtual image. Similarly, said data processor can be configured or programmed to synthesize a plurality of said virtual images corresponding to points of view close to the position of an image sensor embedded on said carrier, as determined by said geolocation device, and to recompute said position by correlation between an image acquired by said sensor and said virtual images. Said data processor can also be configured or programmed to: receive, from said or at least one said image sensor, embedded on said carrier of the equipment item, at least one image of said region flown over by a carrier of said equipment item, and display it on a first display device embedded on the same carrier; receive, from another carrier, information on the position of said carrier, as well as on the line of sight and the field of view of at least one image sensor embedded on said other carrier; synthesize a virtual image corresponding to said line of sight and to an observation point having said position, and display it on a second display device distinct from said first display device and embedded on said carrier of the equipment item.
The optronic equipment item can also comprise an embedded data storage device in which said database is stored.
Said data processor can be configured or programmed to drive said or at least one said image sensor for it to acquire at least one said image of said region flown over according to a line of sight and with a field of view that are defined.
Said optronic equipment item can be an airborne optronic equipment item for designating targets.
Another subject of the invention is a method implemented by an optronic equipment item as claimed in one of the preceding claims, comprising the following steps:
receiving, from a geolocation device, information on the position of the carrier of the equipment item or of another carrier, as well as information indicative of a line of sight of an image sensor embedded on this carrier;
accessing a database of images of said region flown over and extracting therefrom information making it possible to synthesize a virtual image corresponding to said line of sight and to an observation point having the same position as said carrier;
synthesizing said virtual image; and
transmitting it to a display device.
Other features, details and advantages of the invention will become apparent on reading the description given with reference to the attached drawings given by way of example and which represent, respectively:
figure 1, two fighter airplanes flying over a region, communicating by a radio link and each carrying an optronic equipment item according to an embodiment of the invention;
figure 2, a functional diagram of an optronic equipment item according to an embodiment of the invention;
figure 3, the use of an optronic equipment item according to an embodiment of the invention to display, alternately, a real image or a virtual image;
figure 4, the use of an optronic equipment item according to an embodiment of the invention for displaying a real image and a virtual image that are merged;
figure 5, the use of an optronic equipment item according to an embodiment of the invention for displaying a real image inserted into a virtual image with a wider field of view;
figure 6, the use of an optronic equipment item according to an embodiment of the invention for simultaneously displaying a real image and a virtual image corresponding to a different observation point in the context of a cooperative mission; and
figure 7, the use of an optronic equipment item according to an embodiment of the invention for performing a carrier position correction operation by image correlation.
Figure 1 illustrates a context of use of an optronic equipment item according to the invention. It represents two fighter airplanes (carriers) P1 and P2, each equipped with an optronic equipment item EO1, EO2 according to an embodiment of the invention. These equipment items comprise image sensors observing a region RS flown over by the carriers with respective fields of view CV1, CV2. The two carriers - and, if necessary, their optronic equipment items - communicate via a data radio link LR, enabling them to perform a collaborative mission.
Figure 2 shows a functional diagram of an optronic equipment item EO according to an embodiment of the invention, or of just its "imaging" part (the target designation means, which may be present, are not represented). Conventionally, this equipment item comprises three main parts:
One or more image sensors, for example a camera operating in the visible part of the spectrum, CI1, and an infrared camera CI2. The references IMR1 and IMR2 indicate the images (called "real images" hereinbelow) acquired by these sensors, or, to be more precise, the digital data representing these images, conveyed by electronic signals.
A human-machine interface IHM, comprising one or more display screens EA and/or other display devices such as head-up visors, allowing an operator to view images, as well as control means MC (buttons, keyboards, touchscreens, etc.) enabling said operator to enter commands and operating parameters of the equipment item. For example, the control means MC can allow the operator to select an image sensor, its orientation and its field of view, and the screen EA displays in real time the images acquired by this sensor.
A data processor PD, comprising one or more computers and/or dedicated electronic circuits. The data processor drives actuators ensuring the orientation, the focusing and the setting of the image sensors CI1, CI2 in accordance with the commands entered by the operator; it receives the images IMR1, IMR2 acquired by these sensors, if necessary performs various processes on these images and ensures the display thereof by the screen or the screens EA.
Still conventionally, the optronic equipment item EO also comprises a geolocation unit UGL, of AHRS (Attitude and Heading Reference System) type, making it possible to determine the position of the carrier and the precise position of the line of sight, possibly exploiting the data from a GNSS (Global Navigation Satellite System) system and/or the inertial data originating from the unit of the carrier, and a communication device TxRx making it possible to transmit and receive data via the radio link LR. In a variant, the geolocation unit and/or the communication device can be external to the optronic equipment item, and configured to communicate therewith.
The optronic equipment item EO also comprises a virtual sensor which, in the embodiment of figure 2, consists of a database BD stored in an embedded mass memory and of a software module executed by the data processor PD. As mentioned above, other embodiments can be envisaged: for example, the database can be accessible remotely instead of being embedded and the software module can be replaced wholly or partly by dedicated electronic circuits forming part of the data processor.
The database BD contains a numerical model of the terrain of the region RS flown over by the carrier, typically of DTED type, and a plurality of geolocated images of said region. The images can have different origins; they can in particular be:
ortho-rectified satellite images;
ortho-rectified multispectral air reconnaissance images;
images acquired previously by the optronic equipment item itself, or by other airborne optronic equipment items;
SAR (synthetic aperture radar) images.
The database can also contain geographic vector data, generally of VMAP type: road and rail network, hydrological system, place names, etc.
It is important to note that the optronic equipment item can in
real time enrich the database with the images that it acquires during each
mission. Thus, it will be possible to ensure the "freshness" of the data stored
in the base.
The software module receives as input the following information:
a position, which can be the position of the carrier determined by the geolocation unit UGL, the position of another carrier, received via the communication device TxRx, or an arbitrary position;
a line of sight, which can be collinear to that of one of the "real" sensors of the optronic equipment item - or to that of a sensor of such an equipment item of another carrier - or else be controlled by the pilot or by an external setpoint;
a desired field of view, which can correspond to that of one of the "real" sensors of the optronic equipment item - or to that of a sensor of such an equipment item of another carrier - or else be arbitrary; and
optionally, a list of the geographic information to be displayed (names of roads, places, etc.).
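As an illustration only, these inputs can be grouped into a simple structure; the field names and values below are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass, field

# Hypothetical grouping of the virtual sensor's inputs; names are
# illustrative, not the patent's.
@dataclass
class VirtualSensorRequest:
    position: tuple          # observation point (lat, lon, alt): own carrier, another carrier, or arbitrary
    line_of_sight: tuple     # (azimuth, elevation) in degrees
    field_of_view: float     # horizontal field of view, degrees
    overlays: list = field(default_factory=list)  # optional geographic info to display

req = VirtualSensorRequest(
    position=(48.77, 1.95, 3000.0),
    line_of_sight=(120.0, -15.0),
    field_of_view=4.5,
    overlays=["road names", "place names"],
)
```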
From this information and information (numerical model, images) stored in the database, the software module generates a virtual image IMV, which corresponds to the image which would be acquired by a real sensor having the position, orientation (line of sight) and the field of view desired. Typically, the virtual image is generated or synthesized by projection of one or more of the images from the database onto said numerical model of the terrain. The computer techniques that allow for the synthesis of such a virtual image are well known to those skilled in the art.
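For the simplest geometry - a nadir-looking virtual camera over flat terrain - the projection reduces to resampling the geolocated ortho-image over the ground footprint determined by altitude and field of view. The sketch below illustrates only this degenerate case; names and numbers are assumptions, and a real implementation traces rays against the DTED terrain model.

```python
import math

def synthesize_nadir_view(ortho, metres_per_px, cam_xy, altitude, fov_deg, out_size):
    """Sample a geolocated ortho-image for a downward-looking virtual camera."""
    h, w = len(ortho), len(ortho[0])
    # Ground footprint half-width from altitude and field of view.
    half_ground = altitude * math.tan(math.radians(fov_deg / 2.0))
    img = []
    for r in range(out_size):
        row = []
        for c in range(out_size):
            # Virtual pixel -> ground coordinates (metres) -> ortho pixel.
            gx = cam_xy[0] + (2.0 * c / (out_size - 1) - 1.0) * half_ground
            gy = cam_xy[1] + (2.0 * r / (out_size - 1) - 1.0) * half_ground
            u = min(max(int(gx / metres_per_px), 0), w - 1)
            v = min(max(int(gy / metres_per_px), 0), h - 1)
            row.append(ortho[v][u])
        img.append(row)
    return img

# Toy 4x4 ortho-image at 100 m/pixel; camera 200 m above the tile centre.
ortho = [[10 * r + c for c in range(4)] for r in range(4)]
view = synthesize_nadir_view(ortho, 100.0, (200.0, 200.0), 200.0, 90.0, 3)
```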
It is possible to envisage several different uses of the virtual
image thus obtained. Some of them will be described hereinbelow, with
reference to figures 3 to 7.
As illustrated in figure 3, the data processor PD can select, for display on the screen EA, either a real image IMR, or a virtual image IMV corresponding to the same observation point (or point of view), to the same line of sight and to the same field of view. The choice of the image to be displayed can be made by the operator, or automatically by the data processor, for example if there is masking of the real image.
As illustrated in figure 4, the data processor PD can merge a real image IMR and a virtual image IMV corresponding to the same observation point, to the same line of sight and to the same field of view to create an enriched image, in accordance with the "augmented reality" approach, which is displayed on the screen EA. In the example of figure 4, the virtual image contains color information (represented, in the figure, by shadings) which is absent from the real image IMR, as well as geographic information (height of two mountains); on the other hand, only the real image IMR makes it possible to view an ephemeral phenomenon, in this case the presence of a cloud. The enriched image makes it possible to display all this information at the same time.
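One minimal way to picture such a merge - assumed here for illustration, the patent does not specify the fusion rule - is a per-pixel weighted blend of two co-registered images, so that content visible only in the real image (the cloud) and detail carried only by the virtual image both survive in the result.

```python
# Illustrative fusion of a real image and a virtual image sharing the same
# observation point, line of sight and field of view. The 50/50 blend is an
# assumption, not the patent's method.
def merge_images(real, virtual, alpha=0.5):
    """Blend two equally-sized grayscale images pixel by pixel."""
    return [[round(alpha * rp + (1.0 - alpha) * vp)
             for rp, vp in zip(rrow, vrow)]
            for rrow, vrow in zip(real, virtual)]

real    = [[100, 100], [100, 100]]   # e.g. shows an ephemeral cloud
virtual = [[ 40, 200], [ 40, 200]]   # e.g. carries colour/terrain detail
merged  = merge_images(real, virtual, alpha=0.5)
```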
A real image IMR with small field of view can also be inserted into a virtual image IMV with wide field of view to avoid the abovementioned "straw effect". This situation is illustrated in figure 5 where the real image IMR, showing a building in an urban environment, is inserted into a virtual image IMV with wider field of view in order to be placed in its context (road lanes, other buildings serving as references, etc.). Obviously, the virtual image could also display geographic data making it easier to identify the building - which could, in a military application, be a target to be designated. If necessary, the image with small field of view inserted into the virtual image with wider field of view could be an enriched image, obtained by merging of a real image and of a virtual image (see figure 4).
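Since both images share the same observation point and line of sight, the inset's size in the wide image follows from the ratio of the two fields of view. The sketch below (names and the nearest-neighbour resampling are assumptions) pastes the narrow-field real image at the centre of the wide-field virtual image.

```python
import math

def insert_inset(virtual, real, fov_virtual_deg, fov_real_deg):
    """Paste a small-FOV image into the centre of a wide-FOV image."""
    V = len(virtual)
    # Fraction of the wide image subtended by the narrow field of view.
    scale = (math.tan(math.radians(fov_real_deg / 2))
             / math.tan(math.radians(fov_virtual_deg / 2)))
    side = max(1, round(V * scale))
    top = (V - side) // 2
    out = [row[:] for row in virtual]
    R = len(real)
    for r in range(side):
        for c in range(side):
            # Nearest-neighbour resample of the real image into the inset.
            out[top + r][top + c] = real[r * R // side][c * R // side]
    return out

virtual = [[0] * 8 for _ in range(8)]   # wide-field virtual image (context)
real = [[9, 9], [9, 9]]                 # narrow-field real image (the building)
combined = insert_inset(virtual, real, fov_virtual_deg=40.0, fov_real_deg=10.0)
```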
Figure 6 shows an application in which a real image IMR, acquired by a sensor CI of the optronic equipment item, is displayed on a first screen EA1. At the same time, a virtual image IMV is displayed on a second screen EA2 or on the same screen by switching or by insertion; this virtual image corresponds to the observation point, to the line of sight and to the field of view of an image sensor of the optronic equipment item of another carrier (if necessary, it can even be a virtual sensor, this variant being able to be used in particular for training purposes), these data being received by the communication device TxRx. In return, the communication device could be used to transmit data of the same type to an optronic equipment item embedded on said other carrier. In the context of the cooperative mission of figure 1, this application allows the pilot of the airplane P1 to see what the pilot of P2 sees, and vice versa. It will be noted that, contrary to the prior art, this does not require a high bit rate link. In effect, it is not necessary to transmit images from one carrier to another, but only "contextual data" (position, line of sight, setting parameters of the image sensor or sensors) allowing for the synthesis of a virtual image.
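A back-of-the-envelope comparison shows why this matters; the packet layout and frame dimensions below are assumed for illustration, not taken from the patent. A pose-and-settings packet is a few dozen bytes, while even one modest uncompressed frame is hundreds of kilobytes.

```python
import struct

def pack_context(lat, lon, alt, azimuth, elevation, fov):
    # Six double-precision fields: position, line of sight, field of view.
    return struct.pack("<6d", lat, lon, alt, azimuth, elevation, fov)

packet = pack_context(48.77, 1.95, 3000.0, 120.0, -15.0, 4.5)
frame_bytes = 640 * 512 * 2          # one hypothetical 16-bit 640x512 IR frame
ratio = frame_bytes // len(packet)   # contextual packets per single frame
```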
The optronic equipment item can also be used to refine or correct a position determined by a geolocation unit UGL. As illustrated in figure 7, the geolocation unit determines a first estimation of the position of an image sensor embedded on the carrier, and the data processor PD synthesizes a plurality of virtual images IMV1, IMV2, IMV3 ... IMVN corresponding to observation points close to this estimated position (that is to say surrounding said position and situated within a radius defined around it) and with a defined line of sight. Said image sensor acquires a real image IMR with the same line of sight. Then, the data processor determines a new position estimation by correlation between the real image and the virtual images.
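The refinement step can be pictured as scoring each candidate observation point by the similarity between its virtual image and the real image, then keeping the best match. Normalized cross-correlation is used below as a common similarity metric; it is an assumption of this sketch, the patent does not name one.

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equally-sized grayscale images."""
    fa = [p for row in a for p in row]
    fb = [p for row in b for p in row]
    ma, mb = sum(fa) / len(fa), sum(fb) / len(fb)
    num = sum((x - ma) * (y - mb) for x, y in zip(fa, fb))
    den = math.sqrt(sum((x - ma) ** 2 for x in fa) * sum((y - mb) ** 2 for y in fb))
    return num / den if den else 0.0

def refine_position(real_image, candidates):
    """candidates: list of (position, virtual_image); return best-matching position."""
    return max(candidates, key=lambda pv: ncc(real_image, pv[1]))[0]

real = [[0, 10], [10, 0]]
candidates = [
    ((0.0, 0.0), [[10, 0], [0, 10]]),   # anti-correlated view
    ((0.0, 1.0), [[0, 9], [9, 0]]),     # well-correlated view
]
best = refine_position(real, candidates)
```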
Other applications of the optronic equipment item according to the invention can be envisaged without departing from the scope of the present invention.

CLAIMS
1. An airborne optronic equipment item (EO, EO1, EO2) comprising:
at least one image sensor (CI1, CI2), suitable for acquiring a plurality of images (IMR1, IMR2) of a region (RS) flown over by a carrier (PO1, PO2) of said equipment item; and
a data processor (PD) configured or programmed to receive at least one said acquired image and transmit it to a display device (EA);
characterized in that said data processor is also configured or programmed to:
access a database (BD) of images of said region flown over;
extract from said database information making it possible to synthesize a virtual image (IMV) of said region which would be seen by an observer situated at a predefined observation point and looking, with a predefined field of view, along a predefined line of sight;
synthesize said virtual image; and
transmit it to said or to another display device.
2. The airborne optronic equipment item as claimed in claim 1, in which said database comprises at least:
a numerical model of the terrain of said region; and
a plurality of ortho-rectified air or satellite images or SAR images of said region, said images being geolocated;
and in which said data processor is configured or programmed to synthesize said virtual image by projection of one or more of said air or satellite images onto said numerical model of the terrain.
3. The airborne optronic equipment item as claimed in one of the preceding claims, in which said database also comprises vector mapping data, and in which said data processor is configured or programmed to incorporate some of said data in said virtual image.

4. The airborne optronic equipment item as claimed in one of the preceding claims, in which said data processor is configured or programmed to enrich said database with images acquired by said or at least one said image sensor.
5. The airborne optronic equipment item as claimed in one of the preceding claims, in which said data processor is configured or programmed to receive, from a geolocation device (UGL), information on the position of said carrier of the equipment item or of another carrier, as well as information indicative of a line of sight of an image sensor embedded on this carrier, and to synthesize a virtual image corresponding to said line of sight and to an observation point having the same position as said carrier.

6. The airborne optronic equipment item as claimed in claim 5, in which said data processor is configured or programmed to display said virtual image in place of an image acquired by said embedded image sensor in case of masking or insufficient visibility.

7. The airborne optronic equipment item as claimed in one of claims 5 or 6, in which said data processor is configured or programmed to merge said virtual image and an image acquired by said embedded image sensor with a same line of sight and a same field of view.
8. The airborne optronic equipment item as claimed in one of claims 5 to 7, in which said data processor is configured or programmed to synthesize one said virtual image, having a same observation point and a same line of sight as an image acquired by said embedded image sensor, but a wider field of view, and to insert said image acquired by said embedded image sensor in said virtual image.

9. The airborne optronic equipment item as claimed in one of claims 5 to 8, in which said data processor is configured or programmed to synthesize a plurality of said virtual images corresponding to points of view close to the position of an image sensor embedded on said carrier, as determined by said geolocation device, and to recompute said position by correlation between an image acquired by said sensor and said virtual images.
10. The airborne optronic equipment item as claimed in one of claims 5 to 9, in which said data processor is configured or programmed to:
receive, from said or at least one said image sensor, embedded on said carrier of the equipment item, at least one image of said region flown over by a carrier of said equipment item, and display it on a first display device embedded on the same carrier;
receive, from another carrier, information on the position of said carrier, as well as on the line of sight and the field of view of at least one image sensor embedded on said other carrier;
synthesize a virtual image corresponding to said line of sight and to an observation point having said position, and display it on a second display device distinct from said first display device and embedded on said carrier of the equipment item.

11. The airborne optronic equipment item as claimed in one of the preceding claims, also comprising an embedded data storage device in which said database is stored.

12. The airborne optronic equipment item as claimed in one of the preceding claims, in which said data processor is configured or programmed to drive said or at least one said image sensor for it to acquire at least one said image of said region flown over according to a line of sight and with a field of view that are defined.
13. An airborne optronic equipment item for designating
targets as claimed in one of the preceding claims.
14. A method implemented by an optronic equipment item as claimed in one of the preceding claims, comprising the following steps:
receiving, from a geolocation device (UGL), information on the position of the carrier of the equipment item or of another carrier, as well as information indicative of a line of sight of an image sensor embedded on this carrier;
accessing a database (BD) of images of said region flown over and extracting therefrom information making it possible to synthesize a virtual image (IMV) corresponding to said line of sight and to an observation point having the same position as said carrier;
synthesizing said virtual image; and
transmitting it to a display device (EA).

Documents

Application Documents

# Name Date
1 Priority Document [10-01-2017(online)].pdf 2017-01-10
2 Form 5 [10-01-2017(online)].pdf 2017-01-10
3 Form 3 [10-01-2017(online)].pdf 2017-01-10
4 Form 1 [10-01-2017(online)].pdf 2017-01-10
5 Drawing [10-01-2017(online)].pdf 2017-01-10
6 Description(Complete) [10-01-2017(online)].pdf_12.pdf 2017-01-10
7 Description(Complete) [10-01-2017(online)].pdf 2017-01-10
8 201717000981.pdf 2017-01-12
9 abstract.jpg 2017-02-01
10 Form 3 [08-04-2017(online)].pdf 2017-04-08
11 Marked Copy [11-04-2017(online)].pdf 2017-04-11
12 Form 13 [11-04-2017(online)].pdf 2017-04-11
13 Description(Complete) [11-04-2017(online)].pdf_235.pdf 2017-04-11
14 Description(Complete) [11-04-2017(online)].pdf 2017-04-11
15 Other Patent Document [13-04-2017(online)].pdf_336.pdf 2017-04-13
16 Other Patent Document [13-04-2017(online)].pdf 2017-04-13
17 Form 26 [13-04-2017(online)].pdf 2017-04-13
18 201717000981-OTHERS-170417.pdf 2017-04-19
19 201717000981-GPA-170417.pdf 2017-04-19
20 201717000981-Correspondence-170417 -.pdf 2017-04-19
21 201717000981-FORM 3 [20-12-2017(online)].pdf 2017-12-20
22 201717000981-FORM 18 [01-06-2018(online)].pdf 2018-06-01
23 201717000981-FORM 3 [22-12-2018(online)].pdf 2018-12-22
24 201717000981-FORM 3 [17-10-2019(online)].pdf 2019-10-17
25 201717000981-FORM 3 [13-07-2020(online)].pdf 2020-07-13
26 201717000981-Verified English translation [29-01-2021(online)].pdf 2021-01-29
27 201717000981-FORM 3 [05-04-2021(online)].pdf 2021-04-05
28 201717000981-Retyped Pages under Rule 14(1) [29-05-2021(online)].pdf 2021-05-29
29 201717000981-OTHERS [29-05-2021(online)].pdf 2021-05-29
30 201717000981-FER_SER_REPLY [29-05-2021(online)].pdf 2021-05-29
31 201717000981-DRAWING [29-05-2021(online)].pdf 2021-05-29
32 201717000981-COMPLETE SPECIFICATION [29-05-2021(online)].pdf 2021-05-29
33 201717000981-ABSTRACT [29-05-2021(online)].pdf 2021-05-29
34 201717000981-2. Marked Copy under Rule 14(2) [29-05-2021(online)].pdf 2021-05-29
35 201717000981-Information under section 8(2) [01-06-2021(online)].pdf 2021-06-01
36 201717000981-FER.pdf 2021-10-17
37 201717000981-FORM 3 [28-12-2021(online)].pdf 2021-12-28
38 201717000981-Defence-27-02-2024.pdf 2024-02-27
39 201717000981-Defence-04-07-2025.pdf 2025-07-04
40 201717000981-Response to office action [11-09-2025(online)].pdf 2025-09-11
41 201717000981-Annexure [11-09-2025(online)].pdf 2025-09-11

Search Strategy

1 Searchstrategy_201717000981E_14-12-2020.pdf