
Device For Classifying A Light Source

Abstract: The present invention relates to a device (14) for classifying a light source (11), comprising: - a sensor (20) suitable for receiving a light flux emitted by a light source (11), the sensor (20) comprising a plurality of pixels grouped into sets, each set comprising a first pixel and a second pixel adjacent to the first pixel, each first pixel being suitable for generating a first signal relating to a first light flux portion in a first spectral band received by the first pixel, each second pixel being suitable for generating a second signal relating to a second light flux portion in a second spectral band received by the second pixel, - a processor (24) configured to compare the first and the second signal and to classify the emitting light source (11) according to the result of the comparison.


Patent Information

Application #
Filing Date
13 January 2022
Publication Number
11/2022
Publication Type
INA
Invention Field
PHYSICS
Status
Email
mahua.ray@remfry.com
Parent Application

Applicants

THALES
Tour Carpe Diem Place des Corolles Esplanade Nord 92400 COURBEVOIE

Inventors

1. THIBOUT, Paul
c/o Thales Optronique SAS 2, avenue Gay Lussac 78995 ELANCOURT CEDEX
2. MIDAVAINE, Thierry
c/o Thales Optronique SAS 2, avenue Gay Lussac 78995 ELANCOURT CEDEX
3. BLOOM, Guillaume
c/o Thales Optronique SAS 2, avenue Gay Lussac 78995 ELANCOURT CEDEX
4. COURCOL, Yves
c/o Thales Optronique SAS 2, avenue Gay Lussac 78995 ELANCOURT CEDEX
5. VERDY, Olivier
c/o Thales Optronique SAS 2, avenue Gay Lussac 78995 ELANCOURT CEDEX

Specification

DESCRIPTION

TITLE: Device for classifying a light source

The present invention relates to a device for classifying a light source. The invention also relates to an optronic system comprising such a classification device.

Optronic systems are conventionally equipped with light flux detection functions, in particular laser flux.

In particular, active optronic systems exploiting a laser emission are of three types: mono-static, bi-static and so-called “point-to-point” systems. Mono-static systems are optronic systems comprising a transmitter and a receiver integrated in the same system or on the same platform. Mono-static systems are, for example, laser range finders. Bi-static systems are optronic systems comprising two sub-assemblies: on the one hand, a transmitter and, on the other hand, a receiver integrated respectively in separate systems or platforms. Bi-static systems are, for example, laser spot detectors, laser pointer detectors or laser pointers or designators operating in bi-static mode. So-called “point-to-point” systems are optronic systems for which a laser directly illuminates the receiver. However, the laser can illuminate the receiver on the periphery of its emission lobe by delivering limited illumination. So-called “point-to-point” systems are, for example, laser warning detectors (abbreviated as DAL) or missile guidance systems (in English “beamriders”).

The detection of laser flux from such optronic systems is subject to numerous problems.

In particular, one of the problems for laser warning detectors consists in reducing the rate of false alarms while allowing precise angular localization of the emission source. Solar reflections (for example, sunlight glinting off leaves or urban structures) are one of the causes of false alarms. Indeed, from the point of view of the laser warning detector, certain solar reflections can have a temporal behavior close to the signal supplied by the laser transmitters. Laser emissions are therefore difficult to identify by algorithms with respect to solar reflections.

To reduce the rate of false alarms, it is therefore necessary to better distinguish the signals of solar origin from the signals of laser origin to be detected.

It is conventionally known to use laser warning detectors in the form of mono-detectors or in the form of four detectors, called four quadrants, or even in the form of strips of several non-multiplexed detectors, having the ability to measure the temporal shape of the received signal with a large bandwidth. The temporal forms of the laser emissions and of the solar flux being different, such detectors make it possible to more easily classify the illumination information as due to a conventional solar flux or to the laser emission of a rangefinder. Indeed, the pulse duration and repetition frequency of laser emitters are very deterministic, while solar reflections are very random in duration, often much slower, and have no repetition rate. Thus, an adapted algorithm makes it possible to differentiate them.
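The temporal criterion described above (short, regularly repeated pulses for lasers versus slow, aperiodic glints for solar reflections) can be sketched as follows. The function and its thresholds `max_width_s` and `max_jitter` are illustrative assumptions, not values from the patent.

```python
import statistics

def classify_pulse_train(pulse_times_s, pulse_widths_s,
                         max_width_s=1e-6, max_jitter=0.01):
    """Label a pulse train as laser-like or solar-glint-like.

    Laser emitters produce short pulses at a very regular repetition
    interval; solar reflections are slower and aperiodic.
    """
    if len(pulse_times_s) < 3:
        return "unknown"                      # too few pulses to judge
    if max(pulse_widths_s) > max_width_s:
        return "solar glint"                  # pulses too slow for a laser
    intervals = [b - a for a, b in zip(pulse_times_s, pulse_times_s[1:])]
    # Relative jitter of the repetition interval: near zero for lasers.
    jitter = statistics.pstdev(intervals) / statistics.mean(intervals)
    return "laser" if jitter < max_jitter else "solar glint"
```

A 10 Hz train of 10 ns pulses would be labeled "laser", while irregularly spaced or millisecond-long events would be labeled "solar glint".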

Nevertheless, such laser detectors do not make it possible to provide precise angular localization of the emission source, unlike a laser warning detector using a matrix sensor, which for its part generates a higher rate of false alarms. Indeed, such a laser warning detector with a matrix sensor cannot precisely capture the temporal form of the received signal because of its more limited bandwidth, which means that rapid solar reflections are likely to be classified as a laser signal.

There is therefore a need for a device for classifying a light source having a reduced rate of false alarms while allowing precise angular localization of the emission source.

To this end, the subject of the invention is a device for classifying a light source comprising:

- a sensor capable of receiving a light flux emitted by at least one light source, the light flux being received on the sensor in the form of a spot, the sensor comprising a plurality of pixels grouped together in sets, each set comprising at least one first pixel and a second pixel, adjacent to the first pixel, each first pixel being capable of generating a first signal relating to a first portion of luminous flux in a first spectral band received by said first pixel, each second pixel being capable of generating a second signal relating to a second portion of luminous flux in a second spectral band received by said second pixel, the second spectral band being different from the first spectral band,

- a unit for controlling the size and shape of the spot so that the spot extends over at least one set of pixels of the sensor, and

- a computer configured to compare the first and the second signal and to classify the emitting light source according to the result of the comparison.

According to other advantageous aspects of the invention, the classification device comprises one or more of the following characteristics, taken in isolation or in all technically possible combinations:

- the computer is configured to determine the direction of the emitting light source relative to the device according to the shape and position of the spot on the sensor;

- each set comprises at least a third pixel from among the plurality of pixels of the sensor, each third pixel being capable of generating a third signal relating to a third portion of luminous flux in a third spectral band received by said third pixel, the third spectral band being different from the first spectral band and the second spectral band, the computer being configured to compare the first, the second and the third signal to classify the emitting light source;

- the arrangement of the pixels of each set is predefined and is advantageously identical from one set to another;

- the first spectral band is a spectral band of interest and the second spectral band is a reference spectral band, each spectral band of interest being centered on a wavelength of interest, each reference spectral band being chosen from the group consisting of:

• a spectral band centered on a wavelength different from the or each wavelength of interest,

• a spectral band disjoint from the or each spectral band of interest, and

• a spectral band in which at least one spectral band of interest is strictly included,

the first pixel comprising a filter capable of transmitting only the first spectral band, the second pixel comprising a filter capable of transmitting only the second spectral band;

- the third spectral band is a spectral band of interest centered on a wavelength of interest different from that of the first spectral band, or a reference spectral band different from the second spectral band, the third pixel comprising a filter capable of transmitting only the third spectral band;

- at least one wavelength of interest is between 1.05 micrometers and 1.07 micrometers or between 1.50 micrometers and 1.70 micrometers, preferably between 1.55 micrometers and 1.65 micrometers;

- the first pixel comprises a red filter of a Bayer matrix or a green filter of a Bayer matrix, the second pixel comprising a blue filter of a Bayer matrix;

- the device comprises, upstream of the sensor, a pupillary filter, the first spectral band being the product of the spectrum of the pupillary filter and of the spectrum of the filter of the first pixel, the second spectral band being the product of the spectrum of the pupillary filter and of the spectrum of the filter of the second pixel, the pupillary filter preferably being a notch filter in a range of wavelengths comprised between 380 nanometers and 850 nanometers;

- the first spectral band and the second spectral band are separate and have wavelengths between 800 nanometers and 900 nanometers.

The invention also relates to an optronic system comprising a classification device as described previously.

Other characteristics and advantages of the invention will appear on reading the following description of embodiments of the invention, given by way of example only and with reference to the drawings which are:

- [Fig 1], Figure 1 is a schematic representation of a light source and an optronic system comprising a classification device,

- [Fig 2], Figure 2 is a graph illustrating the spectral profiles of three filters,

- [Fig 3], Figure 3 is a schematic representation of a sensor receiving a light flux in the form of a spot,

- [Fig 4], Figure 4 is a schematic representation of the response of different pixels receiving a solar flux,

- [Fig 5], Figure 5 is a schematic representation of the response of different pixels receiving a luminous flux in a first spectral band,

- [Fig 6], Figure 6 is a schematic representation of the response of different pixels receiving a light flux in a first spectral band, as well as a solar flux,

- [Fig 7], Figure 7 is a graph illustrating the spectral profiles of the filters of a Bayer matrix,

- [Fig 8], Figure 8 is a graph illustrating the spectral profile of a pupillary filter,

- [Fig 9], Figure 9 is a schematic representation of a set of pixels, and

- [Fig 10], Figure 10 is a schematic representation of another set of pixels.

General embodiment

In what follows, the general structure of a light source 11 and of an optronic system 12 comprising a classification device 14 is described with reference to FIG. 1. The classification device 14 will then be described in greater detail in the examples of the first, second and third embodiments.

The light source 11 is capable of emitting a luminous flux in different directions and in particular in the direction of the optronic system 12. The luminous flux emitted by the light source 11 has a spectral emission band B1.

The light source 11 is, for example, a monochromatic emitter such as a laser or a light-emitting diode (LED). In another example, the light source 11 is an electric discharge bulb in a low-pressure gas. In yet another example, the light source 11 is a broadband spectrum source, such as a flash lamp, the sun, the sky or the scene illuminated by the sky and the sun. In yet another example, the light source 11 comprises artificial lighting using incandescent lamps, fluorescent lamps, electric discharge bulbs in high-pressure gases or even LEDs associated with fluorescent compounds.

The optronic system 12 is, for example, an active optronic system, such as a mono-static system, a bi-static system or a so-called “point-to-point” system.

Mono-static systems are optronic systems comprising a transmitter and a receiver integrated in the same system or on the same platform. Mono-static systems are, for example, laser range finders.

Bi-static systems are optronic systems comprising two sub-assemblies: on the one hand, a transmitter and, on the other hand, a receiver integrated respectively in separate systems or platforms. The bi-static system subassemblies are, for example, laser spot detectors, laser pointer detectors or even laser pointers or laser designators operating in bi-static mode.

So-called "point-to-point" systems are optronic systems broken down into two sub-assemblies: a transmitter and a receiver in intervisibility, or even on or close to a line of sight, for which, for example, a laser directly illuminates the receiver. However, the laser can illuminate the receiver on the periphery of its emission lobe by delivering limited illumination. The sub-assemblies used in so-called "point-to-point" systems are, for example, laser warning detectors (abbreviated as DAL), missile guidance systems (in English "beamriders") or optical telecommunication devices in free space.

Advantageously, the optronic system 12 and the light source 11 evolve in an external environment on the same scene. A scene designates a theater of operations, that is to say the place where an action takes place. The scene is therefore an extended space with sufficient dimensions to allow an action to take place.

The optronic system 12 is, for example, intended to be integrated into a platform, such as an aircraft, a land vehicle or a ship.

The classification device 14 is configured to classify a light source emitting a luminous flux in the direction of the optronic system 12, such as the light source 11. Classifying a light source is understood to mean identifying the nature of the light source 11, which consists at least in identifying whether or not the light source 11 is a laser (discriminating a laser threat from the rest of the scene) and, if so, in giving where possible the main characteristics of this laser, such as its pulse duration and its repetition rate, in order to place it in a threat category (for example: rangefinder, pointer, designator or missile guidance system).

The classification device 14 comprises a sensor 20, a control unit 22 and a computer 24.

The sensor 20 is a matrix sensor, that is to say a sensor formed from a matrix of pixels.

The sensor 20 is able to receive a light flux emitted by the light source 11. The luminous flux is received on the sensor in the form of a spot T.

The sensor 20 comprises a plurality of pixels. Each pixel of the sensor 20 which receives a luminous flux is configured to detect either the luminous flux directly, or a variation in luminous flux, or a pulse, or an energy.

Sensor pixels are grouped into sets. Each set preferably comprises the same number of pixels.

Each set includes at least a first pixel P1 and a second pixel P2. The second pixel P2 is adjacent to the first pixel P1.

Each first pixel P1 is capable of generating a first signal S1 relating to a first portion of luminous flux received by the first pixel P1 in a first spectral band B1.

Each second pixel P2 is capable of generating a second signal S2 relating to a second portion of luminous flux received by said second pixel P2 in a second spectral band B2. The second spectral band B2 is different from the first spectral band B1.

The signal generated by each pixel is a signal representative of the number of photons per second (also called flux) or else preferably a variation of the number of photons per second or variation of the flux arriving at the pixel.

Each signal has at least one characteristic. The characteristic is, for example, the amplitude of the signal.

Advantageously, each set comprises at least one third pixel P3 among the plurality of pixels of the sensor 20. The third pixel P3 is adjacent to at least one of the first pixel P1 and the second pixel P2 of the set.

Each third pixel P3 is capable of generating a third signal S3 relating to a third portion of luminous flux received by said third pixel P3 in a third spectral band B3. The third spectral band B3 is different from the first spectral band B1 and from the second spectral band B2.

Advantageously, each set comprises a plurality of pixels, such as several first pixels P1 and/or several second pixels P2 and/or several third pixels P3 and/or pixels different from the first, second and third pixels P1, P2, P3. In this case, each pixel comprises a filter which determines the spectral band corresponding to the pixel.

The arrangement of pixels in each set is predefined. Advantageously, the arrangement of the pixels of each set is identical from one set to another. For example, the position of the different types of pixels on the sensor 20 is chosen so as to form a periodic pattern. Advantageously, the different types of pixels are arranged relative to each other according to an interlaced tiling. As a variant, the arrangement of the various pixels is pseudo-random.

For example, the positions of the first pixels on the sensor 20 are chosen so as to form a predefined pattern (for example, a staggered pattern) and the position of the other pixels, in particular of the second pixels, are the positions not occupied by the first pixels (in the example, the voids of the staggered pattern).
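The staggered arrangement described above can be illustrated as a checkerboard of pixel types. This is only one of many valid tilings, and the function name is hypothetical; it is a sketch, not the patent's method.

```python
def tile_sensor(rows, cols):
    """Assign filter types in a staggered (checkerboard) pattern:
    first pixels 'P1' on one parity, second pixels 'P2' filling
    the voids of the staggered pattern."""
    return [["P1" if (r + c) % 2 == 0 else "P2" for c in range(cols)]
            for r in range(rows)]
```

For the interlaced tilings with a third pixel type mentioned earlier, the parity rule would simply be replaced by a richer periodic pattern.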

The unit 22 for controlling the size and shape of the spot T is configured to control the size and the shape of the spot T forming on the sensor 20 from the luminous flux emitted by the light source 11, so that the spot T extends over at least one set of pixels of the sensor 20.

The control unit 22 is, for example, an optical device configured to distribute over several pixels (for example to defocus) the light flux received by the sensor 20. The control unit 22 comprises, for example, an optical lens, a mechanical element for adjusting the defocusing or else an optical diffuser upstream of the sensor 20.

The computer 24 is, for example, a processor. The computer 24 comprises, for example, a data processing unit, memories, an information carrier reader and a man/machine interface, such as a keyboard or a display.

In the example illustrated by FIG. 1, the computer 24 is carried by the optronic system 12. Alternatively, the computer 24 is remote from the optronic system 12 and is installed in an entity which is, for example, on the ground. This makes it possible to deport the processing carried out by the computer 24 outside the optronic system 12.

The computer 24 interacts with a computer program product which includes an information medium. The information medium is a medium readable by the computer 24, usually by the data processing unit of the computer 24. The readable information medium is a medium suitable for storing electronic instructions and capable of being coupled to a computer system bus. By way of example, the readable information medium is a floppy disk, an optical disk, a CD-ROM, a magneto-optical disk, a ROM memory, a RAM memory, an EPROM memory, an EEPROM memory, a magnetic card or an optical card. The computer program product, comprising program instructions, is stored on the information medium.

The computer program can be loaded onto the data processing unit and is adapted to cause the implementation of a classification method when the computer program is implemented on the processing unit of the computer 24.

In the following, the interactions between the light source 11 and the optronic system 12 are described, as well as the general operation of the classification device 14. The specific operation of the classification device 14 will be described in more detail later in the description, in the first, second and third embodiments.

Initially, the sensor 20 receives a luminous flux in the form of a spot T extending over at least one set of pixels of the sensor 20 as illustrated in FIG. 3. In FIG. 3, the spot T extends over a set of four pixels: a first pixel P1, two second pixels P2 and a third pixel P3.

In response, each pixel of the set receiving the luminous flux generates a signal.

The computer 24 then compares the signals generated by the pixels of the set, and in particular the characteristics of such signals, such as the amplitude.

More precisely, the computer 24 compares the characteristics of the first and second signals S1 and S2 and classifies the emitting light source 11 according to the result of the comparison.

When each set includes a third pixel P3, the computer 24 is also able to compare the characteristics of the first, the second and the third signal S1, S2, S3 to classify the light source 11.

More generally, the computer 24 compares the characteristics of the signals S1, S2, S3 generated by the adjacent pixels P1, P2 and P3 of each set to classify the emitting light source 11.

For example, the results obtained at the end of the comparison are compared with a database of results obtained for known light sources, which allows the classification of each detected light source.

The comparisons are, for example, carried out detection by detection, that is to say each time a flux is detected on the sensor 20. For example, a detection is made when, relative to the DC components of the flux received by the pixels P1, P2, P3 of a set, at least one of the pixels measures a positive variation of its absolute flux or of its relative flux greater than a threshold within a short period of time. By short period of time is meant a period of time of less than ten milliseconds.

As a variant, the comparisons are made after several detections, for example after a duration greater than the duration of a laser shot (of the order of 1 second at most). This facilitates the classification of the light source 11.
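The detection-by-detection rule above (at least one pixel of a set showing a positive variation of its absolute or relative flux above a threshold) might be expressed as follows; the function name and threshold parameters are assumptions for illustration.

```python
def set_detects(dc_flux, new_flux, abs_threshold, rel_threshold):
    """Return True when at least one pixel of the set shows a positive
    variation, in absolute flux or relative to its DC component,
    greater than the corresponding threshold."""
    for dc, new in zip(dc_flux, new_flux):
        delta = new - dc                      # positive variations only
        if delta > abs_threshold:
            return True
        if dc > 0 and delta / dc > rel_threshold:
            return True
    return False
```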

Examples of comparison will be described in more detail in the remainder of the description (first, second and third embodiments).

The computer 24 also determines the direction of the emission light source 11 relative to the device 14 (angular location) as a function of the shape and position of the spot T formed on the sensor 20.

For this, the computer 24 estimates, for example, the position of the center of the spot T formed by the luminous flux on the sensor 20. Indeed, the signatures of specific events will have the shape of a disc (or of a small square if a square pupil is used) because laser sources are considered to be far away.

For example, knowing the position of the center of the spot T, the computer 24 correlates the shape of the spot T with shapes of predetermined spots associated with a direction. The direction of the emission light source 11 is the direction of the predetermined spot shape for which the best correlation is obtained.
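A common way to estimate the center of the spot T, consistent with the description above though not spelled out in it, is an intensity-weighted centroid:

```python
def spot_centroid(image):
    """Estimate the center of the spot T as the intensity-weighted
    centroid of the pixel values; the angular direction of the source
    then follows from this position on the focal plane (the mapping
    from pixel position to angle is not shown here)."""
    total = sum_r = sum_c = 0.0
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            total += value
            sum_r += r * value
            sum_c += c * value
    if total == 0:
        return None                           # no flux received
    return (sum_r / total, sum_c / total)     # (row, column) of center
```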

Thus, due to its dual functionality (detection and localization), the classification device makes it possible to reduce the rate of false alarms while allowing precise angular localization of the emission source.

First embodiment

A first embodiment of the classification device 14 described in the general description will now be described with reference to FIGS. 2 to 6.

In this first embodiment, the classification device 14 is particularly suitable for detecting predetermined laser emissions, such emissions having narrow spectral bands centered on wavelengths of interest. A narrow spectral band is defined as a spectral band with a width of less than 100 nanometers (nm). The wavelength of interest is typically the central wavelength of the emission spectral band of the predetermined laser that it is desired to detect.

For example, the wavelength of interest belongs to band 1 infrared, also called SWIR (Short-Wave InfraRed), that is to say the range of wavelengths between 0.9 micrometers (μm) and 1.7 μm. More specifically, the wavelength of interest for a first laser family is between 1.06 μm and 1.07 μm and is advantageously equal to 1.064 μm. In another example, for a second family of lasers, the wavelength of interest is between 1.50 μm and 1.70 μm, preferably between 1.55 μm and 1.65 μm.

In this first embodiment, the first spectral band B1 is a laser spectral band of interest and the second spectral band B2 is a reference spectral band.

Each laser spectral band of interest is centered on a wavelength of interest. Advantageously, each spectral band of interest is a narrow spectral band.

Each reference spectral band (like B2) makes it possible to define an illumination reference for each set of pixels P1, P2, P3 by considering a respective illumination threshold beyond which a detection will be made.

Each reference spectral band (like B2) is chosen from the group consisting of:

- (i): a spectral band centered on a wavelength different from the or each wavelength of interest,

- (ii): a spectral band disjoint from the band or from each spectral band of interest, and

- (iii): a spectral band in which at least one spectral band of interest is strictly included.

Preferably, when the reference spectral band is of type (iii), the reference spectral band is a wide spectral band. A wide spectral band is defined as a spectral band with a width greater than or equal to 100 nm, or even greater than or equal to twice the width of the spectral band or bands of interest corresponding to a pixel of the same set. When the reference spectral band is of type (i) or (ii), the reference spectral band is a wide spectral band or a narrow spectral band.
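The narrow/wide distinction used throughout this embodiment reduces to a simple width test. The 100 nm cutoff comes from the text; the helper name is hypothetical.

```python
def band_type(low_nm, high_nm):
    """Classify a spectral band per the definitions above:
    narrow if its width is under 100 nm, wide otherwise."""
    return "narrow" if (high_nm - low_nm) < 100 else "wide"
```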

The first pixel P1 comprises a filter capable of transmitting only the first spectral band B1.

The second pixel P2 comprises a filter capable of transmitting only the second spectral band B2. The filter of the second pixel P2 is, for example, a high-pass filter.

When each set comprises a third pixel P3, the third spectral band B3 is a spectral band of interest centered on a wavelength of interest different from that of the first spectral band B1 (spectral band of interest), or is a reference spectral band different from the second spectral band B2 (reference spectral band). The third pixel P3 comprises a filter capable of transmitting only the third spectral band B3. Advantageously, when the third spectral band B3 is a spectral band of interest, the filter of the third pixel P3 is a narrow-band filter.

When each set comprises pixels with filters different from those of the first, second and third pixels P1, P2, P3, the spectral bands of the filters of said pixels are spectral bands of interest or reference spectral bands.

An example of the arrangement of the first, second and third pixels P1, P2, P3 is illustrated by FIG. 3. In this example, the second pixels P2 (reference pixels) are arranged according to a periodic pattern on the sensor 20, and the first pixels P1 (pixels of interest) and third pixels P3 (pixels of interest or reference) are arranged periodically in the spaces not occupied by the second pixels.

The operation specific to the first embodiment of the classification device 14 will now be described.

In the example illustrated by FIGS. 2 to 6, the first spectral band B1 is a spectral band centered on the wavelength 1.064 μm. The second spectral band B2 is a broadband spectral band comprising all the wavelengths from 1.020 μm to 1.7 μm (reference spectral band of type (iii)). The third spectral band B3 is a spectral band of interest centered on the wavelength 1.55 μm. Furthermore, in this example, each set comprises four pixels, including at least one first pixel P1, at least one second pixel P2 and at least one third pixel P3.

The sensor 20 receives a luminous flux in the form of a spot T spreading over at least one set of pixels of the sensor 20.

In response, each pixel of each set receiving the luminous flux detects either the luminous flux directly, or a variation in luminous flux, or a pulse, or an energy, and then generates a signal.

The computer 24 then compares the signals generated by each pixel of each set, in particular the characteristics of such signals, to classify the light source 11. For this, the computer 24 calculates ratios between the characteristics of the signals generated by the pixels of each set or by related pixels. The characteristics of the signals considered are, for example, the amplitudes of the signals. The computer 24 then compares the ratios obtained with at least one predetermined value to classify the light source 11 for emission.

In the example considered, the computer 24 compares the characteristics of the first, second and third signals S1, S2, S3.

For this, in this example, the computer 24 first calculates an adaptive threshold from the continuous flux, its fluctuation and the noise level of the sensor 20. The adaptive threshold is calculated on the basis of the signals generated by the pixels of the sensor 20 over a spatially and temporally sliding calculation area of N times N pixels of the sensor 20 (N being an integer strictly greater than 1). For example, the calculation area includes five times five, seven times seven, or nine times nine pixels. The continuous flux and its fluctuation make it possible to characterize the background noise of the scene (during the day) and the noise of the detector (at night).
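A minimal sketch of such an adaptive threshold over an N times N neighbourhood, assuming a mean-plus-k-standard-deviations form (the margin k and the function name are assumptions, not from the text):

```python
import statistics

def adaptive_threshold(frame, r, c, n=5, k=5.0):
    """Adaptive threshold for pixel (r, c): mean flux over the N x N
    neighbourhood plus k times its standard deviation, so that the
    fluctuation captures scene background (day) or detector noise
    (night). The window is clipped at the sensor edges."""
    half = n // 2
    window = [frame[i][j]
              for i in range(max(0, r - half), min(len(frame), r + half + 1))
              for j in range(max(0, c - half), min(len(frame[0]), c + half + 1))]
    return statistics.mean(window) + k * statistics.pstdev(window)
```

Sliding this computation temporally as well, as the text describes, would amount to also averaging over a short history of frames.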

In this same example, the computer 24 then identifies the pixels of the sensor 20 corresponding to a spectral band of interest (here the first and third pixels P1, P3) receiving a luminous flux greater than or equal to the adaptive threshold. The fluxes of the pixels identified and corresponding to the same spectral band of interest are then correlated by group of M times M pixels (M being an integer strictly greater than 1). For example, M is two or three. The size of the spot T received on the sensor 20 encompasses in this case at least M times M collocated pixels.

In the area of M times M collocated pixels receiving a flux, the computer 24 compares the characteristics of the signals generated by each pixel, or the average values of such characteristics (in the case where several pixels are detected, by normalizing with respect to the illuminated surface).

For example, when the characteristic is an amplitude, the computer 24 calculates the ratios between the amplitudes of the signals generated by the first, second and third pixels P1, P2, P3, which amounts to calculating the ratios between the first, second and third spectral bands B1, B2, B3. In the example illustrated by FIGS. 2 to 6, the calculated ratios are, on the one hand, the ratio B1/B2 between the first spectral band B1 (of interest) and the second spectral band B2 (of reference) and, on the other hand, the ratio B3/B2 between the third spectral band B3 (of interest) and the second spectral band B2 (of reference).

These ratios are then compared with predetermined values, which makes it possible to determine whether or not the emission source is a laser source centered on the wavelength of interest. Advantageously, the estimation of the equivalent temperature of the light source 11 makes it possible to classify the source more precisely. For example, equivalent temperatures of the order of 5800 Kelvin (K) will make it possible to reject a solar reflection. Temperatures below 2000 K will make it possible to classify muzzle flashes or the propulsion of rockets or missiles.

By way of illustration, FIGS. 4 to 6 illustrate the responses of the pixels of the sensor 20 for different types of light flux received.

When the luminous flux received by the sensor 20 is a solar flux (configuration 1), the response of the pixels is illustrated by FIG. 4. The first pixel P1 receives an illumination corresponding to the portion of solar flux in the first spectral band B1. The second pixel P2 receives an illumination corresponding to the portion of solar flux in the second spectral band B2. The second spectral band B2 being a wide band, the second pixel P2 therefore receives greater illumination than the first pixel P1. The third pixel P3 receives an illumination corresponding to the portion of solar flux in the third spectral band B3. The third spectral band B3 being a narrow band, the third pixel P3 therefore receives less illumination than the second pixel P2. The ratios B1/B2 and B3/B2 are known because this amounts to performing the ratios of the spectral bands of the pixels. Typically, for a defocusing over three pixels, the B1/B2 ratio is substantially equal to 1/9 and the B3/B2 ratio is substantially equal to 3/9.

When the light flux received by the sensor 20 is only a laser flux corresponding to the wavelength of interest of the first spectral band B1, therefore without parasitic solar reflections (configuration 2), the response of the pixels is illustrated by FIG. 5. The first pixel P1 and the second pixel P2 receive the same illumination and therefore have an identical response. The third pixel P3 receives almost zero illumination (within measurement noise) insofar as the first spectral band B1 and the third spectral band B3 are disjoint. Thus, the ratio B1/B2 is substantially equal to 1 and the ratio B3/B2 decreases with respect to configuration 1.

When the light flux received by the sensor 20 is a laser flux corresponding to the wavelength of interest of the first spectral band B1 and a solar flux is also received on the sensor 20 (configuration 3), the response of the pixels is illustrated in FIG. 6. The first pixel P1 receives an illumination corresponding to the laser flux in the first spectral band B1 and to the portion of solar reflections in the first spectral band B1. The second pixel P2 receives an illumination corresponding to the laser flux in the first spectral band B1 and to the portion of solar reflections in the second spectral band B2. The second spectral band B2 being a wide band, the second pixel P2 therefore receives greater illumination than the first pixel P1. The third pixel P3 receives an illumination corresponding to the portion of solar reflections in the third spectral band B3. The third spectral band B3 being a narrow band, the third pixel P3 therefore receives less illumination than the first pixel P1. Thus, the B1/B2 ratio increases with respect to configuration 1 and the B3/B2 ratio remains substantially identical to that of configuration 1.
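By way of purely illustrative sketch, the ratio comparison distinguishing configurations 1 to 3 could be expressed as follows. The tolerance value and the returned labels are assumptions made for illustration; they are not taken from the description.

```python
# Illustrative sketch of the ratio comparison performed by the computer 24.
# Thresholds and labels are assumed, not taken from the description.

SOLAR_B1_B2 = 1.0 / 9.0   # expected B1/B2 ratio for a pure solar flux (FIG. 4)
SOLAR_B3_B2 = 3.0 / 9.0   # expected B3/B2 ratio for a pure solar flux
TOLERANCE = 0.05          # assumed measurement tolerance

def classify(a1, a2, a3):
    """Classify the source from the amplitudes a1, a2, a3 of pixels P1, P2, P3."""
    r12 = a1 / a2
    r32 = a3 / a2
    solar_like_b3 = abs(r32 - SOLAR_B3_B2) < TOLERANCE
    # Configuration 1: both ratios match the known band-width ratios.
    if abs(r12 - SOLAR_B1_B2) < TOLERANCE and solar_like_b3:
        return "solar reflection"
    # Configuration 2: B1/B2 close to 1, B3/B2 collapses (disjoint bands).
    if abs(r12 - 1.0) < TOLERANCE and r32 < SOLAR_B3_B2 - TOLERANCE:
        return "laser (band of interest)"
    # Configuration 3: B1/B2 increases, B3/B2 stays at its solar value.
    if r12 > SOLAR_B1_B2 + TOLERANCE and solar_like_b3:
        return "laser + solar background"
    return "unclassified"
```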

In another example, the third spectral band B3 is a narrow band disjoint from the first and second bands and the compared ratios are modified accordingly.

Thus, the classification device 14 according to the first embodiment makes it possible, by comparing ratios, to distinguish laser emissions (in particular in band 1 or SWIR) from solar reflections, which makes it possible to reduce the rate of false alarms. In particular, in the case of a matrix laser warning detector, the false alarm rate is reduced for the detection of laser fluxes emitted, for example, by multi-pulse range finders, laser designators, or illuminators for imagers, whether active or not.

Since the sensor 20 of the classification device 14 is a matrix sensor, precise angular location of the emission source is also possible. Furthermore, such a matrix sensor makes it possible to perform two functions: a detection function (allowing the emission source to be classified) and an imaging function.

In addition, the pixels corresponding to the reference spectral bands make it possible to define a local illumination reference of the scene. In particular, during the day, the average solar flux reflected by the scene is taken into account.

In addition, the exploitation of the fluxes collected by adjacent pixels makes it possible, beyond the classification of monospectral sources such as the sought lasers (with bands B1 and B3), to make measurements relating to the fluxes collected in each spectral band in order to estimate their equivalent temperature, and thus to classify reflections of solar origin and sources of pyrotechnic origin (muzzle flashes, rocket or missile propulsion, pyrotechnic decoys), or even lamps, headlights or beacons.
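A minimal sketch of such an equivalent-temperature estimation, assuming a blackbody (Planck) model and a simple bisection inversion of the two-band flux ratio; the band limits, integration step and search interval are illustrative assumptions, not values from the description.

```python
# Illustrative sketch: estimate an equivalent blackbody temperature from the
# ratio of fluxes collected in two spectral bands (band_a shorter than band_b).
import math

def planck(wl_m, temp_k):
    """Blackbody spectral radiance at wavelength wl_m (m) and temperature temp_k (K)."""
    h, c, k = 6.626e-34, 3.0e8, 1.381e-23
    return (2 * h * c**2 / wl_m**5) / (math.exp(h * c / (wl_m * k * temp_k)) - 1)

def band_flux(lo_nm, hi_nm, temp_k, steps=200):
    """Crude rectangle-rule integration of the Planck law over a band (nm)."""
    dw = (hi_nm - lo_nm) / steps
    return sum(planck((lo_nm + (i + 0.5) * dw) * 1e-9, temp_k)
               for i in range(steps)) * dw

def equivalent_temperature(ratio, band_a, band_b):
    """Bisect on T until flux(band_a)/flux(band_b) matches the measured ratio."""
    lo, hi = 500.0, 10000.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        r = band_flux(*band_a, mid) / band_flux(*band_b, mid)
        # With band_a at shorter wavelengths, this ratio grows with temperature.
        if r < ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Sources whose estimated temperature is of the order of 5800 K can then be classified as solar reflections, and those below 2000 K as pyrotechnic sources, in line with the thresholds mentioned above.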

Thus, the classification device 14 according to the first embodiment makes it possible to reduce the rate of false alarms while allowing precise angular location of the emission source.

Second embodiment

A second embodiment of the classification device 14 described in the general description will now be described with reference to FIGS. 7 to 9.

In the second embodiment, the classification device 14 is particularly suitable for differentiating a laser pointer emitting in the band of wavelengths between 800 nm and 900 nm, from another light source of the urban lighting type emitting at least in the visible (380 nm to 780 nm) and possibly in the near infrared.

In the second embodiment, the first pixel P1 comprises a red filter of a Bayer matrix or a green filter of a Bayer matrix, and the second pixel P2 comprises a blue filter of a Bayer matrix. A Bayer matrix (also called a Bayer filter or Bayer mosaic) is made up of 50% green filters, 25% red filters and 25% blue filters, so as to mimic the physiology of the human eye. The spectra of the filters of a Bayer matrix are illustrated in FIG. 7. In this figure, curve R is the spectrum of the red filter of the Bayer matrix, curve V is the spectrum of the green filter of the Bayer matrix, and curve B is the spectrum of the blue filter of the Bayer matrix.

Advantageously, the device 14 comprises, upstream of the sensor 20, a pupillary filter. The pupillary filter is preferably a notch filter in a range of wavelengths between 380 nanometers and 850 nanometers. An example of a pupillary filter spectrum is shown in FIG. 8.

The first spectral band B1 is the product of the spectrum of the pupillary filter (if applicable) and of the spectrum of the filter of the first pixel P1. The second spectral band B2 is the product of the spectrum of the pupillary filter (if present) and of the spectrum of the filter of the second pixel P2.
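A minimal sketch of this wavelength-by-wavelength product of transmissions; the sampled transmission values and the simple dictionary representation are invented for illustration.

```python
# Illustrative sketch: an effective spectral band obtained as the pointwise
# product of the pupillary-filter transmission and the pixel-filter
# transmission. Sample transmission values below are invented.

def effective_band(pupil, pixel_filter):
    """Pointwise product of two transmission spectra sampled on the same
    wavelength grid (nm -> transmission in [0, 1])."""
    return {wl: pupil[wl] * pixel_filter[wl] for wl in pupil}

# Assumed notch-type pupillary filter: blocks 380-850 nm, passes beyond.
pupil = {400: 0.0, 600: 0.0, 840: 0.0, 860: 0.9, 880: 0.9}
# Invented red-filter transmission curve.
red = {400: 0.0, 600: 0.1, 840: 0.8, 860: 0.8, 880: 0.7}

b1 = effective_band(pupil, red)  # first spectral band B1 (sketch)
```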

When the set includes a third pixel P3, the third pixel P3 comprises the other of the red or green filter of the Bayer matrix, or a second green filter. The third spectral band B3 is the product of the spectrum of the pupillary filter (if present) and of the spectrum of the filter of the third pixel P3.

When the set includes a fourth pixel P4, the fourth pixel P4 comprises the remaining filter of the Bayer matrix. Advantageously, a set can comprise more than four Bayer-matrix pixels. In this case, the spectral band of each pixel is the product of the spectrum of the pupillary filter (if present) and the spectrum of the filter of that pixel.

In the example illustrated in FIG. 9, each set comprises a first pixel P1 with a red filter, a second pixel P2 with a blue filter, a third pixel P3 with a green filter and a fourth pixel P4 with a green filter.

The operation specific to the second embodiment of the classification device 14 will now be described.

The sensor 20 receives a luminous flux in the form of a spot T spreading over at least one set of pixels of the sensor 20.

In response, each pixel of each set receiving the light flux generates a signal.

The computer 24 then compares the characteristics of the signals generated by the pixels of each set to classify the light source 11.

For example, when the characteristic considered is the amplitude, a laser pointer emitting in the wavelength band between 800 nm and 900 nm will produce an equivalent amplitude for each of the signals generated by the four pixels of the Bayer matrix. On the other hand, for a "white-bluish" lamp post, the signal generated by the blue pixel will have a higher amplitude than those of the other pixels. This difference therefore makes it possible to distinguish a laser pointer from a broadband ("white-bluish") lamp post, which generally emits first in the visible and can extend into the near infrared, at least in the 800 nm - 900 nm wavelength band, and thus to classify the emission light source.
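A hedged sketch of this amplitude comparison for one RGGB set of the Bayer matrix; the dominance margin of 1.3 and the returned labels are assumed values chosen for illustration, not taken from the description.

```python
# Illustrative sketch: a NIR laser pointer excites the four Bayer pixels
# roughly equally, while a "white-bluish" lamp dominates on the blue pixel.
# The margin value is an assumption.

def classify_bayer(red, green1, blue, green2, margin=1.3):
    """Classify from the signal amplitudes of one RGGB set."""
    others = (red + green1 + green2) / 3.0
    if blue > margin * others:
        # Blue channel dominates: broadband "white-bluish" source.
        return "lamp post"
    amps = (red, green1, blue, green2)
    if max(amps) < margin * min(amps):
        # Near-equal response of all four pixels.
        return "laser pointer"
    return "unclassified"
```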

The pupillary filter improves the distinction between a lamp post and a laser pointer. Indeed, without a pupillary filter, a white street lamp will produce a level on the pixels almost equivalent to that of a laser pointer.

Thus, the classification device 14 according to the second embodiment makes it possible to better distinguish a flux emitted by a laser pointer from a flux emitted for example by a street lamp. This reduces the false alarm rate.

Since the sensor 20 of the classification device 14 is a matrix sensor, precise angular location of the emission source is also possible. Furthermore, such a matrix sensor makes it possible to perform two functions: a detection function (allowing the emission source to be classified) and an imaging function.

Thus, the classification device 14 according to the second embodiment makes it possible to reduce the rate of false alarms while allowing precise angular location of the emission source.

Third embodiment

A third embodiment of the classification device 14 described in the general description will now be described with reference to FIG. 10.

In the third embodiment, the classification device 14 is particularly suitable for differentiating a laser pointer emitting in the band of wavelengths comprised between 800 nm and 900 nm from a wideband lamp post generally emitting first in the visible and capable of emitting in the near infrared at least in the 800 nm - 900 nm band.

In the third embodiment, the first spectral band B1 and the second spectral band B2 are separate and have wavelengths between 800 nanometers and 900 nanometers.

The first pixel P1 comprises a filter capable of transmitting only the first spectral band B1. The second pixel P2 comprises a filter capable of transmitting only the second spectral band B2.

When each set includes a third pixel P3, the third spectral band B3 is separate from the first and second spectral bands B1, B2 and has wavelengths between 800 nanometers and 900 nanometers. The third pixel P3 comprises a filter capable of transmitting only the third spectral band B3.

When each set comprises more than three pixels, the spectral band of each pixel is separate from the first, second and third spectral bands B1, B2, B3 and comprises wavelengths comprised between 800 nanometers and 900 nanometers. Each pixel comprises a filter capable of transmitting only the spectral band of the pixel.

In the example illustrated by FIG. 10, each set comprises four distinct pixels: a first pixel P1 corresponding to the range of wavelengths between 800 nm and 825 nm, a second pixel P2 corresponding to the range of wavelengths between 825 nm and 850 nm, a third pixel P3 corresponding to the range of wavelengths between 850 nm and 875 nm and a fourth pixel P4 corresponding to the range of wavelengths between 875 nm and 900 nm.

The operation specific to the third embodiment of the classification device 14 will now be described.

The sensor 20 receives a luminous flux in the form of a spot T spreading over at least one set of pixels of the sensor 20.

In response, each pixel of each set receiving the light flux generates a signal.

The computer 24 then compares the signals generated by the pixels of the set, in particular the characteristics of such signals, to classify the light source 11.

For example, when the characteristic under consideration is amplitude, a laser pointer emitting in the band of wavelengths between 800 nm and 900 nm will have a higher amplitude for the signal generated by one of the pixels because the pointer is a laser emitting in a narrow band. On the other hand, a street lamp will have an equivalent amplitude for each of the signals generated by the pixels of the set because such a street lamp emits over a wide band including the 800 nm-900 nm band. This difference therefore makes it possible to distinguish a laser pointer from a street lamp, and thus to classify the emission light source.
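This narrow-band versus broadband criterion can be sketched as follows; the dominance factor of 3 and the labels are assumptions for illustration, not values from the description.

```python
# Illustrative sketch: a laser concentrates its energy in one 25 nm sub-band,
# so one pixel of the set dominates; a broadband street lamp spreads roughly
# equal energy over all four sub-bands. The dominance factor is assumed.

def classify_narrowband(amplitudes, dominance=3.0):
    """amplitudes: signals of P1..P4 (800-825, 825-850, 850-875, 875-900 nm)."""
    ordered = sorted(amplitudes, reverse=True)
    peak, runner_up = ordered[0], ordered[1]
    if peak > dominance * max(runner_up, 1e-12):
        # One sub-band dominates: narrow-band (laser) emission.
        return "laser pointer"
    # Comparable amplitudes across sub-bands: broadband emission.
    return "street lamp"
```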

Thus, the classification device 14 according to the third embodiment makes it possible to better distinguish a flux emitted by a laser pointer from the flux emitted by a street lamp. This reduces the false alarm rate.

Since the sensor 20 of the classification device 14 is a matrix sensor, precise angular location of the emission source is also possible. Furthermore, such a matrix sensor makes it possible to perform two functions: a detection function (allowing the emission source to be classified) and an imaging function.

Thus, the classification device 14 according to the third embodiment makes it possible to reduce the rate of false alarms while allowing precise angular location of the emission source. It also makes it possible to classify the broadband sources detected by their spectral classification.

Those skilled in the art will understand that the embodiments described above can be combined to form new embodiments provided that they are technically compatible.

For example, similarly to the second embodiment, a pupillary filter could be added upstream of the sensors of the first and third embodiments.

In another exemplary embodiment, the sensor could comprise, for each set, an interlacing of pixels according to the first, and/or the second and/or the third embodiment.

CLAIMS

1. Classification device (14) of a light source (11), comprising:

- a sensor (20) capable of receiving a luminous flux emitted by at least one light source (11), the luminous flux being received on the sensor (20) in the form of a spot (T), the sensor (20) comprising a plurality of pixels (P1, P2, P3, P4) grouped into sets, each set comprising at least a first pixel (P1) and a second pixel (P2) adjacent to the first pixel (P1),

each first pixel (P1) being capable of generating a first signal (S1) relating to a first portion of luminous flux in a first spectral band (B1) received by said first pixel (P1),

each second pixel (P2) being capable of generating a second signal (S2) relating to a second portion of luminous flux in a second spectral band (B2) received by said second pixel (P2), the second spectral band (B2) being different from the first spectral band (B1),

- a unit (22) for controlling the size and shape of the spot (T) so that the spot (T) extends over at least one set of pixels (P1, P2, P3, P4) of the sensor (20), and

- a computer (24) configured to compare the first and the second signal (S1, S2) and to classify the emission light source (11) according to the result of the comparison.

2. Device (14) according to claim 1, wherein the computer (24) is configured to determine the direction of the emission light source (11) relative to the device (14) according to the shape and the position of the spot (T) on the sensor (20).

3. Device (14) according to claim 1 or 2, wherein each set comprises at least one third pixel (P3) among the plurality of pixels (P1, P2, P3, P4) of the sensor (20), each third pixel (P3) being capable of generating a third signal (S3) relating to a third portion of luminous flux in a third spectral band (B3) received by said third pixel (P3), the third spectral band (B3) being different from the first spectral band (B1) and the second spectral band (B2), the computer (24) being configured to compare the first, the second and the third signal (S1, S2, S3) to classify the emission light source (11).

4. Device (14) according to claim 3, wherein the arrangement of the pixels (P1, P2, P3, P4) of each set is predefined and is advantageously identical from one set to another.

5. Device (14) according to any one of claims 1 to 4, wherein the first spectral band (B1) is a spectral band of interest and the second spectral band (B2) is a reference spectral band, each spectral band of interest being centered on a wavelength of interest, each reference spectral band being chosen from the group consisting of:

- a spectral band centered on a wavelength different from the or each wavelength of interest,

- a spectral band disjoint from the or each spectral band of interest, and

- a spectral band in which at least one spectral band of interest is strictly included,

the first pixel (P1) comprising a filter capable of transmitting only the first spectral band (B1), the second pixel (P2) comprising a filter capable of transmitting only the second spectral band (B2).

6. Device (14) according to claim 5 in its dependence on claim 3 or 4, in which the third spectral band (B3) is a spectral band of interest centered on a wavelength of interest different from that of the first spectral band (B1), or a reference spectral band different from the second spectral band (B2), the third pixel (P3) comprising a filter capable of transmitting only the third spectral band (B3).

7. Device (14) according to claim 5 or 6, wherein at least one wavelength of interest is between 1.05 micrometers and 1.07 micrometers or between 1.50 micrometers and 1.70 micrometers, preferably between 1.55 micrometers and 1.65 micrometers.

8. Device (14) according to any one of claims 1 to 4, wherein the first pixel (P1) comprises a red filter of a Bayer matrix or a green filter of a Bayer matrix, the second pixel (P2) comprising a blue filter of a Bayer matrix.

9. Device (14) according to claim 8, wherein the device (14) comprises, upstream of the sensor (20), a pupillary filter, the first spectral band (B1) being the product of the spectrum of the pupillary filter and the spectrum of the filter of the first pixel (P1), the second spectral band (B2) being the product of the spectrum of the pupillary filter and of the spectrum of the filter of the second pixel (P2), the pupillary filter preferably being a notch filter in a range of wavelengths between 380 nanometers and 850 nanometers.

10. Device (14) according to any one of claims 1 to 4, wherein the first spectral band (B1) and the second spectral band (B2) are separate and have wavelengths between 800 nanometers and 900 nanometers.

11. Optronic system (12) comprising a classification device (14) according to any one of claims 1 to 10.

Documents

Application Documents

# Name Date
1 202217002005-ABSTRACT [08-11-2024(online)].pdf 2024-11-08
1 202217002005.pdf 2022-01-13
2 202217002005-CLAIMS [08-11-2024(online)].pdf 2024-11-08
2 202217002005-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [13-01-2022(online)].pdf 2022-01-13
3 202217002005-STATEMENT OF UNDERTAKING (FORM 3) [13-01-2022(online)].pdf 2022-01-13
3 202217002005-COMPLETE SPECIFICATION [08-11-2024(online)].pdf 2024-11-08
4 202217002005-PRIORITY DOCUMENTS [13-01-2022(online)].pdf 2022-01-13
4 202217002005-DRAWING [08-11-2024(online)].pdf 2024-11-08
5 202217002005-POWER OF AUTHORITY [13-01-2022(online)].pdf 2022-01-13
5 202217002005-FER_SER_REPLY [08-11-2024(online)].pdf 2024-11-08
6 202217002005-FORM-26 [08-11-2024(online)].pdf 2024-11-08
6 202217002005-FORM 1 [13-01-2022(online)].pdf 2022-01-13
7 202217002005-OTHERS [08-11-2024(online)].pdf 2024-11-08
7 202217002005-DRAWINGS [13-01-2022(online)].pdf 2022-01-13
8 202217002005-FORM 3 [03-07-2024(online)].pdf 2024-07-03
8 202217002005-DECLARATION OF INVENTORSHIP (FORM 5) [13-01-2022(online)].pdf 2022-01-13
9 202217002005-COMPLETE SPECIFICATION [13-01-2022(online)].pdf 2022-01-13
9 202217002005-Correspondence-030624.pdf 2024-06-13
10 202217002005-Others-030624.pdf 2024-06-13
10 202217002005-Proof of Right [02-06-2022(online)].pdf 2022-06-02
11 202217002005-FORM 3 [02-06-2022(online)].pdf 2022-06-02
11 202217002005-Proof of Right [28-05-2024(online)].pdf 2024-05-28
12 202217002005-FER.pdf 2024-05-13
12 202217002005-FORM 18 [01-06-2023(online)].pdf 2023-06-01
13 202217002005-PETITION UNDER RULE 137 [28-03-2024(online)].pdf 2024-03-28
13 202217002005-Proof of Right [28-03-2024(online)].pdf 2024-03-28

Search Strategy

1 202217002005SearchstratgyE_08-05-2024.pdf