
Device And Method For Locating A Target

Abstract: The invention concerns a device (1) for locating a target, comprising: a camera (2) that can be oriented both into an orientation in which the target is in view, so that the camera acquires an image of the target, and into an orientation in which a star is in view, so that the camera acquires at least one image of the star; an inertial unit (4) configured to calculate position and orientation data of the camera (2); a resetting module (6) configured to apply a stellar resetting to the data on the basis of the image of the star, in order to produce reset data; a location module (8) configured to estimate a position of the target (T) from the image of the target (T) and the reset data; and a communication interface for communicating with an operator station, the camera (2) passing from one orientation to the other in response to the reception, by the interface, of a command sent by the operator station.


Patent Information

Application #
201917027330
Filing Date
08 July 2019
Publication Number
36/2019
Publication Type
INA
Invention Field
PHYSICS
Status
Email
iprdel@lakshmisri.com
Parent Application
Patent Number
Legal Status
Granted
Grant Date
2023-12-15
Renewal Date

Applicants

SAFRAN ELECTRONICS & DEFENSE
18/20 Quai du Point du Jour 92100 BOULOGNE-BILLANCOURT

Inventors

1. ROBERT, Emmanuel
C/o Safran Electronics & Defense 18/20 Quai du Point du Jour 92100 BOULOGNE - BILLANCOURT
2. ROLAND, Flavien
C/o Safran Electronics & Defense 18/20 Quai du Point du Jour 92100 BOULOGNE - BILLANCOURT
3. DAVENEL, Arnaud
C/o Safran Electronics & Defense 18/20 Quai du Point du Jour 92100 BOULOGNE - BILLANCOURT
4. DELEAUX, Benjamin
C/o Safran Electronics & Defense 18/20 Quai du Point du Jour 92100 BOULOGNE - BILLANCOURT
5. ROBERFROID, David
C/o Safran Electronics & Defense 18/20 Quai du Point du Jour 92100 BOULOGNE-BILLANCOURT
6. REYMOND, Georges-Olivier
C/o Safran Electronics & Defense 18/20 Quai du Point du Jour 92100 BOULOGNE - BILLANCOURT

Specification

The present invention relates to a device for locating a target, intended to be carried on board a carrier and implementing a stellar registration.

STATE OF THE ART

The prior art discloses a device for locating a target intended to be carried on board an aircraft, the device comprising:

• a mobile camera configured to be oriented towards a target,

• an inertial measurement unit configured to calculate position and/or orientation data of the camera,

• a positioning module configured to estimate a position of the target using data supplied by the inertial unit.

The device is usually mounted on a wall of the aircraft, so as to allow the localization of targets on the ground when the aircraft is in flight.

However, the data supplied by the inertial unit can be tainted by drift, so that the position estimated by the positioning module may be far removed from the actual target position.

To correct such errors, it is known to carry a stellar viewfinder in the aircraft. The stellar viewfinder is mounted on an upper wall of the aircraft, so as to face the sky.

The stellar viewfinder includes a camera that acquires at least one image of a star whose position is predetermined, and a module configured to apply to the data supplied by the inertial unit a treatment called in the literature "stellar registration" or "stellar resetting". The images acquired by the stellar viewfinder's camera can reveal a gap between the supposed position of the star and its actual position. During stellar resetting, this deviation is used to correct drift errors in the data calculated by the inertial unit. This correction is effective because the star is a reliable reference point.

However, a carrier is subject to mechanical deformation, so that the relative position of the camera used to observe the target with respect to the stellar viewfinder may vary in an unpredictable way, diminishing the effectiveness of the stellar registration.

Furthermore, the stellar viewfinder is a relatively bulky device, which increases the bulk of the carrier; this is particularly detrimental when the carrier is a lightweight aircraft such as a drone.

Thus, EP 1,440,329 B1 proposes a method for locating a target using a moving camera.

DISCLOSURE OF INVENTION

An object of the invention is to improve the localization performance of a locating device to be carried on a carrier, without increasing the bulk of the carrier.

To this end, according to a first aspect of the invention, a device is proposed for locating a target, to be carried on a mobile carrier, the device comprising:

• a camera steerable relative to the carrier into:

o a first orientation in which the target is in view of the camera, so that the camera acquires an image of the target, and

o a second orientation in which at least one predetermined star is in view of the camera, so that the camera acquires at least one image of the star,

• an inertial measurement unit configured to calculate the position and orientation data of the camera,

• a stellar registration module configured to apply a stellar registration to the data calculated by the inertial unit on the basis of the image of the star, so as to produce readjusted position and orientation data,

• a positioning module configured to estimate a position of the target from the image of the target and the readjusted data,

• a communication interface with an operator station, the camera further being configured to move from one of the first and second orientations to the other in response to the reception, by the communication interface, of a command issued by the operator station.

In the proposed device, the same camera is used both to observe the target to be located and to acquire the image or images showing at least one predetermined star used for the implementation of the stellar registration. Since this saves an additional camera, the overall bulk of the locating device is reduced.

In addition, the location performance of the device is not degraded by mechanical deformations of the carrier.

The device according to the first aspect of the invention can be supplemented with the following characteristics, taken alone or in combination where technically possible.

The locating device may comprise an uncertainty estimation module configured to estimate an uncertainty on an error that may affect the accuracy of the position estimated by the locating module, the camera being configured to move from the first orientation to the second orientation when the uncertainty exceeds a first predetermined threshold.

The uncertainty can be an uncertainty on a position error vitiating the position of the target estimated by the location module.

The uncertainty can alternatively be an uncertainty on a camera heading error vitiating the heading data calculated by the inertial unit.

The first threshold may be less than or equal to 0.3 milliradians.

The camera can be configured to move from the second orientation to the first orientation when the uncertainty drops below a second predetermined threshold.

The second threshold may be less than or equal to the first threshold.

The camera can be configured to acquire the image of the predetermined star in an infrared acquisition mode in which the camera is sensitive to infrared wavelengths.

It is further proposed, according to a second aspect of the invention, an aircraft, such as a drone, comprising a target locating device according to the first aspect of the invention.

It is further proposed, according to a third aspect of the invention, a method of locating a target, comprising the steps of:

• orienting a camera, movably mounted on a mobile carrier, into a first orientation in which the target is in view of the camera,

• calculating, by an inertial unit, position and orientation data of the camera,

• acquiring, by the camera, at least one image of the target,

• orienting the camera into a second orientation relative to the carrier in which a predetermined star is in view of the camera,

• acquiring at least one image of the star by the camera,

• applying a stellar registration to the data calculated by the inertial unit on the basis of the image of the star, so as to produce readjusted position and orientation data,

• estimating a position of the target from the image of the target and the readjusted data,

wherein the camera moves from one of the first and second orientations to the other in response to receiving a command from an operator station.

DESCRIPTION OF FIGURES

Other features, objects and advantages of the invention will become apparent from the following description, which is purely illustrative and not exhaustive, and should be read in conjunction with the accompanying drawings, in which:

• Figure 1 is a view, in a vertical plane relative to the ground, of a target, a carrier carrying a target locating device, and a star.

• Figure 2 is a schematic view showing the internal components of the locating device, according to one embodiment of the invention.

• Figure 3 is a view, in a horizontal plane relative to the ground, of the carrier and target already shown in Figure 1.

• Figure 4 is a flowchart of steps of a method of locating a target according to one embodiment of the invention.

In all the figures, similar elements bear identical references.

DETAILED DESCRIPTION OF THE INVENTION

A target locating device

Referring to Figure 1, a mobile carrier such as an aircraft A comprises a locating device 1. Figure 1 also shows a target T and a star S.

The aircraft A is a drone, helicopter, airplane, etc. In Figure 1, the aircraft is a helicopter.

Referring to Figure 2, the locating device 1 comprises a camera 2, an inertial unit 4, a stellar registration module 6 and a tracking unit 8 of target T.

The locating device 1 is mounted on a bottom wall P of the aircraft A, that is to say a wall intended to face the ground when the aircraft A is airborne.

Alternatively, the locating device 1 may be mounted on another wall of the aircraft A, for example a top wall of the aircraft A, that is to say a wall intended to be facing the sky when the aircraft A is airborne.

The locating device 1 further comprises a housing 10 mounted rotatably on the wall of the aircraft A, for example via a ball joint 12.

The camera 2, the inertial unit 4, the registration module 6 and the tracking unit 8 are accommodated in the housing 10 and are, for example, stationary with respect to it.

In particular, the inertial unit 4 is preferably secured to the camera 2.

The camera 2 is movable between several orientations relative to the carrier A.

The camera 2 is firstly capable of being oriented towards the ground.

The camera 2 can also be oriented toward the sky. Preferably, the camera 2 is able to take an orientation in which its optical axis has a maximum elevation of 30 degrees (that is to say, the optical axis of the camera 2 forms a positive angle of 30 degrees relative to a horizontal plane parallel to the ground, and does not point up toward the zenith).

The camera 2 is mounted on the aircraft A so that its optical axis can be oriented towards the ground or towards the sky without being obstructed by the wall on which the device is mounted, or more generally by the body of the aircraft A. The camera is for example mounted at the front edge of the bottom wall P of the aircraft A, as shown in Figure 1, or at a lateral edge of the wall P.

In fact, as the camera 2 is fixed to the housing 10, the locating device 1 as a whole is movable in rotation relative to the wall of the aircraft and capable of adopting such an elevation.

The camera 2 includes a lens provided with a reticle. The reticle passes through the optical axis O of the camera 2.

The camera 2 provides an instantaneous field of view (IFOV) less than or equal to 0.1 milliradians. The IFOV is the field of view associated with one pixel of an image acquired by the camera 2. For example, at a range of 10 kilometers, an IFOV of 0.1 milliradians corresponds to about 1 meter on the ground per pixel. Such a camera 2 is thus adapted to locating targets at very great distances.

Furthermore, the camera 2 is sensitive to wavelengths in the visible and/or infrared, for example wavelengths in the SWIR infrared band (Short-Wavelength InfraRed), from 1 to 2 micrometers.

The camera 2 can for example be configured in several acquisition modes, each acquisition mode making the camera 2 sensitive to the wavelengths specific to that mode. The camera 2 is for example configurable not only in an infrared acquisition mode, in which it is made sensitive to the aforementioned infrared wavelengths, but also in other user-configurable acquisition modes (visible, UV, etc.).

Furthermore, the inertial unit 4 is a device known in itself, comprising a plurality of inertial sensors, typically accelerometers and gyroscopes.

The inertial unit 4 is configured to calculate the position and orientation data of the camera 2.

The stellar registration module 6 is known from the prior art, for example from EP 3073223 A1.

The tracking unit 8, also known from the prior art, is configured to estimate a position of the target T.

The tracking unit 8 includes a rangefinder. The rangefinder is configured to estimate the distance between the camera 2 and a target T seen by the camera 2.

The rangefinder can be an active rangefinder, for example a laser rangefinder, known in itself. Alternatively, the rangefinder is passive: it calculates the distance between the camera 2 and the target T based on a digital terrain model in which the target T is located.
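
As an illustrative sketch only (the patent does not specify the algorithm), a passive range can be obtained by marching along the line of sight until it meets the terrain surface; `terrain_height` is an assumed digital-terrain-model lookup, and the step and limit values are placeholders:

```python
import numpy as np

def passive_range(cam_pos, los_unit, terrain_height, step=5.0, max_range=50e3):
    """March along the line of sight from the camera position until the ray
    falls below the terrain surface given by a digital terrain model (DTM);
    the distance travelled at the first crossing is the estimated range.
    cam_pos: (3,) position, z up; los_unit: (3,) unit line-of-sight vector;
    terrain_height(x, y) -> ground altitude is an assumed DTM interface."""
    d = 0.0
    while d < max_range:
        p = cam_pos + d * los_unit
        if p[2] <= terrain_height(p[0], p[1]):
            return d  # first terrain intersection found
        d += step
    return None  # no intersection within max_range
```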

The device further comprises an uncertainty estimation module 14 configured to estimate an uncertainty on an error that may affect the accuracy of the position of the target T estimated by the location module 8.

The registration module, the location module and the estimation module can be separate physical devices, a single physical device, different computer programs executed by one or more processors of the device, or different parts of the same computer program executed by one or more processors of the device.

The device further comprises a motor 16 for rotating the device relative to the carrier. The device, and in particular the motor 16, is supplied with electrical energy by the carrier.

The device further comprises a communication interface with an operator station.

In the case of an aircraft A other than a drone, the operator station may be a station inside the aircraft A; the communication interface is then, for example, a wired communication interface or a wireless radio communication interface.

The operator station may alternatively be in a ground station, or in a carrier other than the one carrying the device. In this case, the communication interface is a wireless radio communication interface.

The device typically forms a gyro-stabilized ball (BGS) operating autonomously with respect to the aircraft A, except for its power supply by the aircraft A.

In other embodiments, the modules 6, 8, 14 may be located elsewhere in the aircraft.

A method of target location without stellar registration

It is assumed that the aircraft A is flying. A target T is on the ground.

Referring to Figure 4, the camera 2 is oriented toward the target T, in a first orientation (step 100).

To locate the target T, the following steps are implemented by the device.

The inertial unit 4 calculates position and/or orientation data of the camera 2 (step 102).

The camera acquires at least one image of the target T (step 104).

The rangefinder (active laser or passive) estimates the distance between the camera 2 and the target T seen by the camera 2.

The tracking unit 8 estimates a position of the target T by combining the distance estimated by the rangefinder with the position and orientation data of the camera 2 and the acquired image (step 118).

Optionally, the estimation carried out by the tracking unit 8 also takes into account an angular deviation between the optical axis of the camera 2 and an axis passing through a point of the camera 2 and a point of the target T. This angular deviation is calculated from a difference in pixels, in an image acquired by the camera 2 in the first orientation during step 104, between the reticle of the camera 2, through which the optical axis passes, and a pixel of the target T. This deviation is zero when the reticle is superimposed on the target T in the acquired image. It is for example conceivable to orient the camera 2 so as to obtain such a superposition, thereby avoiding having to take this deviation into account in the estimation implemented by the location module 8.
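
A minimal sketch of this pixel-to-angle conversion, under a small-angle approximation and using the 0.1 milliradian IFOV quoted earlier (function and variable names are illustrative, not from the patent):

```python
IFOV_RAD = 0.1e-3  # instantaneous field of view per pixel (<= 0.1 mrad here)

def angular_deviation(reticle_px, target_px, ifov_rad=IFOV_RAD):
    """Angular deviation (along the two image axes, in radians) between the
    reticle, which marks the optical axis, and the target pixel; it is zero
    when the reticle is superimposed on the target."""
    du = target_px[0] - reticle_px[0]
    dv = target_px[1] - reticle_px[1]
    return (du * ifov_rad, dv * ifov_rad)
```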

The location module 8 may then use a formula of the following type for such estimation:

P_target = f1(E) + f2(Θ) + f3(D)

where:

• P_target is the position of the target T estimated by the location module 8, the position being expressed in a coordinate system tied to the Earth,

• E is a navigation state estimated by the inertial unit 4, comprising at least one position datum P_BGS and at least one orientation datum Θ_BGS of the camera 2 supplied by the inertial unit 4, these data being for example expressed in a geographic reference frame centered on the device, with one axis pointing to the Earth's north, one axis pointing east, and a third axis, the three axes forming an orthonormal system,

• Θ is the angular deviation between the optical axis of the camera 2 and an axis passing through a point of the camera 2 and a point of the target T; this angular deviation is a function of the difference in pixels, in an image acquired by the camera 2 in the first orientation, between the reticle of the camera 2 and a pixel of the target T (optional, as indicated above),

• D is the distance measured by the rangefinder,

• the functions f_i are predetermined functions.
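
The patent leaves the functions f_i unspecified; purely as a sketch of one plausible decomposition, the estimate can be assembled from the INS pose, the deviation-corrected line of sight, and the measured range:

```python
import numpy as np

def estimate_target_position(p_bgs, R_cam_to_earth, los_cam, D):
    """Sketch of P_target = f1(E) + f2(Theta) + f3(D) under one plausible
    reading: the navigation state E supplies the camera position p_bgs (3,)
    and the camera-to-Earth rotation R_cam_to_earth (3, 3); los_cam is the
    unit line-of-sight vector in the camera frame, already corrected by the
    angular deviation Theta; D is the rangefinder distance in meters."""
    return p_bgs + D * (R_cam_to_earth @ los_cam)
```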

A method of target location with stellar registration

As indicated in the introduction, the data produced by the inertial unit 4 during step 102 may be subject to errors, including drift errors.

A particularly detrimental error for the positional accuracy offered by the device is an error in the heading of the camera 2 calculated by the inertial unit 4. Referring to Figure 3, the heading C is the angle formed between:

• the optical axis O of the camera 2 projected into a horizontal plane parallel to the ground (the plane of Figure 3, showing the aircraft and the target seen from above), and

• an axis pointing north N contained in the horizontal plane (the axis S shown in Figure 3 being an axis pointing east).

Thus, when the target T is located at a great distance from the aircraft A, a heading error, however small, weighs very significantly on the final location error committed by the device.

While the camera 2 adopts the first orientation (towards the target T), the estimation module estimates an uncertainty on a heading error of the inertial unit 4 (step 106). The estimation module may be the inertial unit 4 itself: the inertial unit 4 then directly provides, in addition to the position and orientation data of the camera 2, an uncertainty datum on a heading error.

When the uncertainty on the heading error exceeds a first predetermined threshold, the estimation module commands a movement of the camera 2 into a second orientation, suitable for a predetermined star S to be in view of the camera 2 (step 108).

Preferably, the first threshold is less than or equal to 0.3 milliradians, for example 0.1 milliradians.

The second orientation is determined by the estimation module on the basis of orientation data supplied by the inertial unit 4 and of position information of the star S, which is predetermined.
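
For illustration, the predetermined star position can be converted into a pointing direction with a standard astronomy library; the star coordinates (approximately Vega's) and the observer position and time below are placeholder values, not from the patent:

```python
import astropy.units as u
from astropy.coordinates import AltAz, EarthLocation, SkyCoord
from astropy.time import Time

# Catalog (predetermined) position of the star, here roughly Vega.
star = SkyCoord(ra=279.23 * u.deg, dec=38.78 * u.deg)

# Assumed carrier position and observation time (placeholders).
location = EarthLocation(lat=48.8 * u.deg, lon=2.2 * u.deg, height=3000 * u.m)
obstime = Time("2019-07-08T22:00:00")

# Azimuth/elevation toward which the camera would be slewed (second orientation).
altaz = star.transform_to(AltAz(obstime=obstime, location=location))
print(f"azimuth = {altaz.az.deg:.2f} deg, elevation = {altaz.alt.deg:.2f} deg")
```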

In the second orientation, the camera 2 acquires at least one image showing the star S (step 110).

Because of the errors vitiating the orientation data of the inertial unit 4, there is a certain deviation in pixels, in the image acquired by the camera 2 in the second orientation, between a pixel showing the reticle of the camera 2 and a pixel showing the star S. This deviation is representative of the position and orientation errors of the inertial unit 4.

The registration module 6 implements a stellar registration, known from the state of the art, on the basis of the position and/or orientation data of the camera 2, so as to produce readjusted position and/or orientation data (step 114).
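
The resetting algorithm itself is the cited prior art's; as a first-order intuition only, the offset between the star as observed and as predicted from the INS attitude, scaled by the IFOV, approximates the attitude error to be corrected (the names and the one-axis simplification are hypothetical):

```python
def first_order_heading_correction(star_px, predicted_px, ifov_rad=0.1e-3):
    """First-order intuition only: the horizontal pixel offset between the
    observed star and its position predicted from the INS attitude, scaled
    by the IFOV, approximates the heading error to subtract from the INS
    heading. An actual stellar registration (cf. EP 3073223 A1) estimates
    the full attitude error, typically within a filtering framework."""
    return (star_px[0] - predicted_px[0]) * ifov_rad
```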

The location module 8 uses the readjusted data, instead of the erroneous data supplied by the inertial unit 4, to estimate the position of the target T during the step 118 previously mentioned.

Preferably, the camera 2 is configured in its infrared acquisition mode to acquire the images showing the star S. The infrared acquisition mode is the one that provides the most sensitive images of the star S, and therefore improves the correction capability of the stellar registration, in particular by reducing the time during which the target T is not observed.

While the camera 2 is in the second orientation, the image acquisition step 110 and the registration step 114 are repeated for the star; they can also be implemented for at least one other star, after reorienting the camera towards this other star in a new step 108.

The uncertainty estimation step 106 is also repeated over time, for example at regular intervals, including while the camera is oriented towards a star.

When the estimation module 14 detects that the uncertainty on the heading error drops below a second predetermined threshold, the estimation module commands a movement of the camera 2 back to the first orientation (which it previously stored when leaving it) (step 116).

The second threshold is less than or equal to the first threshold.
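
Taken together, steps 106, 108 and 116 amount to a two-threshold hysteresis controller; a minimal sketch, with threshold values chosen for illustration from the ranges quoted above:

```python
from enum import Enum

class Mode(Enum):
    TARGET = 0  # first orientation: camera on the target
    STAR = 1    # second orientation: camera on a star

def update_mode(mode, heading_uncertainty_mrad,
                first_threshold=0.1, second_threshold=0.05):
    """Two-threshold hysteresis: leave the target when the heading
    uncertainty grows too large, return once the stellar registration has
    brought it back down (second_threshold <= first_threshold)."""
    if mode is Mode.TARGET and heading_uncertainty_mrad > first_threshold:
        return Mode.STAR    # step 108: slew to the predetermined star
    if mode is Mode.STAR and heading_uncertainty_mrad < second_threshold:
        return Mode.TARGET  # step 116: slew back to the stored target orientation
    return mode
```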

If the camera 2 was configured in an acquisition mode other than the infrared acquisition mode when the crossing of the first threshold was detected, the camera 2 is reconfigured in its original acquisition mode to observe the target T.

Other criteria for passing from one orientation of the camera 2 to the other can be used.

The camera 2 may for example be configured to move from one of the first and second orientations to the other (steps 108, 116) in response to receipt, by the communication interface, of a command sent by the operator station. For example, it may be urgent to redirect the camera 2 towards the target T to be observed while the uncertainty has not yet dropped below the second threshold, even though conditions are then not optimal for calculating an accurate position of the target T.

In the particular case where the communication interface receives a command to pass from the first orientation to the second orientation, the camera can execute this command with a delay. In some situations, the camera could not see the star if it were immediately redirected into the second orientation, for example when the aircraft is flying on its back or when the sky is not within the zone of possible orientations of the camera. In such a case, not only is the target no longer in view of the camera, but the stellar registration may also not function properly.

Therefore, the camera is preferably configured to wait, upon reception of the command, until the mobile carrier has an orientation with respect to the star that allows the camera to see the star in the second orientation, before switching from the first to the second orientation. The camera can for example comprise means for detecting the orientation of the aircraft with respect to the ground or the sky, or receive, through its communication interface, information enabling it to determine this orientation and to manage this waiting, from the information available at the mobile carrier.

Moreover, uncertainties other than the uncertainty on the heading error can be used as criteria triggering a passage from one orientation to the other.

For example, an uncertainty on the position of the target T can be used for this purpose, this uncertainty being calculated by the localization module 8 in addition to the target position estimate itself. In this case, the first threshold is preferably chosen less than or equal to 10 meters.

Such an uncertainty on the position error, which takes the form of a covariance Cov_pos_target, is typically calculated as follows by the estimation module 14:

Cov_pos_target = F1 · Cov_INS · F1^T + F2 · Cov_err_cam · F2^T + F3 · Cov_err_telem · F3^T

where F1 = ∂f1/∂E, F2 = ∂f2/∂Θ and F3 = ∂f3/∂D are the Jacobian matrices of the predetermined functions f1, f2 and f3, and with:

• Cov_INS: covariance of the navigation state E estimated by the inertial unit (this state including the position and orientation data estimated in step 102),

• Cov_err_cam: covariance of the noise in the target designation measurements in the frame of the camera 2,

• Cov_err_telem: covariance of the noise in the measurements of the distance D between the camera 2 and the target (by the laser rangefinder in general).

This equation is a simple sum, since the errors are independent.
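
A direct transcription of this sum, assuming the Jacobians F1, F2, F3 and the three covariance matrices are available as NumPy arrays of compatible shapes:

```python
import numpy as np

def target_position_covariance(F1, Cov_INS, F2, Cov_err_cam, F3, Cov_err_telem):
    """First-order covariance propagation: each Jacobian maps one
    independent error source into target-position space, and the
    independent contributions simply add."""
    return (F1 @ Cov_INS @ F1.T
            + F2 @ Cov_err_cam @ F2.T
            + F3 @ Cov_err_telem @ F3.T)
```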

The locating device 1 is advantageously carried on an aircraft-type carrier A, such a carrier generally having the task of locating targets at very great distances, in particular a drone. However, the device can be carried on other types of carrier: land vehicle, ship, etc.

CLAIMS

1. Device (1) for locating a target (T), to be carried on a mobile carrier (A), the device comprising:

• a camera (2) steerable relative to the carrier (A) into:

o a first orientation in which the target is in view of the camera, so that the camera acquires an image of the target (T), and

o a second orientation in which at least one predetermined star is in view of the camera, so that the camera acquires at least one image of the star,

• an inertial unit (4) configured to calculate position and orientation data of the camera (2),

• a stellar registration module (6) configured to apply a stellar registration to the data calculated by the inertial unit (4) on the basis of the image of the star, so as to produce readjusted position and orientation data,

• a location module (8) configured to estimate a position of the target (T) from the image of the target (T) and the readjusted data,

the device being characterized in that it comprises a communication interface with an operator station, and in that the camera (2) is further configured to move from one of the first and second orientations to the other in response to the reception, by the communication interface, of a command issued by the operator station.

2. Device (1) according to the preceding claim, wherein the camera is configured to wait, upon reception of the command, until the mobile carrier has an orientation with respect to the star that allows the camera to see the star in the second orientation, before switching from the first to the second orientation.

3. Device (1) according to one of the preceding claims, comprising an uncertainty estimation module (14) configured to estimate an uncertainty on an error that may affect the accuracy of the position estimated by the locating module (8), and wherein the camera (2) is configured to move from the first orientation to the second orientation when the uncertainty exceeds a first predetermined threshold.

4. Device (1) according to the preceding claim, wherein the uncertainty is an uncertainty on a position error vitiating the position of the target (T) estimated by the location module (8).

5. Device (1) according to one of claims 3 to 4, wherein the uncertainty is an uncertainty on a heading error of the camera (2) vitiating a heading datum calculated by the inertial unit (4).

6. Device (1) according to the preceding claim, wherein the first threshold is less than or equal to 0.3 milliradians.

7. Device (1) according to one of the preceding claims, comprising an uncertainty estimation module (14) configured to estimate an uncertainty on an error that may affect the accuracy of the position estimated by the locating module (8), and wherein the camera (2) is configured to move from the second orientation to the first orientation when the uncertainty drops below a second predetermined threshold.

8. Device (1) according to Claim 3 and Claim 7 taken in combination, wherein the second threshold is less than or equal to the first threshold.

9. Device (1) according to one of the preceding claims, wherein the camera (2) is configured to acquire the image of the predetermined star in an infrared acquisition mode in which the camera (2) is sensitive to infrared wavelengths.

10. An aircraft (A), such as a drone, comprising a locating device (1) of a target (T) according to one of the preceding claims.

11. A method for locating a target (T), comprising the steps of:

• orienting (100) a camera (2), movably carried on a mobile carrier (A), into a first orientation in which the target (T) is in view of the camera (2),

• calculating (102), by an inertial unit (4), position and orientation data of the camera (2),

• acquiring (104), by the camera (2), at least one image of the target (T),

• orienting (108) the camera (2) into a second orientation relative to the carrier (A) in which a predetermined star is in view of the camera (2),

• acquiring (110) at least one image of the star by the camera (2),

• applying a stellar registration (114) to the data calculated by the inertial unit (4) on the basis of the image of the star, so as to produce readjusted position and orientation data,

• estimating (118) a position of the target (T) from the image of the target (T) and the readjusted data,

the method being characterized in that the camera (2) moves from one of the first and second orientations to the other in response to receiving a command from an operator station.

12. Method according to the preceding claim, wherein the camera waits, after receipt of the command, until the mobile carrier has an orientation with respect to the star that allows the camera to see the star in the second orientation, before moving from the first to the second orientation.

13. A method according to one of claims 11 and 12, wherein the carrier (A) is an aircraft, e.g., a drone.

Documents

Orders

Section Controller Decision Date
15&43 Anjali Rani 2023-12-15

Application Documents

# Name Date
1 201917027330-IntimationOfGrant15-12-2023.pdf 2023-12-15
2 201917027330-PatentCertificate15-12-2023.pdf 2023-12-15
3 201917027330-Written submissions and relevant documents [03-11-2023(online)].pdf 2023-11-03
4 201917027330-PETITION UNDER RULE 137 [26-10-2023(online)].pdf 2023-10-26
5 201917027330-certified copy of translation [19-10-2023(online)].pdf 2023-10-19
6 201917027330-FORM-26 [18-10-2023(online)].pdf 2023-10-18
7 201917027330-Correspondence to notify the Controller [06-10-2023(online)].pdf 2023-10-06
8 201917027330-US(14)-HearingNotice-(HearingDate-19-10-2023).pdf 2023-10-04
9 Reply from DRDO.pdf 2022-08-31
10 201917027330-Defence-10-06-2022.pdf 2022-06-10
11 201917027330-CLAIMS [08-06-2022(online)].pdf 2022-06-08
12 201917027330-FER_SER_REPLY [08-06-2022(online)].pdf 2022-06-08
13 201917027330-FORM 3 [01-06-2022(online)].pdf 2022-06-01
14 201917027330-Information under section 8(2) [01-06-2022(online)].pdf 2022-06-01
15 201917027330-FER.pdf 2021-12-10
16 201917027330-FORM 18 [18-11-2020(online)].pdf 2020-11-18
17 201917027330-Correspondence-261119.pdf 2019-11-29
18 201917027330-OTHERS-261119.pdf 2019-11-29
19 201917027330-Proof of Right (MANDATORY) [20-11-2019(online)].pdf 2019-11-20
20 201917027330-FORM 3 [20-11-2019(online)].pdf 2019-11-20
21 abstract.jpg 2019-08-14
22 201917027330-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [08-07-2019(online)].pdf 2019-07-08
23 201917027330-STATEMENT OF UNDERTAKING (FORM 3) [08-07-2019(online)].pdf 2019-07-08
24 201917027330-POWER OF AUTHORITY [08-07-2019(online)].pdf 2019-07-08
25 201917027330-FORM 1 [08-07-2019(online)].pdf 2019-07-08
26 201917027330-DRAWINGS [08-07-2019(online)].pdf 2019-07-08
27 201917027330-DECLARATION OF INVENTORSHIP (FORM 5) [08-07-2019(online)].pdf 2019-07-08
28 201917027330-COMPLETE SPECIFICATION [08-07-2019(online)].pdf 2019-07-08
29 201917027330.pdf 2019-07-08

Search Strategy

1 SearchStretegy-201917027330E_07-12-2021.pdf

ERegister / Renewals

3rd: 15 Jan 2024

From 12/12/2019 - To 12/12/2020

4th: 15 Jan 2024

From 12/12/2020 - To 12/12/2021

5th: 15 Jan 2024

From 12/12/2021 - To 12/12/2022

6th: 15 Jan 2024

From 12/12/2022 - To 12/12/2023

7th: 15 Jan 2024

From 12/12/2023 - To 12/12/2024

8th: 06 Dec 2024

From 12/12/2024 - To 12/12/2025