
"Method For Detecting The Space Orientation And Position Of An Object"

Abstract: Method for detecting the space orientation and position of an object. The invention relates to a method for the optical detection of the position and orientation of an object by means of an optical device comprising at least one parallelogram fastened to said object, the optical device comprising optical means and electronic analysis means making it possible to determine the coordinates of the four vertices of the parallelogram A'B'C'D' in an orthonormal frame with center O, denoted R0. The principle of the device consists in determining the vertices of the parallelogram A'B'C'D' on the basis of the knowledge of the characteristics of the parallelogram and of four known points of a quadrilateral ABCD. This quadrilateral represents the drawing arising from the projection of the parallelogram A'B'C'D' in a known image plane. The characteristics of the parallelogram A'B'C'D' can be for example its height, its width and the coordinate of one of its points in the frame R0. FIGURE 5


Patent Information

Application #
Filing Date
04 March 2009
Publication Number
28/2009
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application
Patent Number
Legal Status
Grant Date
2018-09-17
Renewal Date

Applicants

THALES
45, RUE DE VILLIERS, 92200 NEUILLY SUR SEINE,

Inventors

1. SIEGFRIED ROUZES
26, RUE DES VIGNES DE BUSSAC, 33185 LE HAILLAN,

Specification

Method for detecting the space orientation and position of an object.
The present invention relates to a method for the optical detection of the position and orientation of an object in space. It applies more particularly in the aeronautical field. In this case, the object is a pilot's helmet comprising a helmet viewing system.
The determination of the position and orientation of an object in space is a problem relating to numerous technical fields. The various solutions generally afforded must have the principal characteristics of resolving any ambiguity in position or orientation, of responding to more or less severe dynamics of the systems and of satisfying high accuracy.
These systems are used in aeronautics for detecting head posture, notably for the helmets of fighter aircraft and of military, civilian or para-civilian helicopters. They are also used for detecting simulation helmets; this detection can then be combined with an oculometry device, also called an eyetracker, for detecting the position of the gaze. Numerous applications of these systems also exist in the field of virtual reality and games.
Currently, optical systems for detecting posture rely on two main principles. Firstly, it is possible to identify on an image, produced by a matrix sensor for example, the position of luminous pointlike emitters.
Electroluminescent diodes, also called LEDs, can be used as emitters. Additionally, another solution consists in observing an unambiguous pattern printed on the object whose position and orientation are to be determined. For this purpose, one or more cameras are used to observe this pattern and analyze it on the basis of the images collected.
In the case of the use of luminous sources of the LED type, the latter are disposed in groups. These groups of LEDs are also called clusters. In the case of aeronautical applications, these clusters, disposed on the helmet, are generally not contained in a plane, and in numerous cases take the form of a tetrahedron on the helmet.

Figure 1 represents a helmet 1 used in aeronautics for systems for detecting the position and orientation of objects in space. The diodes 10 placed on the helmet form a tetrahedron-shaped cluster. The tetrahedron is indicated by dashed lines in Figure 1. This type of system requires sensors, generally cameras placed in the cockpit. This entails a multi-emitter/multi-receiver device whose emitters are the diodes and the receivers the cameras.
The analysis of the information arising from the sensors is complex, having regard to the spatial geometry, which requires large computational capabilities. Additionally, the slaving of a system of this type may exhibit aspects that are limiting in terms of computation speed and may therefore affect the accuracy of the systems. To attain the required accuracy, the sensor, of camera type, must have a high resolution and the processing of the sensor information is subject to a prediction of the position of the LEDs and an analysis of zones of interest.
Variants of these systems exist, notably devices for detecting the shadow of a grid illuminated by a helmet-mounted source. These systems exhibit a limitation on the accurate determination of the orientation of the object to be identified.
The process of detecting the position and orientation of an object, through the observation of a pattern on said object by cameras, is less accurate. This process requires large computational capabilities and poses problems of use in disturbed environments. One way of improving the performance of such a system is to multiply the sensors and to place them in an optimal manner. This solution nevertheless remains difficult to implement.
In a general manner, the current solutions for detecting the position and orientation of an object in space, in the aeronautical field, exhibit limitations related to the compromise between the implementation of computationally extremely unwieldy solutions and the accuracy requirements demanded. Additionally, the constraints of the aeronautical environment

necessitate a redundancy of the optical means or of the sensors and do not allow the implementation of simple technical solutions.
The method according to the invention makes it possible, notably, to alleviate the aforesaid drawbacks. Specifically, the device comprises sensors or emitters grouped into clusters having a parallelogram shape. The method for determining the position of the sensors is, therefore, simple to implement and requires very few computations, the method being deterministic. This method is very advantageous in the case of slaved systems where the times between two measurements are reduced and the accuracy of detection increased.
Advantageously, the method for the optical detection of the position and orientation of an object is carried out by means of an optical device comprising at least one first parallelogram (A'B'C'D') fastened to said object and comprising optical means and electronic analysis means making it possible to determine the coordinates of the four vertices of the first parallelogram (A'B'C'D') in an orthonormal frame (R0 (O, i, j, k)) comprising a center (O), said frame comprising a plane (O, j, k) parallel to the image plane (PI). The image plane is, without ambiguity, the image plane of the optical device considered.
The method comprises several steps:
- a first step of defining a second reference parallelogram (A0B0C0D0), whose center (O) is the center of the frame (R0 (O, i, j, k)), possessing the same characteristics as the first parallelogram (A'B'C'D') and situated in the plane (O, j, k) parallel to the image plane (PI);
- a second step of defining the transformation under which the first parallelogram (A'B'C'D') is the image of the second parallelogram (A0B0C0D0), said transformation decomposing into a translation u and a vector rotation r;
- a third step of determining, through the optical means, a quadrilateral (ABCD), obtained by projecting the first parallelogram (A'B'C'D') into the image plane (PI), with nonzero abscissa XI, in the frame (R0) with center O, along a direction (i) perpendicular to the image plane (PI);
- a fourth step of determining:
  o a first point (E) belonging to the image plane (PI), when it exists, such that the first point (E) is the intersection of the straight lines formed by two opposite sides of the quadrilateral (AB, CD);
  o a second point (F) belonging to the image plane (PI), when it exists, such that the second point (F) is the intersection of the straight lines formed by the other two sides of the quadrilateral (AC, BD);
  o a first vector (OE), connecting the center of the frame (O) and the first point (E);
  o a second vector (OF), connecting the center of the frame (O) and the second point (F);
- a fifth step of determining the respective images of the unit vectors (i, j, k), defining the frame (R0), by the rotation r, as a function of the first and second vectors (OE, OF) and of the coordinates of the four vertices (A0, B0, C0, D0) of the second parallelogram (A0B0C0D0);
- a sixth step of determining the translation u as a function of the first and second vectors (OE, OF) and of the coordinates of the four vertices (A0, B0, C0, D0) of the second parallelogram (A0B0C0D0). The knowledge of the translation u and of the rotation r suffices to pinpoint the position of the object, as well as its attitude in space;
- finally, a seventh step may be carried out to determine the coordinates of the vertices of the first parallelogram (A', B', C', D') in R0, on the basis of the known coordinates of the vertices of the second parallelogram (A0, B0, C0, D0) and of the transformation composed of the translation u and of the rotation r. A numerical sketch of these steps is given after this list.
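The steps above can be illustrated with a short numerical sketch. It is only a sketch under stated assumptions: a central projection with centre O onto the image plane PI of abscissa XI along i, image points given as (j, k) pairs, and a reference parallelogram of height H, width L and offset T as defined further on in the description. The patent's analytical expressions for the fifth and sixth steps are not reproduced in this record, so the sketch substitutes a generic orthonormal-basis (Gram-Schmidt) construction for the rotation and a least-squares solve on the vertex rays for the translation; all function and parameter names are illustrative.

```python
import numpy as np

# Illustrative sketch of the seven steps, NOT the patent's closed-form
# expressions.  Assumptions: central projection with centre O onto the image
# plane PI: x = X_I, frame R0 = (O, i, j, k), reference parallelogram of
# height H, width L and offset T.

def reference_vertices(H, L, T):
    # Step 1: reference parallelogram A0 B0 C0 D0 centred on O, in the plane (O, j, k).
    A0 = np.array([0.0, T, H / 2.0])
    B0 = np.array([0.0, T - L, H / 2.0])
    return A0, B0, -B0, -A0                       # C0 = -B0, D0 = -A0

def intersect(p, q, r, s, eps=1e-9):
    # Intersection of the image-plane lines (pq) and (rs); points are (j, k) pairs.
    d1, d2 = q - p, s - r
    den = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(den) < eps:
        return None                               # parallel sides: no vanishing point
    t = ((r[0] - p[0]) * d2[1] - (r[1] - p[1]) * d2[0]) / den
    return p + t * d1

def rotation_from_pairs(u0, v0, u1, v1):
    # Rotation mapping the direction pair (u0, v0) onto (u1, v1), built by
    # orthonormalising each pair (Gram-Schmidt) and composing the two bases.
    def basis(u, v):
        e1 = u / np.linalg.norm(u)
        e2 = v - np.dot(v, e1) * e1
        e2 /= np.linalg.norm(e2)
        return np.column_stack([e1, e2, np.cross(e1, e2)])
    return basis(u1, v1) @ basis(u0, v0).T

def detect_pose(img_pts, H, L, T, X_I):
    """img_pts: dict 'A','B','C','D' -> (j, k) coordinates of the observed
    quadrilateral ABCD in the image plane PI (step 3)."""
    A0, B0, C0, D0 = reference_vertices(H, L, T)
    A, B, C, D = (np.asarray(img_pts[k], dtype=float) for k in "ABCD")

    def lift(p):                                  # image point -> vector OP in R0
        return np.array([X_I, p[0], p[1]])

    # Step 4: vanishing points E = (AB) ∩ (CD) and F = (AC) ∩ (BD), when they exist.
    E = intersect(A, B, C, D)
    F = intersect(A, C, B, D)
    e = lift(E) if E is not None else np.array([0.0, *(B - A)])   # e proportional to A'B'
    f = lift(F) if F is not None else np.array([0.0, *(C - A)])   # f proportional to A'C'

    # Step 5: the rotation r maps the reference side directions onto e and f.
    R = rotation_from_pairs(B0 - A0, C0 - A0, e, f)

    # Step 6: the translation u places every vertex u + R @ V0 on the ray O -> V
    # of the central projection; solved here in least squares for (u, scales).
    refs = [A0, B0, C0, D0]
    rays = [lift(P) for P in (A, B, C, D)]
    M = np.zeros((12, 7))
    b = np.zeros(12)
    for i, (V0, ray) in enumerate(zip(refs, rays)):
        M[3 * i:3 * i + 3, 0:3] = np.eye(3)       # coefficient of u
        M[3 * i:3 * i + 3, 3 + i] = -ray          # coefficient of the ray scale alpha_i
        b[3 * i:3 * i + 3] = -(R @ V0)
    u = np.linalg.lstsq(M, b, rcond=None)[0][:3]

    # Step 7: vertices of A'B'C'D' in R0.
    vertices = {name: u + R @ V0 for name, V0 in zip(["A'", "B'", "C'", "D'"], refs)}
    return R, u, vertices
```

Under the sign conventions set out further on (e and f positively proportional to A'B' and A'C'), the basis construction yields the unique rotation mapping the reference side directions onto the observed ones, and the least-squares step simply enforces that each reconstructed vertex lies on the ray from O through its image point.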
Advantageously, the detection method can comprise particular forms of parallelograms such as diamonds, rectangles or squares.

Advantageously, the optical detection method can comprise optical means comprising a holographic video-projector emitting, in an image, sharp luminous patterns at every point of the zone of sweep of said object and at least two identical and mutually parallel lineal matrix sensors, disposed on the object, the four ends of these two sensors forming a parallelogram.
Advantageously, the optical detection method can comprise
optical means comprising a camera and at least four emitting diodes
disposed on the object, each of which represents the ends of a
parallelogram.
Other characteristics and advantages of the invention will become apparent with the aid of the description which follows given with regard to the appended drawings which represent:
• Figure 1 represents a pilot's helmet according to the state of the art;
• Figure 2 represents the characteristics of a reference parallelogram;
• Figure 3 represents a 3D view of the drawing of a parallelogram arising from its projection in an image plane;
• Figure 4 represents the vanishing points of the drawing of Figure 3, when they exist;
• Figure 5 represents the vectors, known in Ro, of the vanishing points of the drawing of Figure 3;
• Figure 6 represents an exemplary optical device according to the invention.
The optical detection method according to the invention consists in determining the vertices of a parallelogram A'B'C'D', situated in a frame R0 of space, denoted R0 (O, i, j, k), on the basis of the knowledge of the characteristics of the parallelogram and of four known points of a quadrilateral ABCD. This quadrilateral represents a drawing arising from the projection of the parallelogram A'B'C'D' in an image plane.

The characteristics of the parallelogram A'B'C'D' may be for example its height, its width and the coordinate of one of its points in the frame R0. Of course, any other mode of representation could be appropriate.
This detection of the parallelogram is done by means of an optical device making it possible, when the parallelogram is fixed to an object, to pinpoint the position and the orientation of the object in R0.
Figure 2 shows an example of a parallelogram 20 with vertices A0, B0, C0 and D0 and whose characteristics are the same as those of the parallelogram A'B'C'D' whose position and orientation in R0 are to be determined. The parallelogram 20 possesses four sides, denoted A0B0, C0D0, A0C0 and B0D0, that are pairwise mutually parallel. The height 21 of the parallelogram is denoted H, its width 22 is denoted L and the coordinate 23 of A0 in the frame R0 along j is denoted T.
The four points are defined in R0 by the following equations: $\overrightarrow{OA_0} = T\,\vec{j} + \tfrac{H}{2}\,\vec{k}$, $\overrightarrow{OB_0} = (T - L)\,\vec{j} + \tfrac{H}{2}\,\vec{k}$, $\overrightarrow{OC_0} = -\overrightarrow{OB_0}$ and $\overrightarrow{OD_0} = -\overrightarrow{OA_0}$.
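As a small numerical check of these definitions (a sketch only; the k-components of ±H/2 are reconstructed here from the stated height H and the centring of the parallelogram on O, and the example values of H, L, T are arbitrary):

```python
import numpy as np

# Quick check that the four points above do form a parallelogram of height H
# and width L, centred on O, with sides A0B0 // C0D0 and A0C0 // B0D0 (Figure 2).
H, L, T = 0.10, 0.20, 0.05                  # assumed example values
A0 = np.array([0.0, T, H / 2])              # OA0 = T*j + (H/2)*k
B0 = np.array([0.0, T - L, H / 2])          # OB0 = (T - L)*j + (H/2)*k
C0, D0 = -B0, -A0                           # OC0 = -OB0, OD0 = -OA0

assert np.allclose(B0 - A0, D0 - C0)        # side A0B0 parallel and equal to C0D0
assert np.allclose(C0 - A0, D0 - B0)        # side A0C0 parallel and equal to B0D0
assert np.isclose(A0[2] - D0[2], H)         # extent along k equals the height H
assert np.isclose(abs((B0 - A0)[1]), L)     # side A0B0 has width L along j
```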
As indicated in Figure 3, this reference parallelogram is placed in the frame R0 in such a manner that its center is O. The plane (O, j, k), denoted P0, is parallel to the plane PI, denoted (XI, j, k), the latter being the image plane. The plane PI contains the drawing ABCD of the quadrilateral, where XI is the abscissa of the plane along the axis i.
Knowing the coordinates of the four vertices of the parallelogram A'B'C'D' in R0 is equivalent to knowing the analytical transformation which makes it possible to deduce A'B'C'D' from the parallelogram 20.
Given that the two parallelograms have the same characteristics, there exists a direct vector rotation r, in relation to an axis passing through O, and a translation $\vec{u}$, r and $\vec{u}$ being unique, such that $\overrightarrow{OA'} = r(\overrightarrow{OA_0}) + \vec{u}$, $\overrightarrow{OB'} = r(\overrightarrow{OB_0}) + \vec{u}$, $\overrightarrow{OC'} = r(\overrightarrow{OC_0}) + \vec{u}$ and $\overrightarrow{OD'} = r(\overrightarrow{OD_0}) + \vec{u}$.
Figure 3 represents the parallelogram 30, denoted A'B'C'D', in the frame R0. Its drawing 31, arising from the projection of A'B'C'D' in the plane PI, is represented by the quadrilateral ABCD.
The coordinates of the quadrilateral ABCD in R0 being known through the optical detection method, the algorithm makes it possible, on the basis of the drawing 31 and of the reference parallelogram 20, to ascertain the transformations r and u. The position and attitude of the object can be deduced from r and u directly, without specifically knowing the positions of the vertices of the parallelogram A'B'C'D'.
Figure 4 represents, in the plane PI, the quadrilateral ABCD. When they exist (this corresponding to the most frequent case), the coordinates of the points of intersection of the straight lines (AB) and (CD) and of the straight lines (AC) and (BD) are determined by knowing the coordinates of the points A, B, C, D in R0. The point of intersection of the straight lines (AB) and (CD) is then denoted E and the point of intersection of the straight lines (AC) and (BD) is denoted F. In this case, the vector OE is denoted e and the vector OF is denoted f.
It is known that the vector e is positively proportional to A'B' and that the vector f is positively proportional to A'C' in R0.
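A minimal sketch of this determination, assuming the image points A, B, C, D are given as numpy (j, k) pairs in the plane of abscissa XI (helper names are illustrative):

```python
import numpy as np

def line_intersection(p, q, r, s, eps=1e-9):
    """Intersection of the image-plane lines (pq) and (rs); points are (j, k)
    pairs. Returns None when the lines are parallel (no vanishing point)."""
    d1, d2 = q - p, s - r
    den = d1[0] * d2[1] - d1[1] * d2[0]           # 2D cross product
    if abs(den) < eps:
        return None
    t = ((r[0] - p[0]) * d2[1] - (r[1] - p[1]) * d2[0]) / den
    return p + t * d1

def vectors_e_f(A, B, C, D, X_I):
    """e = OE and f = OF as 3-vectors of R0, with E = (AB) ∩ (CD) and
    F = (AC) ∩ (BD); either may be None when the corresponding sides are parallel."""
    E = line_intersection(A, B, C, D)
    F = line_intersection(A, C, B, D)
    def to_R0(p):
        return None if p is None else np.array([X_I, p[0], p[1]])
    return to_R0(E), to_R0(F)
```

When a pair of sides is parallel the corresponding vector is returned as None; those degenerate cases are treated in the case analysis below.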
Figure 5 represents, when they exist, the vectors OE and OF in the frame R0 and illustrates the aforesaid property.
The cases, where E does not exist or F does not exist or E and F do not exist, correspond, respectively, to the following relations, which ensue from the geometry of the quadrilateral ABCD:
- the sides AB and CD are parallel, ABCD is a trapezium in relation to AB, that is to say the side A'B' is parallel to the image plane and the side A'C' is not. We determine e = AB and f = OF;

- the sides AC and BD are parallel, ABCD is a trapezium in relation to AC, that is to say the side A'C' is parallel to the image plane and the side A'B' is not. We determine f = AC and e = OE;
- ABCD is a parallelogram, that is to say the parallelogram A'B'C'D' is parallel to the image plane. We have the following two relations: e = AB and f = AC. A sketch of the handling of these three cases is given below.
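A sketch of this case analysis, assuming E and F have been computed as in the previous sketch (each being None when the corresponding sides are parallel) and that image points are (j, k) pairs in the plane of abscissa XI:

```python
import numpy as np

def select_e_f(A, B, C, D, E, F, X_I):
    """Return (e, f) with e proportional to A'B' and f proportional to A'C',
    covering the three degenerate cases described above."""
    def lift(p):                     # image-plane point -> vector OP in R0
        return np.array([X_I, p[0], p[1]])
    def side(p, q):                  # image of a side, as a direction of R0
        return np.array([0.0, q[0] - p[0], q[1] - p[1]])

    if E is None and F is None:      # ABCD is a parallelogram: A'B'C'D' parallel to PI
        return side(A, B), side(A, C)
    if E is None:                    # AB // CD: A'B' parallel to the image plane
        return side(A, B), lift(F)
    if F is None:                    # AC // BD: A'C' parallel to the image plane
        return lift(E), side(A, C)
    return lift(E), lift(F)          # general case: e = OE and f = OF
```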
The following computations are carried out in the case where E and F exist, the simplifications being made naturally for the particular cases where a determined solution exists for each case.


The parallelogram A'B'C'D' is deduced from the reference parallelogram A0B0C0D0 by determining the transformation composed of a known vector rotation and of a known translation.
In the case where A'B'C'D' is a diamond we have the additional relation: $|1-\mu^2|\cdot\|\overrightarrow{OF}\| = |1-\lambda^2|\cdot\|\overrightarrow{OE}\|$.
In the case where A'B'C'D' is a rectangle we have the additional relation: $(\overrightarrow{OE}\cdot\overrightarrow{OF}) = 0$.
In the case where A'B'C'D' is a square, the analytical expressions for the transformations of $\vec{i},\vec{j},\vec{k}$ under the rotation r are simplified. We obtain L = H = 2×T and the rotation of the vector $\vec{k}$ is determined in a simple manner: $r(\vec{k}) = -\dfrac{\overrightarrow{OF}}{\|\overrightarrow{OF}\|}$. The two additional relations, corresponding to the case of the diamond and of the rectangle, are both valid for the case of the square.
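As a small numerical check of the rectangle relation, together with the square-case expression for r(k) as reconstructed above (valid in the general case where E and F exist, so that e = OE and f = OF; names are illustrative):

```python
import numpy as np

def is_rectangle(OE, OF, tol=1e-6):
    """Rectangle test: A'B' ⟂ A'C', hence (OE · OF) = 0 up to a tolerance."""
    return abs(np.dot(OE, OF)) <= tol * np.linalg.norm(OE) * np.linalg.norm(OF)

def rotated_k_for_square(OF):
    """Square case: with L = H = 2*T the side A0C0 is carried by -k, and f is
    positively proportional to A'C', so r(k) is the unit vector opposite to OF."""
    return -OF / np.linalg.norm(OF)
```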
Figure 6 is an exemplary device according to the invention, comprising an object 65, which comprises electro-optical receivers of lineal matrix sensor type, and a means for projecting images, said images comprising luminous patterns 61.
The sensors are grouped together in such a manner that pairwise they form parallelograms 30. The sensors are pairwise mutually parallel and of equal size.
Additionally, an exemplary means for projecting images, according to the invention, is an optical means emitting, at every point of the zone of sweep 66 of the object 65, a sharp image. The sensors placed on the helmet receive unambiguous signals originating from this image.
For this purpose, an exemplary embodiment of the invention uses a holographic video-projector 60 as projection means. Holographic video-projectors such as these are produced and marketed by the company Light Blue Optics and are known under the brand PVPro.

This holographic video-projector possesses the advantageous property of emitting a sharp image at every point of the zone of sweep 66.
This holographic video-projector, called VPH hereinafter, comprises a coherent light source, which is generally a laser diode, a display making it possible to produce a phase image, optical means arranged so as to create, on the basis of the wave emitted by the light source, a first reference wave and a second wave modulated by the display, and means allowing these two waves to be made to interfere. The final image obtained is a Fraunhofer hologram of the phase image generated on the display. It is possible to generate any type of image by this means. The display may be a liquid crystal display, of LCOS type for example.
Under these conditions, the center O of the frame R0 is defined by a point of the VPH, and the plane (O, j, k) is the plane, parallel to the image plane 32 of the projected image, comprising the origin.
So as to pinpoint the object in space, the VPH emits images comprising luminous patterns 61 on the sensors situated on the helmet. The analysis of the information arising from the sensors is carried out by a digital computer 64, placed downstream of the sensors, in the processing chain for treating the signals received.
The analysis of the signals received by each cell makes it possible to reconstitute the drawing obtained by projection, into the image plane, of the parallelogram positioned on the object. The drawing is determined, in an almost natural manner, by photographing the patterns deformed in the local plane of the parallelogram. Knowing the original patterns and their deformations identified by the sensors, the a priori knowledge of the characteristics of the parallelogram makes it possible, inversely, to obtain the drawing of the parallelogram. The latter represents a quadrilateral in the image plane.
On the basis of this drawing, and of the knowledge of the characteristics of the parallelogram, a priori known, the method makes it

possible to retrieve in a simple manner the position and orientation of the
cluster in the frame R0.
A second variant embodiment is to consider an optical device comprising at least one camera and a pilot's helmet comprising emitting diodes grouped into clusters. At least one cluster forms a parallelogram A'B'C'D', whose vertices are diodes.
Under these conditions, the zone of sweep is all or part of the
cockpit.
The center of the frame R0 is the camera, and the plane (O, j, k) is the image plane of the camera. The camera then obtains, in its image plane, the representation of the quadrilateral ABCD arising from the projection of the parallelogram A'B'C'D' in the image plane.
The analysis means can therefore retrieve the position and orientation of the cluster on the basis of the knowledge of the representation of the quadrilateral in a known image plane and of the a priori known characteristics of the parallelogram.
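For this second variant, a minimal forward-model sketch (assuming an ideal central projection with the camera centre at O and an image plane of abscissa XI along i; function and parameter names are illustrative) shows how the diode cluster A'B'C'D' would appear as the quadrilateral ABCD:

```python
import numpy as np

def project_cluster(vertices, X_I):
    """Central projection, from the camera centre O, of the diode vertices
    A', B', C', D' (3-vectors in R0) onto the image plane x = X_I.
    Returns the (j, k) image coordinates of the quadrilateral ABCD."""
    image = {}
    for name, P in vertices.items():
        if abs(P[0]) < 1e-9:
            raise ValueError("vertex %s lies in the plane x = 0 of the camera" % name)
        s = X_I / P[0]                       # scale factor along the ray O -> P
        image[name] = (s * P[1], s * P[2])   # keep the (j, k) components
    return image
```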

CLAIMS
1. A method for the optical detection of the position and orientation of an object by means of an optical device comprising at least one first parallelogram (A'B'C'D') fastened to said object, whose characteristics are known, and comprising optical means and electronic analysis means making it possible to determine the coordinates of the four vertices of the first parallelogram (A'B'C'D'), in an orthonormal frame (R0 (O, i, j, k)), comprising a center (O), said frame comprising a plane (O, j, k) parallel to the image plane (PI),
said method being characterized by:
- a first step of defining a second reference parallelogram (A0B0C0D0) whose center (O) is the center of the frame (R0 (O, i, j, k)), possessing the same characteristics as the first parallelogram (A'B'C'D'), situated in the plane (O, j, k) parallel to the image plane (PI);
- a second step of defining the transformation under which the first parallelogram (A'B'C'D') is the image of the second parallelogram (A0B0C0D0), said transformation decomposing into a translation u and a vector rotation r;
- a third step of determining, through the optical means, a quadrilateral (ABCD), obtained by projecting the first parallelogram (A'B'C'D') into the image plane (PI), with nonzero abscissa XI, in the frame (R0) with center O, along a direction (i) perpendicular to the image plane (PI);
- a fourth step of determining:
  o a first point (E) belonging to the image plane (PI), when it exists, such that the first point (E) is the intersection of the straight lines formed by two opposite sides of the quadrilateral (AB, CD);
  o a second point (F) belonging to the image plane (PI), when it exists, such that the second point (F) is the intersection of the straight lines formed by the other two sides of the quadrilateral (AC, BD);
  o a first vector (OE), connecting the center of the frame (O) and the first point (E);
  o a second vector (OF), connecting the center of the frame (O) and the second point (F);
- a fifth step of determining the respective images of the unit vectors (i, j, k), defining the frame (R0), by the rotation (r) of the transformation, as a function of the first and second vectors (OE, OF) and of the known characteristics of the second parallelogram (A0B0C0D0);
- a sixth step of determining the translation (u) of the transformation as a function of the first and second vectors (OE, OF), of the vector connecting the center of the frame (O) to a vertex of the quadrilateral (ABCD) and of the known characteristics of the second parallelogram (A0B0C0D0).
2. The detection method as claimed in claim 1, characterized in that it comprises a seventh step of determining the coordinates of the vertices (A', B', C', D') of the first parallelogram in the frame (R0), as a function of the known coordinates of the vertices of the second parallelogram (A0B0C0D0) and of the transformation composed of a translation (u) and of a rotation (r).
3. The detection method as claimed in any one of claims 1 or 2, characterized in that the parallelogram is a diamond.
4. The detection method as claimed in any one of claims 1 or 2, characterized in that the parallelogram is a rectangle.
5. The detection method as claimed in any one of claims 1 or 2, characterized in that the parallelogram is a square.
6. The method of optical detection as claimed in any one of claims 1 to 5, characterized in that the device comprises optical means comprising a holographic video-projector emitting, in an image plane, sharp luminous patterns at every point of the zone of sweep, corresponding to the space in which the object may move, and at least two identical and mutually parallel lineal matrix sensors, disposed on the object, the four ends of these two sensors forming a parallelogram.
7. The method of optical detection as claimed in any one of claims 1 to 5,
characterized in that the device comprises optical means comprising a
camera and at least four emitting diodes disposed on the object, each
of which represents the ends of a parallelogram.
8. The detection method as claimed in any one of the preceding claims,
characterized in that the object is a pilot's helmet, the whole of the

Documents

Application Documents

# Name Date
1 Wipo Publication Page_As Filed_04-03-2009.pdf 2009-03-04
2 ISR_As_Filed_04-03-2009.pdf 2009-03-04
3 Form5_As Filed_04-03-2009.pdf 2009-03-04
4 Form3_As Filed_04-03-2009.pdf 2009-03-04
5 Form26_Power of Attorney_04-03-2009.pdf 2009-03-04
6 Form1_As Filed_04-03-2009.pdf 2009-03-04
7 Drawing_As Filed_04-03-2009.pdf 2009-03-04
8 Description Complete_As Filed_04-03-2009.pdf 2009-03-04
9 Correspondence by Applicant_ purpose_04-03-2009.pdf 2009-03-04
10 Claims_As Filed_04-03-2009.pdf 2009-03-04
11 Abstract_As Filed_04-03-2009.pdf 2009-03-04
12 Correspondence by Applicant_ purpose_01-06-2009.pdf 2009-06-01
13 Form18_Normal Request_30-08-2010.pdf 2010-08-30
14 1219-CHENP-2009-FORM 3 [29-12-2017(online)].pdf 2017-12-29
15 1219-CHENP-2009-FER.pdf 2018-05-11
16 1219-CHENP-2009-PETITION UNDER RULE 137 [30-08-2018(online)].pdf 2018-08-30
17 1219-CHENP-2009-PETITION UNDER RULE 137 [04-09-2018(online)].pdf 2018-09-04
18 1219-CHENP-2009-CLAIMS [06-09-2018(online)].pdf 2018-09-06
19 1219-CHENP-2009-COMPLETE SPECIFICATION [06-09-2018(online)].pdf 2018-09-06
20 1219-CHENP-2009-DRAWING [06-09-2018(online)].pdf 2018-09-06
21 1219-CHENP-2009-FER_SER_REPLY [06-09-2018(online)].pdf 2018-09-06
22 1219-CHENP-2009-FORM 3 [06-09-2018(online)].pdf 2018-09-06
23 1219-CHENP-2009-FORM-26 [06-09-2018(online)].pdf 2018-09-06
24 1219-CHENP-2009-OTHERS [06-09-2018(online)].pdf 2018-09-06
25 1219-CHENP-2009-Proof of Right (MANDATORY) [06-09-2018(online)].pdf 2018-09-06
26 Correspondence by Agent_Form1_07-09-2018.pdf 2018-09-07
27 1219-CHENP-2009-IntimationOfGrant17-09-2018.pdf 2018-09-17
28 1219-CHENP-2009-PatentCertificate17-09-2018.pdf 2018-09-17
29 Abstract_Granted 301069_17-09-2018.pdf 2018-09-17
30 Claims_Granted 301069_17-09-2018.pdf 2018-09-17
31 Description_Granted 301069_17-09-2018.pdf 2018-09-17
32 Drawings_Granted 301069_17-09-2018.pdf 2018-09-17
33 Marked up Claims_Granted 301069_17-09-2018.pdf 2018-09-17
34 1219-CHENP-2009-RELEVANT DOCUMENTS [21-03-2019(online)].pdf 2019-03-21

Search Strategy

1 1219chenp2009_01-09-2017.pdf

ERegister / Renewals

3rd: 10 Oct 2018 (from 31/08/2009 to 31/08/2010)
4th: 10 Oct 2018 (from 31/08/2010 to 31/08/2011)
5th: 10 Oct 2018 (from 31/08/2011 to 31/08/2012)
6th: 10 Oct 2018 (from 31/08/2012 to 31/08/2013)
7th: 10 Oct 2018 (from 31/08/2013 to 31/08/2014)
8th: 10 Oct 2018 (from 31/08/2014 to 31/08/2015)
9th: 10 Oct 2018 (from 31/08/2015 to 31/08/2016)
10th: 10 Oct 2018 (from 31/08/2016 to 31/08/2017)
11th: 10 Oct 2018 (from 31/08/2017 to 31/08/2018)
12th: 10 Oct 2018 (from 31/08/2018 to 31/08/2019)