
A Method And A System For Detecting And Locating An Adjustment Error Or A Defect Of A Rotorcraft Rotor

Abstract: The invention relates to a method of detecting and identifying a defect or an adjustment error of a rotorcraft rotor using an artificial neural network (ANN), the rotor having a plurality of blades and a plurality of adjustment members associated with each blade; the network (ANN) is a supervised competitive learning network (SSON, SCLN, SSOM) having an input to which vibration spectral data measured on the rotorcraft is applied, the network outputting data representative of which rotor blade presents a defect or an adjustment error or data representative of no defect, and where appropriate data representative of the type of defect that has been detected.


Patent Information

Application #: 2638/CHE/2007
Filing Date: 14 November 2007
Publication Number: 37/2009
Publication Type: INA
Invention Field: MECHANICAL ENGINEERING
Patent Number: 344032
Legal Status: Granted
Grant Date: 2020-08-13

Applicants

EUROCOPTER
AEROPORT INTERNATIONAL MARSEILLE-PROVENCE 13725 MARIGNANE CEDEX

Inventors

1. MOREL, HERVE
2 LOTISSEMENT LES CERISIERS F-13113 LAMANON

Specification

A METHOD AND A SYSTEM FOR DETECTING AND LOCATING AN ADJUSTMENT ERROR OR A DEFECT OF A ROTORCRAFT ROTOR

The present invention relates to a method and to a system for detecting and locating an adjustment error or a defect of a rotorcraft rotor. The technical field of the invention is that of manufacturing helicopters.

The present invention applies in particular to diagnosing a rotorcraft rotor by analyzing vibration that is generated, at least in part, by the operation of the rotor. In order to measure vibration, the rotorcraft is fitted with accelerometers that are placed (secured) on the casing(s) of the gearbox(es), on the bearings of shafts, and/or on other points of the structure of the helicopter. In flight, the signals delivered by these sensors can be converted into data, where appropriate synchronized (by using signals delivered by a rotation sensor) and/or "averaged", and then recorded by the embedded system. On returning to the ground, the recorded data can be collated and analyzed.

Interpreting such data is complex: it requires lengthy intervention by an expert. Known tools for automatically analyzing such data in order to diagnose a mechanical defect in a mechanism are incomplete and imperfect; there are existing defects that such tools fail to detect, and they sometimes generate indications of a defect when none is justified. An object of the invention is to provide a method of analyzing such data, an analysis program, and a device including the program, making it possible to draw up quickly a diagnosis that is reliable, i.e. that maximizes the percentage of real defects detected while minimizing the percentage of defect detections that are not confirmed.

A rotorcraft rotor, in particular a propulsion and lift rotor of a helicopter, conventionally comprises a plurality of mechanical members that are adjustable or displaceable, referred to as adjustment members, having setting or configuration values that have a considerable influence on the vibration produced while the rotorcraft is in operation; these members include balance flyweights of adjustable mass secured to each blade or to each structure (sleeve) securing a blade to the rotor hub, tabs of adjustable orientation secured to each blade, and members for adjusting the lengths of the pitch control rods respectively associated with each of the blades. Among the defects of mechanical elements of a rotor that have an influence on the vibratory signature of the helicopter, mention can be made of slack in bearings and in fastenings, and also of degraded mechanical characteristics of a part due to aging, such as a change in the stiffness or in the damping provided by a lag damper, for example.

These adjustment means can be used for adjusting the respective resonant frequencies of the blades corresponding to their second flapping mode, as described in patent US-6,311,924. Patents WO-02/090903, US-2005-125103, and US-2006-058927 describe methods of detecting adjustment errors or defects of a rotorcraft rotor in order to adjust the adjustment members so as to minimize vibration levels. In those methods, a neural network is used that models the relationship between accelerations (vibrations) and adjustment parameters or defects.
The acquisition of vibration measurements from which vibratory signatures are calculated generally requires flights to be performed in the presence of defective mechanical elements on the rotor, in various configurations of defects and adjustment errors: a properly adjusted, defect-free rotor; a rotor without any defect but with an adjustment error concerning its flyweights, its pitch rods, and/or its tabs; and a rotor without adjustment error but including a defective member.

An artificial neural network (ANN) is a calculation model whose design is inspired by the operation of biological neurons, and it is generally optimized by a statistical-type learning method. An ANN generally comprises a multitude of calculation units of identical structure, referred to as artificial neurons (AN), that are organized in layers. Each AN is generally characterized by synaptic weights, a combination or aggregation function (e.g. summing) of the inputs weighted by the synaptic weights, and an activation function that defines the output from the neuron as a function of the result of the combination function compared with a threshold. Each ANN is characterized in particular by the topology of the connections between its neurons, the types of combination and activation functions, and by the learning method, i.e. the iterative modification of the values of the synaptic weights. These methods include supervised learning methods, in which the ANN is forced to converge on a predefined state or output, and non-supervised methods. Among such methods, a distinction is also made for competitive learning methods, in which only a fraction of the weights are modified during an iteration, i.e. only the weights of a "winning" or "elected" neuron, possibly together with the weights of neurons close to the elected neuron.

A self-organized map (SOM), or Kohonen map, is a particular ANN generally comprising a single layer of neurons with a binary output (in particular equal to zero or one), in which non-supervised learning is competitive: at each instant, only one neuron, in theory, is "active" or "elected", i.e. the neuron having weights that are the closest to the input data under consideration.

The documents "Diagnostic de defauts des rotors d'helicopteres: approches connexionnistes pour l'analyse vibratoire" [Diagnosing helicopter rotor defects: connectionist approaches for vibratory analysis] by H. Morel, in Rapport DEA Modelisation et Conception Assistee par Ordinateur, Ecole Nationale Superieure des Arts et Metiers, UMR CNRS 6168, 2003, and "Defect detection and tracing on helicopter rotor by artificial neural networks" by H. Morel et al., Advanced Process Control Applications for Industry Workshop, APC2005, IEEE, Vancouver, 2005, propose using self-organizing maps to diagnose helicopter rotor defects.

In spite of certain advantages provided by those methods, there remains a need for methods that are reliable and effective in detecting and identifying (locating) one or more members, in particular adjustment members, that are responsible for high levels of vibration, in order to reduce the cost of the (regular) inspection operations carried out to check that a rotorcraft is operating properly. The invention seeks to satisfy this need. A particular object of the invention is to propose such methods, and also programs, devices, and systems implementing these methods, that are improved and/or that remedy, at least in part, the shortcomings or drawbacks of prior art diagnosis methods and systems.
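As an illustration only (not part of the original specification), a minimal Python sketch of an artificial neuron as characterized above, with a weighted-sum combination function and a threshold activation; all names and values are purely illustrative:

```python
import numpy as np

def artificial_neuron(x, weights, threshold=0.0):
    """Weighted-sum combination of the inputs followed by a threshold (binary) activation."""
    s = np.dot(weights, x)              # aggregation of inputs weighted by the synaptic weights
    return 1 if s >= threshold else 0   # binary output, as for the Kohonen-map neurons mentioned above

# Illustrative call: three inputs, arbitrary weights
y = artificial_neuron(np.array([0.2, -0.1, 0.4]), np.array([0.5, 0.3, -0.2]))
```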
According to an aspect of the invention, it is proposed to make use of an ANN with supervised competitive learning (referred to below as "SCLN", "SSON", or "SSOM") in order to detect whether or not a rotorcraft rotor includes a defect or adjustment error, to determine, where appropriate, which blade (i.e. which angular sector corresponding to a blade) of the rotor presents the defect or adjustment error that has been detected, and also the nature or the location of the defect or adjustment error that has been detected.

Competitive learning enables the algorithm to avoid converging on a "steady" state that does not correspond to a learning error minimum, which is not possible with various other algorithms such as an error back-propagation algorithm. In general, an ANN with competitive learning is better adapted to diagnosing defects or adjustment errors than are other ANNs, such as multilayer perceptrons or networks with radial basis functions. It has been found that an ANN with supervised competitive learning makes it easier to detect and distinguish defects and adjustment errors of a rotorcraft rotor on the basis of modulus and phase data concerning vibration measured on the rotorcraft.

In an implementation of the invention, a measured vibration spectrum data sequence is applied to the input of the network, i.e. a sequence of (modulus and phase) acceleration data pairs corresponding to certain frequencies, and in particular to certain harmonics of the frequency of rotation of the rotor, the network outputting data representative of which rotor blade presents a defect or an adjustment error, or data representative of the absence of any such defect or adjustment error, and where appropriate data representative of the type of defect or adjustment error that has been detected (from a set of predetermined types of defect or adjustment error). Each spectral data sequence, which can be referred to as a "spectral signature", forms a vector in which each component is one data measurement, i.e. one modulus or phase value of the measured acceleration at a determined frequency or harmonic.

In an aspect of the invention, the SCLN includes a "competitive" layer of "competitive" neurons, each having a number of synaptic weights equal to the (common) dimension of the spectral data vectors used, the number of competitive neurons being not less than the dimension of said vectors, and preferably greater than said dimension. In an embodiment, the ratio of the number of competitive neurons to the dimension of the data vectors is of the order of at least two or three, in particular of the order of five, ten, or twenty, approximately. Putting these competitive neurons into competition consists in determining the neuron whose associated vector, i.e. sequence, of synaptic weights is closest to a spectral data vector presented to the input of the layer of competitive neurons. For this purpose, it is possible in particular to calculate a Euclidean distance between these vectors. After determining the respective proximities of the competitive neurons to a data vector presented at the input, and while the ANN is learning, several methods can be used for iteratively modifying the synaptic weights of the competitive neurons. These methods differ in particular by whether or not the weights of one or more neurons distinct from the elected neuron are modified, and also by the modification method that is selected.
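For illustration only, a minimal Python sketch (with hypothetical names and values) of the competition step described above: the spectral signature is a vector of modulus/phase values, and the elected neuron is the competitive neuron whose weight vector is closest in the Euclidean sense:

```python
import numpy as np

def elect_winner(x, CW):
    """Return the index of the competitive neuron whose weight vector is closest
    (Euclidean distance) to the spectral signature x.
    CW has shape (Nc, p): one weight vector of dimension p per competitive neuron."""
    distances = np.linalg.norm(CW - x, axis=1)
    return int(np.argmin(distances))

# Hypothetical signature: modulus/phase values at a few rotor harmonics (p = 6)
x = np.array([1.2, 0.3, 0.8, -1.1, 0.5, 2.0])
# Nc = 30 competitive neurons, i.e. about five times the signature dimension
CW = np.random.default_rng(0).normal(size=(30, 6))
c = elect_winner(x, CW)
```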
In an embodiment, a supervised vector quantization algorithm is used for this purpose, in particular an algorithm selected from algorithms of the following types: LVQ1, OLVQ1, LVQ2, LVQ2.1, LVQ3, MLVQ3, GLVQ, DSLVQ, RLVQ, GRLVQ. Amongst these algorithms, it is possible to select the algorithm that presents a high classification rate and/or a low variance, with the help of a method of statistical validation by resampling, of the cross-validation type or the bootstrap type.

In another implementation, a supervised self-organizing network (SSON) is used in which the dimension of its output space, which could be equal to one, is preferably equal to two or three; when this dimension is equal to two, such a network can be qualified as a supervised self-organizing map (SSOM). An SSON is characterized in particular by the use of a neighborhood function for weighting the modification to the synaptic weights of a competitive neuron as a function of the distance between the elected neuron and that competitive neuron, as "measured" in the output space. Because of the supervision, the images of vectors presented as inputs and constituting members of distinct classes of defect or adjustment error do not become interleaved and/or superposed in the output space. Consequently, when the network is being used operationally to classify a signature, it is easier to interpret the results.

Various methods can be used for supervising such algorithms. In a first method, data is added (concatenated) to each spectral data vector used while the network is learning, which data is representative of whether the vector in question is a member of a class corresponding to a determined type of defect or adjustment error. In another method, which is applicable to an SSON, a partitioning of the output space into subspaces is defined, and the weights of the competitive neurons are modified as a function of whether they are members of one or another of these subspaces. In a preferred implementation, this partitioning presents a "configuration" that is regular (equal-area), radial, and centered, and the number of subspaces of the partitioning is equal to the number of classes (types) of defect and/or adjustment error, plus one. This partitioning can be defined by determining, for all or some of the classes, the coordinates in the output space of a "setpoint" neuron that is associated with a class of defect or adjustment error; the boundary between two adjacent subspaces associated respectively with two setpoint neurons can be a Voronoi boundary.

While the network is learning, this partitioning can be used in particular in application of one or the other of the following two methods (a code sketch of the first method is given below):
• the neighborhood function can be centered on the neuron corresponding to the barycenter, in the output space, of the elected neuron and of the setpoint neuron corresponding to the class of the current spectral data vector; or else
• competition can be restricted to the neurons of the subspace corresponding to the class of the current spectral data vector.

In other implementations of the invention, a plurality of networks are used that are connected in series, connected in parallel, and/or redundant.
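For illustration only, a minimal Python sketch of one learning step of a supervised SOM following the first supervision method above (neighborhood centered on the barycenter of the elected neuron and of the class setpoint neuron); the Gaussian neighborhood and all names are assumptions of the sketch, not taken from the specification:

```python
import numpy as np

def ssom_update(x, label, CW, grid, setpoints, lr=0.1, sigma=1.0):
    """One supervised-SOM learning step: the neighborhood is centered on the barycenter,
    in the output space, of the elected neuron and of the setpoint neuron of the class of x.
    CW: (Nc, p) synaptic weights; grid: (Nc, 2) output-space coordinates of the neurons;
    setpoints: dict mapping a class label to the 2-D coordinates of its setpoint neuron."""
    c = int(np.argmin(np.linalg.norm(CW - x, axis=1)))   # elected neuron
    center = 0.5 * (grid[c] + setpoints[label])          # barycenter in the output space
    d2 = np.sum((grid - center) ** 2, axis=1)
    h = np.exp(-d2 / (2.0 * sigma ** 2))                 # neighborhood weighting (Gaussian here)
    CW += lr * h[:, None] * (x - CW)                     # move the weights towards x
    return CW
```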
In a particular implementation of the invention, a method is provided of detecting and identifying a defect or an adjustment error of a rotorcraft rotor, the rotor having a plurality of blades and a plurality of adjustment members associated with each blade, which method comprises the following steps: using a first supervised competitive learning network SCLN1 to determine data identifying at least one "defective" sector of the rotor, and in particular a blade that is defective or out of adjustment; and using an (optionally supervised) second competitive learning network to determine data identifying at least one defect or adjustment error present in the identified defective sector (blade).

At least some of the operations during the learning stage and/or during the diagnosis stage of a method of the invention can be implemented by an electronic data processor unit, such as a computer operating under the control of a program. Thus, in another aspect of the invention, a program is provided comprising code stored on a medium, such as a memory, or embodied by a signal, the code being readable and/or executable by at least one data processor unit, such as a processor on board a rotorcraft, in order to detect and locate any defects or adjustment errors of a rotorcraft rotor, the code comprising code segments for performing respective operations of a method of the invention.

In another aspect, the invention provides a diagnosis system for a rotorcraft rotor, the system comprising:
• a read member for reading a data medium and arranged to read data corresponding to measurements taken on the rotorcraft;
• a database containing reference vibratory signature data for the rotorcraft;
• a device for transforming the measurement data from the time domain to the frequency domain, which device is connected to the read member to receive therefrom the measurement data and to output vibratory signatures for analysis; and
• a calculation member connected to the database and to the transformation device and programmed to perform the operations of a method of the invention.

Other aspects, characteristics, and advantages of the invention appear in the following description, which refers to the accompanying drawings that show preferred implementations of the invention without any limiting character.

Figure 1 is a diagram of an SCLN with learning by vector quantization, using a vectorial representation for inputs and outputs and a matrix representation for layers of neurons.
Figure 2 is a diagram showing an SSON in which the neurons of the output space of dimension two are represented in the form of disks superposed on nodes of a plane square mesh shown in perspective.
Figures 3 to 6 are diagrams showing various mesh configurations for an output space of an SSOM, and the positions of the reference and/or setpoint neurons associated with portions of said space.
Figures 7 to 14 are diagrams showing various partitionings of an output space of an SSOM and the corresponding Voronoi diagrams, together with the positions of neurons representative of defect classes and of setpoint neurons.
Figure 15 is a diagram showing a redundant diagnosis system.
Figure 16 is a diagram of a diagnosis system with serial architecture.
Figure 17 shows another example of a diagnosis system, but with parallel architecture.

The classification of spectral data of unknown class, for detecting a possible defect or adjustment error of a rotorcraft rotor, requires prior learning by an ANN.
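For illustration only, a Python sketch (with hypothetical function names) of the serial, two-stage diagnosis described above and represented by the serial architecture of Figure 16: a first network identifies the defective sector (blade), and a second network identifies the defect or adjustment error present in that sector:

```python
def diagnose(signature, classify_sector, classify_defect):
    """Two-stage serial diagnosis sketch: a first network identifies the defective sector
    (blade), a second network identifies the defect or adjustment error within that sector.
    classify_sector and classify_defect stand in for trained networks of the kind described above."""
    sector = classify_sector(signature)          # e.g. "blade 3" or "no defect"
    if sector == "no defect":
        return "no defect"
    defect_type = classify_defect(signature)     # e.g. "pitch rod", "tab", "flyweight"
    return sector, defect_type
```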
Supervised learning consists in establishing a nonlinear algebraic relationship (by calculating synaptic weights) on the basis of pairs of examples associated two by two: the input vectors (observations) and the output vectors (setpoints). The input vectors characterize the states of a rotor, and the output vectors represent the membership class of the input vector. The purpose of the supervised learning algorithm is to minimize a learning error calculated as a function of the synaptic weights of the ANN and as a function of the input and output vectors. Non-supervised learning is generally used to classify data. It consists in calculating the synaptic weights solely on the basis of input vectors having similarity criteria that are unknown a priori, but that are estimated with the help of a metric (e.g. the Euclidean distance).

With reference to Figure 1, an ANN based on vector quantization is generally made up of a "hidden" competitive layer COMPET followed by a classification layer CLAS. The competitive layer contains a number Nc of hidden neurons; the classification layer contains a number q of output neurons, each representative of a membership class of the vectors x = (x_1, ..., x_p) forming a learning database. The matrix CW is the matrix of synaptic weights to be calculated during learning; it is constituted by Nc "weight" vectors m_i of dimension p: dim(m_i) = dim(x) = p. The matrix LW is the matrix of the q prototype vectors LW_i (the "codebook") of dimension Nc: dim(LW_i) = Nc. Each prototype vector encodes a class by means of its components, which are equal to 1 in a quantity proportional to the ratio Nc/q and equal to 0 elsewhere, the logic products of the prototype vectors LW_i taken two by two being zero.

Learning by vector quantization (VQ) is generally of the non-supervised competitive type, the class-membership information of the vectors of the training database not being used and/or not being known. The VQ algorithm comprises the following steps:
• selecting the initial learning vectors representing the q classes;
• selecting the parameters (Nc, q, CW, LW) of the structure of the ANN, and then initializing the parameters CW and LW;
• selecting the metric used for measuring similarity;
• selecting a learning rate a(t); and
• for each iteration t, and for each vector x: determining which neuron of the competitive layer is the elected neuron c in the minimum-distance sense (generally the Euclidean distance), c = argmin_i ||x - m_i||, and modifying the state of the weight vector m_c of the neuron c using the following learning rule: m_c(t+1) = m_c(t) + a(t)·(x - m_c(t)).

The effect of this rule is to bring the weight vector m_c (of the elected neuron c) closer to the vector x to be learnt. The number of iterations can be set empirically, or learning can be stopped once the classification rate of a set of unlearned vectors reaches a threshold. During the stage of classifying a vector x, the election process is identical; the output from the elected neuron c (the winner) is then set to 1 (0 elsewhere) to form a vector a1. The matrix product between a1 and LW activates the output corresponding to the class estimated for x.

The algorithms of the learning vector quantization (LVQ) class are supervised versions of the vector quantization (VQ) algorithm. Unlike in the VQ algorithm, if the elected neuron c corresponds to a class other than the membership class of the vector x that is presented, then the associated weight vector m_c is moved away from x.
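For illustration only, a Python sketch of the VQ learning and classification steps described above; the linearly decreasing learning rate and the variable names are assumptions of the sketch:

```python
import numpy as np

def vq_train(X, CW, n_iter=100, lr0=0.3):
    """Non-supervised vector quantization: for each signature x, the elected neuron
    c = argmin_i ||x - m_i|| has its weight vector moved towards x.
    X: (n, p) learning signatures; CW: (Nc, p) weight vectors m_i."""
    for t in range(n_iter):
        lr = lr0 * (1.0 - t / n_iter)       # decreasing learning rate a(t) (illustrative schedule)
        for x in X:
            c = int(np.argmin(np.linalg.norm(CW - x, axis=1)))
            CW[c] += lr * (x - CW[c])       # m_c(t+1) = m_c(t) + a(t)*(x - m_c(t))
    return CW

def vq_classify(x, CW, LW):
    """Classification: the elected neuron's output is set to 1 (0 elsewhere), forming a1;
    the product with the prototype matrix LW (q x Nc) activates the estimated class."""
    a1 = np.zeros(CW.shape[0])
    a1[int(np.argmin(np.linalg.norm(CW - x, axis=1)))] = 1.0
    return int(np.argmax(LW @ a1))
```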
The conditions for initializing and electing competing neurons in the LVQ algorithms are identical to those of the VQ algorithm; nevertheless, the way in which the weight vectors are modified is different, as explained in detail below.

The LVQ1 rule for modifying the weight vector m_c of the elected neuron c is applied under the following conditions: m_c(t+1) = m_c(t) + a(t)·(x - m_c(t)) provided the elected neuron c and the input vector x are of identical class, or m_c(t+1) = m_c(t) - a(t)·(x - m_c(t)) otherwise.

The optimized LVQ1 algorithm OLVQ1 is a variant of LVQ1 to which there is given an optimum learning rate a_c(t) specific to each class, in order to accelerate convergence. The learning rule is as follows: m_c(t+1) = m_c(t) + s(t)·a_c(t)·(x - m_c(t)), with a_c(t) = a_c(t-1)/(1 + s(t)·a_c(t-1)), where s(t) designates the classification of m_c(t): s(t) = 1 if the classification is correct, s(t) = -1 otherwise. If the learning rate a_c(t) increases, it is imperative to limit a_c(t) < 1 in order to avoid divergence. During initialization, a_i(t) must be defined so that 0.3
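For illustration only, a Python sketch of one OLVQ1 step consistent with the rule above; treating the learning rate as specific to each competitive neuron and the clipping value are assumptions of the sketch:

```python
import numpy as np

def olvq1_step(x, x_class, CW, classes, alpha):
    """One OLVQ1 step: s = +1 if the elected neuron carries the class of x, -1 otherwise.
    The weight vector is moved towards (s = +1) or away from (s = -1) x, and the learning
    rate is updated as a_c(t) = a_c(t-1)/(1 + s*a_c(t-1)), kept below 1 to avoid divergence.
    classes[i] is the class assigned to competitive neuron i; alpha holds one rate per neuron."""
    c = int(np.argmin(np.linalg.norm(CW - x, axis=1)))
    s = 1.0 if classes[c] == x_class else -1.0
    CW[c] += s * alpha[c] * (x - CW[c])                       # m_c(t+1) = m_c(t) + s(t)*a_c(t)*(x - m_c(t))
    alpha[c] = min(alpha[c] / (1.0 + s * alpha[c]), 0.999)    # limit a_c(t) < 1
    return CW, alpha
```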

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 2638-CHE-2007-IntimationOfGrant13-08-2020.pdf 2020-08-13
2 2638-CHE-2007-PatentCertificate13-08-2020.pdf 2020-08-13
3 2638-CHE-2007_Abstract_Granted_344032_13-08-2020.pdf 2020-08-13
4 2638-CHE-2007_Claims_Granted_344032_13-08-2020.pdf 2020-08-13
5 2638-CHE-2007_Description_Granted_344032_13-08-2020.pdf 2020-08-13
6 2638-CHE-2007_Drawings_Granted_344032_13-08-2020.pdf 2020-08-13
7 2638-CHE-2007_Marked Up Claims_Granted_344032_13-08-2020.pdf 2020-08-13
8 2638-CHE-2007-Correspondence_26-02-2020.pdf 2020-02-26
9 2638-CHE-2007-Form26_General Power of Attorney_26-02-2020.pdf 2020-02-26
10 2638-CHE-2007-2. Marked Copy under Rule 14(2) [21-02-2020(online)].pdf 2020-02-21
11 2638-CHE-2007-Annexure [21-02-2020(online)].pdf 2020-02-21
12 2638-CHE-2007-Certified Copy of Priority Document [21-02-2020(online)].pdf 2020-02-21
13 2638-CHE-2007-FORM 3 [21-02-2020(online)].pdf 2020-02-21
14 2638-CHE-2007-FORM-26 [21-02-2020(online)].pdf 2020-02-21
15 2638-CHE-2007-Information under section 8(2) [21-02-2020(online)].pdf 2020-02-21
16 2638-CHE-2007-Retyped Pages under Rule 14(1) [21-02-2020(online)].pdf 2020-02-21
17 2638-CHE-2007-Verified English translation [21-02-2020(online)].pdf 2020-02-21
18 2638-CHE-2007-Written submissions and relevant documents [21-02-2020(online)].pdf 2020-02-21
19 2638-CHE-2007-Correspondence_14-02-2020.pdf 2020-02-14
20 2638-CHE-2007-Form26_Power of Attorney_14-02-2020.pdf 2020-02-14
21 2638-CHE-2007-Correspondence to notify the Controller [07-02-2020(online)].pdf 2020-02-07
22 2638-CHE-2007-FORM-26 [07-02-2020(online)].pdf 2020-02-07
23 2638-CHE-2007-HearingNoticeLetter-(DateOfHearing-12-02-2020).pdf 2020-01-29
24 Correspondence by Agent_Form-1_22-11-2018.pdf 2018-11-22
25 2638-CHE-2007-FORM 3 [15-11-2018(online)].pdf 2018-11-15
26 2638-CHE-2007-ABSTRACT [13-11-2018(online)].pdf 2018-11-13
27 2638-CHE-2007-CLAIMS [13-11-2018(online)].pdf 2018-11-13
28 2638-CHE-2007-COMPLETE SPECIFICATION [13-11-2018(online)].pdf 2018-11-13
29 2638-CHE-2007-DRAWING [13-11-2018(online)].pdf 2018-11-13
30 2638-CHE-2007-FER_SER_REPLY [13-11-2018(online)].pdf 2018-11-13
31 2638-CHE-2007-FORM 3 [13-11-2018(online)].pdf 2018-11-13
32 2638-CHE-2007-Information under section 8(2) (MANDATORY) [13-11-2018(online)].pdf 2018-11-13
33 2638-CHE-2007-OTHERS [13-11-2018(online)].pdf 2018-11-13
34 2638-CHE-2007-PETITION UNDER RULE 137 [13-11-2018(online)].pdf 2018-11-13
35 2638-CHE-2007-Proof of Right (MANDATORY) [13-11-2018(online)].pdf 2018-11-13
36 2638-CHE-2007-FER.pdf 2018-05-15
37 Form 13.pdf 2014-04-11
38 2638-CHE-2007 POWER OF ATTORNEY 08-04-2014.pdf 2014-04-08
39 2638-CHE-2007 CORRESPONDENCE OTHERS 08-04-2014.pdf 2014-04-08
40 2638-CHE-2007 FORM-13 08-04-2014.pdf 2014-04-08
41 2638-che-2007-abstract.pdf 2011-09-04
42 2638-che-2007-claims.pdf 2011-09-04
43 2638-che-2007-correspondnece-others.pdf 2011-09-04
44 2638-che-2007-description(complete).pdf 2011-09-04
45 2638-che-2007-drawings.pdf 2011-09-04
46 2638-che-2007-form 1.pdf 2011-09-04
47 2638-che-2007-form 3.pdf 2011-09-04
48 2638-che-2007-form 5.pdf 2011-09-04
49 2638-CHE-2007 FORM-18 28-09-2010.pdf 2010-09-28

Search Strategy

1 2638CHE2007_14-05-2018.pdf

ERegister / Renewals

3rd: 11 Sep 2020 (from 14/11/2009 to 14/11/2010)
4th: 11 Sep 2020 (from 14/11/2010 to 14/11/2011)
5th: 11 Sep 2020 (from 14/11/2011 to 14/11/2012)
6th: 11 Sep 2020 (from 14/11/2012 to 14/11/2013)
7th: 11 Sep 2020 (from 14/11/2013 to 14/11/2014)
8th: 11 Sep 2020 (from 14/11/2014 to 14/11/2015)
9th: 11 Sep 2020 (from 14/11/2015 to 14/11/2016)
10th: 11 Sep 2020 (from 14/11/2016 to 14/11/2017)
11th: 11 Sep 2020 (from 14/11/2017 to 14/11/2018)
12th: 11 Sep 2020 (from 14/11/2018 to 14/11/2019)
13th: 11 Sep 2020 (from 14/11/2019 to 14/11/2020)
14th: 05 Nov 2020 (from 14/11/2020 to 14/11/2021)
15th: 03 Nov 2021 (from 14/11/2021 to 14/11/2022)
16th: 08 Nov 2022 (from 14/11/2022 to 14/11/2023)