
A System And A Method For Controlling Bionic Arm Using A Combination Of Electroencephalogram And Electromyogram Signals

Abstract: The present invention provides a system for controlling a bionic arm using EEG and EMG data. The system comprises an EEG and EMG digital data acquisition 101, an EEG and EMG data processor and said bionic arm. The bionic arm includes a wireless means 102, a microcontroller 103, a plurality of linear actuators and a servo motor 104. The data processor is configured for processing and extracting features 105 from both sets of data, classifying said extracted features 105 using machine learning classifiers 106, generating major commands 107 for controlling the bionic arm and sending said major commands 107 to the microcontroller 103 through the wireless means 102. The microcontroller 103 converts the major commands 107 into predefined micro commands 108. The micro commands 108 drive and control said linear actuators and servomotor 104 of the bionic arm and thereby control the bionic arm to perform a required action. A method for controlling said bionic arm using EEG and EMG data is also provided. Fig. 1.


Patent Information

Application #: 201841046127
Filing Date: 06 December 2018
Publication Number: 24/2020
Publication Type: INA
Invention Field: BIO-MEDICAL ENGINEERING
Status:
Email: sunita@skslaw.org
Parent Application:

Applicants

Amrita Vishwa Vidyapeetham
Amritapuri Campus, Kollam

Inventors

1. Ganesha Udupa
#03- Nachiketa, Kollam, 690525
2. G. Jayachandran Nair
Amrita Guest house, Vallikavu, Kollam, 690525
3. Gayathri Girija
Sreenikethan, Prayar P.O., Ochira, Alpuzha, 690547

Specification

FIELD OF INVENTION
[0001] The embodiments herein generally relate to the field of robotics. More specifically, the invention provides a system and a method for controlling a bionic arm electromechanically. Particularly, the invention provides a control system for controlling a bionic arm using electroencephalogram (EEG) and electromyogram (EMG) signal data.
BACKGROUND AND PRIOR ART
[0002] The human hand is capable of a wide range of movements, from lifting heavy objects to handling the tiniest of tools, and it has the flexibility to grip around an object to hold it securely. Many robotic hands have been built to mimic the human hand and provide the same range of movements and flexibility. Robotic hands are used in industry to lift heavy objects, in medical science to provide prosthetic hands or assist in surgery, in situations where objects must be lifted from places dangerous or unreachable to humans, as well as in space applications and in nuclear plants. An ideal robotic hand should be compact, light-weight, sturdy and cost-effective, and usable in both industrial and commercial applications. However, there are a few hurdles to overcome before a robotic hand can mimic a human hand in all aspects.
[0003] Research in brain-controlled interfaces has wide application in medical, navigation, entertainment, education and other fields. Prosthetics are one of the major tools for amputees and persons with impaired motor function. At present, a functional bionic hand costs between 12,000 USD and 50,000 USD. Due to this, amputees in developing countries like India find it difficult to afford such devices.
[0004] In the prior art, methods have been developed to control prosthetic limbs using EEG signals. EEG serves as a source of physiological signals for brain computer interface (BCI) systems as it can capture brain activities. Eye blinks, chewing, muscular movements and external electrical pickups are usually considered as artefacts in EEG signals, and they have large amplitudes. Though ocular signals in EEG are removed in most brain signal analysis, these signals are also used for control applications. Eye-blinks are ocular signals with large amplitude compared to other brain signals.
[0005] EEG ocular signals have been used for navigation and control of a wheelchair by Fred Achic et al. in "Hybrid BCI System to Operate an Electric Wheelchair and a Robotic Arm for Navigation and Manipulation Tasks". Eye movements were used as a control for a real-time text speller BCI system by Raheel et al. in "Real time text speller based on eye movement classification using wearable EEG sensors". A method for switching a home lighting system using eye-blinks was developed by Rani et al. in "Detection of eye blinks from EEG signals for home lighting system activation", with a success rate of 85% in detecting eye-blinks. Classification of eye-blinks using an artificial neural network (ANN) with time-domain EEG was done by Poorna et al. in "Classification of EEG based control using ANN and KNN - A Comparison". However, existing systems may not be efficient enough to generate coded commands and thereby control the bionic arm to perform a required action.
[0006] In the prior art, there are also methods developed to control prosthetic limbs using EMG signals. EMG stands for electromyography, the electrical signals of the muscles; EMG is also referred to as myoelectric activity. The muscle tissues conduct electrical potentials in a similar way to the nerves. These electrical potentials result in the expansion and contraction of the muscles, which in turn results in movement. The EMG signals are obtained by measuring the difference of electrical potential between two electrodes.
[0007] A gesture recognition system with EMG as the control input was designed in "A new subtle hand gesture recognition algorithm based on EMG and FSR" by B. Wan in 2017. An improved electromyography system was designed in "Electromyography and inertial sensor-based gesture detection and control" by A. R. Ferixo in 2015. However, these systems lack the potential to efficiently control a prosthetic device for daily applications.
[0008] Therefore, there is a need to develop a system and a method for controlling a bionic arm using a combination of EEG and EMG data signals and thereby driving said bionic arm to perform a required action.
OBJECTS OF THE INVENTION
[0009] Some of the objects of the present disclosure are described herein below:
[00010] A main object of the present invention is to provide a system and a
method for controlling a bionic arm using electroencephalogram (EEG) data.
[00011] Another object of the present invention is to provide a system and a
method for controlling a bionic arm using electromyogram (EMG) data to
perform required action by the bionic arm.
[00012] Still another object of the present invention is to provide a system and a
method for controlling a bionic arm using a combination of EEG and EMG signals
for proper gesture detection and performing the required action by the bionic arm.

[00013] Yet another object of the present invention is to provide a system for controlling a bionic arm that is capable of converting major commands into minor tasks and thereby driving said bionic arm to perform the required action.
[00014] The other objects and advantages of the present invention will be apparent from the following description when read in conjunction with the accompanying drawings, which are incorporated for illustration of preferred embodiments of the present invention and are not intended to limit the scope thereof.
SUMMARY OF THE INVENTION
[00015] In view of the foregoing, an embodiment herein provides a system and a method for controlling a bionic arm using electroencephalogram (EEG) and electromyogram (EMG) data. According to an embodiment, said system comprises an electroencephalogram (EEG) and electromyogram (EMG) digital data acquisition configured to provide the EEG and EMG data, an EEG and EMG data processor, and the bionic arm. According to an embodiment, the bionic arm includes a wireless means, a microcomputer, a microcontroller, a plurality of linear actuators and a servo motor.
[00016] According to an embodiment, the EEG data processor is configured for processing and extracting features from the EEG data, and the EMG data processor is configured for processing and extracting features from the EMG data. According to an embodiment, processing and extracting features from said EEG and EMG data is performed by independent component analysis. According to an embodiment, the extracted features include spectral energy, prominent spectral amplitudes and frequencies, cumulative width and cumulative number of peaks for the EEG, and root mean square, mean absolute value, slope sign change, zero crossing and waveform length for the EMG.
[00017] The EEG and EMG data processor is further configured for classifying said extracted features using machine learning classifiers and thereby generating major commands for controlling the bionic arm. According to an embodiment, the machine learning classifiers include linear discriminant analysis (LDA), multiclass support vector machine (SVM) or K-Nearest Neighbors (KNN).
[00018] The EEG and EMG data processor is further configured for sending the major commands to the microcontroller in the bionic arm through the wireless means. According to an embodiment, the major commands are ocular signals in the form of coded eye-blinks, wherein the coded eye-blinks include 2 eye-blinks, 3 eye-blinks and 4 eye-blinks, and various gesture codes.
[00019] According to an embodiment, the microcontroller is configured for converting the major commands into predefined micro commands. The micro commands drive and control the linear actuators and servomotor of the bionic arm and thereby control the bionic arm to perform a required action. According to an embodiment, the minor commands include various grasp patterns such as two-finger grip, three-finger grasp, power grasp, cylindrical grasp, handshake, tripod pinch and hook grasp.
[00020] According to an embodiment, a method for controlling a bionic arm using the system is provided; wherein said method comprises acquiring electroencephalogram (EEG) and electromyogram (EMG) data from the EEG and EMG digital data acquisition respectively, processing and extracting features from said EEG and EMG data by an EEG and EMG data processor, classifying said extracted features using machine learning classifiers by the EEG and EMG data processor and thereby generating major commands for controlling the bionic arm, sending the major commands to a microcontroller in the bionic arm through a wireless means by the EEG and EMG data processor, converting the major commands into predefined micro commands by the microcontroller, and driving and controlling the linear actuators and servomotor of the bionic arm using the minor commands and thereby controlling the bionic arm to perform a required action.
[00021] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF DRAWINGS
[00022] The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
[00023] Fig. 1 illustrates a general block diagram of a system for controlling a bionic arm using electroencephalogram (EEG) and electromyogram (EMG) data, according to an embodiment herein;
[00024] Fig. 2 illustrates a block diagram showing the various components of the control system, according to an embodiment herein;

[00025] Fig.3 illustrates a graphical representation showing components of EEG data after independent component analysis (ICA), according to an embodiment herein;
[00026] Fig. 4 illustrates a graphical representation showing the preprocessed EMG signal acquired from the sensor, according to an embodiment herein;
[00027] Fig. 5 illustrates a table showing major commands and pre-defined minor tasks, according to an embodiment herein;
[00028] Fig.6 illustrates various grasp obtained by the bionic arm, according to an embodiment herein;
[00029] Fig.7 illustrates a method for controlling a bionic arm using EEG and EMG data, according to an embodiment herein; and
[00030] Fig.8 illustrates various parts of the bionic arm, according to an embodiment herein.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[00031] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[00032] As mentioned above, there is a need for a system and a method for controlling a bionic arm using electroencephalogram (EEG) and electromyogram (EMG) data. The embodiments herein achieve this by providing a system that is capable of generating coded commands and thereby driving said bionic arm to perform a required action. Referring now to the drawings, and more particularly to FIGS. 1 through 8, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[00033] Fig. 1 illustrates a general block diagram 100 of a system for controlling a bionic arm using electroencephalogram (EEG) and electromyogram (EMG) data, according to an embodiment. According to an embodiment, said system comprises an electroencephalogram (EEG) and electromyogram (EMG) digital data acquisition 101 configured to provide the EEG and EMG data, an EEG and EMG data processor and the bionic arm. The EEG data is acquired using a 16-channel Epoc+ headset. The headset consists of 14 data channels and 2 reference channels. The electrodes are positioned based on the 10-20 montage. The data is acquired at a sampling rate of 128 samples/sec/channel and is sent to the processor over secure Wi-Fi. The EMG data is acquired using an 8-channel EMG armband. The armband consists of 8 EMG sensors. The EMG data is acquired at a sampling rate of 200 samples/sec/channel and is sent to the processor over secure low-energy Bluetooth. According to an embodiment, the bionic arm includes a wireless means 102, a microcontroller 103 and a plurality of linear actuators and a servo motor 104.
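The acquisition parameters above can be summarized in a small configuration sketch. This is an illustrative Python snippet, not part of the patent; the class and field names are assumptions introduced here for clarity.

```python
# Illustrative configuration objects for the two acquisition front-ends
# described above (names and structure are assumptions, not the patent's).
from dataclasses import dataclass

@dataclass
class AcquisitionConfig:
    device: str
    data_channels: int
    reference_channels: int
    sampling_rate_hz: int      # samples per second per channel
    transport: str

EEG_ACQUISITION = AcquisitionConfig("Epoc+ headset", data_channels=14,
                                    reference_channels=2, sampling_rate_hz=128,
                                    transport="secure Wi-Fi")
EMG_ACQUISITION = AcquisitionConfig("EMG armband", data_channels=8,
                                    reference_channels=0, sampling_rate_hz=200,
                                    transport="Bluetooth Low Energy")
```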
[00034] According to an embodiment, the EEG data processor is configured for processing and extracting features from the EEG data. The input EEG data is pre-processed by removing the DC offset. A band-pass filter with a pass-band of 4 Hz to 40 Hz is used for filtering the noise. The EEG data is also smoothed using a 7-point moving average filter. Polynomial fitting may be used for time-varying baseline correction.
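A minimal sketch of the EEG pre-processing chain described above (DC offset removal, 4-40 Hz band-pass filtering, 7-point moving-average smoothing and polynomial baseline correction), written in Python with SciPy. The filter order and polynomial degree are illustrative assumptions not specified in the text.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_eeg(x, fs=128, poly_degree=3):
    """Pre-process one EEG channel sampled at fs Hz."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                                      # remove DC offset
    b, a = butter(4, [4.0, 40.0], btype="bandpass", fs=fs)
    x = filtfilt(b, a, x)                                 # 4-40 Hz band-pass filter
    x = np.convolve(x, np.ones(7) / 7.0, mode="same")     # 7-point moving average
    t = np.arange(len(x))
    baseline = np.polyval(np.polyfit(t, x, poly_degree), t)
    return x - baseline                                   # time-varying baseline correction
```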
[00035] According to an embodiment, EMG data processor is configured for
processing and extracting features from the EMG data. The input EMG data is
pre-processed by removing DC offset. A notch filter is used for filtering the noise
at 50 Hz and 60 Hz.
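A corresponding sketch for the EMG pre-processing described above: DC offset removal followed by notch filtering at 50 Hz and 60 Hz. The notch quality factor is an illustrative assumption.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

def preprocess_emg(x, fs=200, q=30.0):
    """Pre-process one EMG channel sampled at fs Hz."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                       # remove DC offset
    for f0 in (50.0, 60.0):                # mains interference frequencies
        b, a = iirnotch(f0, q, fs=fs)
        x = filtfilt(b, a, x)              # notch out the interference
    return x
```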
[00036] Fig.2 illustrates a basic functional block diagram of the system. The EEG
and EMG sensors are capable of sensing the EEG and EMG data signals
respectively. The pre-filtering and processing components are capable of filtering
and processing the obtained EEG and EMG data signals.
[0037] Fig. 3 illustrates a graphical representation 300 showing components of EEG data after independent component analysis (ICA), according to an embodiment herein. According to an embodiment, processing and extracting features from said EEG data is performed by independent component analysis. Independent component analysis (ICA) is a method used for separating a multivariate signal into additive sub-components. The preprocessed data is decomposed into independent components using ICA. Thresholding is applied on the selected ICA components. Each EEG ICA component is smoothed and peaks are identified using a time-based window. The number of peaks in each window is identified as the command number and the commands (blinks) are extracted from each window. After extracting the commands, the corresponding spectrum is obtained.
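A minimal sketch of the blink-command extraction described in this paragraph: the multi-channel EEG is decomposed with ICA, a selected (ocular) component is thresholded, and the peaks inside each time window are counted as the command number. The threshold level, window length and minimum peak spacing are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks
from sklearn.decomposition import FastICA

def extract_blink_commands(eeg, fs=128, component=0, window_sec=4.0):
    """eeg: array of shape (n_samples, n_channels) of pre-processed EEG."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)                 # independent components
    s = np.abs(sources[:, component])                # selected ocular component
    s[s < 2.0 * s.std()] = 0.0                       # threshold out small activity
    win = int(window_sec * fs)
    commands = []
    for start in range(0, len(s) - win + 1, win):    # time-based windows
        peaks, _ = find_peaks(s[start:start + win], distance=int(0.25 * fs))
        commands.append(len(peaks))                  # number of peaks = command number
    return commands
```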
[0038] After applying ICA, the required ICA components are selected and features for classification are obtained. According to an embodiment, the extracted features 105 include spectral energy, prominent spectral amplitudes and frequencies, cumulative width and cumulative number of peaks. Regarding spectral energy, the total power spectrum energy is obtained for each command window by summing the squares of the individual amplitudes. This spectral energy is normalized to avoid numerical instability in computation.
[0039] Prominent peaks in the log FFT [a Fast Fourier transform is applied on each spectrum and the logarithm of the resulting FFT spectrum is taken (log FFT)] represent important features for classification. They are also related to the command number. The peak values of these spectra are obtained using a peak-finding method. The first two peaks in the log FFT and the corresponding frequencies are required. The pulse width gives the prominent low-frequency content in the EEG signal. The cumulative pulse duration and the total number of peaks in each window are proportional to the total count of blinks. The above-mentioned features are extracted and given to the classification algorithm.
[00040] The EEG data processor is further configured for classifying said extracted features 105 using machine learning classifiers 106 and thereby generating major commands 107 for controlling the bionic arm. According to an embodiment, the machine learning classifiers 106 are selected from a group consisting of linear discriminant analysis (LDA), multiclass support vector machine (SVM) and K-Nearest Neighbor (KNN).
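The EEG features listed in the two preceding paragraphs can be computed per command window along the following lines; this is an illustrative sketch, and the peak-selection and normalization details are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks, peak_widths

def eeg_window_features(x, fs=128):
    """Return the spectral and peak features for one command window x."""
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    energy = np.sum(spectrum ** 2) / len(spectrum)     # normalized spectral energy
    log_fft = np.log(spectrum + 1e-12)                 # log FFT of the spectrum
    spec_peaks, _ = find_peaks(log_fft)
    top = spec_peaks[np.argsort(log_fft[spec_peaks])[::-1][:2]]  # first two prominent peaks
    time_peaks, _ = find_peaks(np.abs(x))
    widths = peak_widths(np.abs(x), time_peaks, rel_height=0.5)[0]
    return {
        "spectral_energy": energy,
        "peak_amplitudes": log_fft[top],
        "peak_frequencies": freqs[top],
        "cumulative_width": float(widths.sum()),       # low-frequency (pulse-width) content
        "num_peaks": len(time_peaks),                  # proportional to the blink count
    }
```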
[00041] Multiclass SVM assigns labels to instances using SVM. It reduces a single multiclass problem into multiple binary classification problems. The SVM is a classifier which fits a separating plane between two data sets with a margin, keeping a minimum distance from the control points. The points which are closer to the separating plane are given larger weights compared to the farther ones. A fraction of the total number of points is used as control vectors to fit the hyperplane. Linear Discriminant Analysis is a supervised learning algorithm where classes are separated using a weighted combination of all features. It is similar to regression analysis of a one-dimensional time series, where a set of known values is used to predict newer values. In multiple dimensions, a separating plane between two or more sets of data is fitted. The hyperplane is fitted with the least distance of all points from the plane; the distance metric is Cartesian and a likelihood function is used. After classification, major commands (coded eye-blinks, i.e., the number of eye-blinks) are generated.
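A minimal sketch of this classification stage with scikit-learn: the feature vectors from each command window are classified with LDA, a multiclass SVM or KNN to yield the major command (the coded blink count). The hyper-parameters and the shape of the training data are illustrative assumptions.

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

CLASSIFIERS = {
    "lda": LinearDiscriminantAnalysis(),
    "svm": SVC(kernel="linear", decision_function_shape="ovr"),  # one-vs-rest multiclass SVM
    "knn": KNeighborsClassifier(n_neighbors=5),
}

def classify_commands(name, X_train, y_train, X_new):
    """Fit the chosen classifier on labelled windows and predict major commands."""
    clf = CLASSIFIERS[name]
    clf.fit(X_train, y_train)            # y_train: blink-count labels, e.g. 2, 3, 4
    return clf.predict(X_new)            # predicted major commands
```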
[00042] Fig. 4 illustrates a graphical representation 400 showing components of EMG data after independent component analysis (ICA), according to an embodiment herein. According to an embodiment, processing and extracting features from said EMG data is performed by independent component analysis. Independent component analysis (ICA) is a method used for separating a multivariate signal into additive sub-components. The preprocessed data is decomposed into independent components using ICA. Thresholding is applied on the selected ICA components. Each EMG ICA component is smoothed and peaks are identified using a time-based window.
[00043] Fig. 5 illustrates a table 500 showing major commands and pre-defined minor tasks. The EEG data processor is further configured for sending the major commands 107 to the microcontroller 103 in the bionic arm through the wireless means 102. According to an embodiment, the major commands 107 are ocular signals in the form of coded eye-blinks, wherein the coded eye-blinks include 2 eye-blinks, 3 eye-blinks and 4 eye-blinks. According to an embodiment, the microcontroller 103 is configured for converting the major commands 107 into predefined micro commands 108. The micro commands 108 drive and control the servomotors 104 of the bionic arm and thereby control the bionic arm to perform a required action. According to an embodiment, the minor commands 108 include two-finger grip, three-finger grasp and power grasp. For example, when the major command 107 is 4 eye-blinks, the corresponding task (minor command 108) performed is a power grasp.
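The mapping from major commands to predefined micro commands (as in the table of Fig. 5) can be sketched as a simple lookup. Only the pairing of 4 eye-blinks with the power grasp is stated in the text; the other pairings and the actuator interface shown here are hypothetical placeholders.

```python
# Hypothetical major-command -> micro-command lookup on the microcontroller side.
MAJOR_TO_MICRO = {
    2: "two_finger_grip",     # assumed pairing
    3: "three_finger_grasp",  # assumed pairing
    4: "power_grasp",         # pairing given in the description
}

def execute_major_command(blink_count, arm):
    """Convert a major command into its predefined micro command and drive the arm."""
    micro = MAJOR_TO_MICRO.get(blink_count)
    if micro is not None:
        arm.run(micro)        # 'arm.run' is a hypothetical actuator-driver call
```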
[00044] Fig. 6 illustrates a table 600 showing various grasp patterns which are implemented by the bionic arm. The various grasp patterns are cylindrical grasp, tripod grasp, precision grasp, pinch grasp, power grasp and prismatic grasp. According to an embodiment, the cylindrical grasp is used for picking up a cylindrical object. The tripod grasp is used to hold pens or small objects. The pinch grasp is used to pick up tiny objects. The power grasp is required to pick up heavy objects.
[00045] Exemplary methods for implementing the system for controlling a bionic arm using EEG and EMG data are described with reference to Fig. 7. The methods are illustrated as a collection of operations in a logical flow graph representing a sequence of operations that can be implemented in hardware, software, firmware, or a combination thereof. The order in which the methods are described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the methods, or alternate methods. Additionally, individual operations may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. In the context of software, the operations represent computer instructions that, when executed by one or more processors, perform the recited operations.
[00046] Fig. 7 illustrates a method 700 for controlling a bionic arm using EEG and EMG data, according to an embodiment. A method for controlling a bionic arm using a system; wherein said method comprises:

[00047] at block 701, acquiring electroencephalogram (EEG) and
electromyogram (EMG) data from an EEG and EMG digital data acquisition 101,
[00048] at block 702, processing and extracting features 105 from said EEG and
EMG data by the data processor,
[00049] at block 703, classifying said extracted features 105 using machine
learning classifiers 106 by the data processor and thereby generating major
commands 107 for controlling bionic arm,
[00050] at block 704, sending the major commands 107 to a microcontroller 103
in the bionic arm through a wireless means 102 by the data processor,
[00051] at block 705, converting the major commands 107 into predefined micro
commands 108 by the microcontroller, and
[00052] at block 706, driving and controlling servomotors 104 of the bionic arm
using the minor commands 108 and thereby controlling the bionic arm to perform
a required action.
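The flow of blocks 701 through 706 can be summarized in a short sketch that chains the stages together; the helper callables stand for the illustrative pre-processing, feature-extraction and classification sketches above, and send_command() is a hypothetical stand-in for the wireless link to the microcontroller.

```python
def run_control_cycle(acquire, preprocess, extract_features, classify, send_command):
    """One pass through the control method of Fig. 7 (blocks 701-704 on the processor)."""
    raw = acquire()                         # block 701: acquire EEG/EMG data
    clean = preprocess(raw)                 # block 702: process the data
    features = extract_features(clean)      # block 702: extract features
    major_command = classify(features)      # block 703: classify into a major command
    send_command(major_command)             # block 704: send to the microcontroller
    # Blocks 705-706 run on the microcontroller: the major command is converted into
    # predefined micro commands that drive the servomotors/actuators of the arm.
```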
[00053] Fig. 8 illustrates various components 800 of the said bionic arm, according to an embodiment. The human hand 801 is the part onto which the whole bionic arm is fixed. The forearm harness 802 helps to keep the arm fixed to the amputee's hand. The thumb 803 of the bionic arm is actuated with the servomotor. The linear actuator 804 is provided for controlling the individual fingers. The top phalange 805 of the finger consists of sensors. The force sensor 806 is used for the closed-loop control of the bionic arm. The silicone fingertip 807 is provided for attaining the natural look of the bionic arm. The bionic arm further includes a control unit 808.

[00054] A main advantage of the present invention is that the provided system and method use a combination of electroencephalogram (EEG) and electromyogram (EMG) data for controlling a bionic arm.
[00055] Another advantage of the present invention is that the provided system and method use a combination of electroencephalogram (EEG) and electromyogram (EMG) data for controlling a bionic arm and thereby drive the bionic arm to perform a required action.
[00056] Still another advantage of the present invention is that the provided system and method use ocular signals in the form of coded eye-blinks from EEG and various EMG signals for different gestures for controlling a bionic arm.
[00057] Yet another advantage of the present invention is that the provided system for controlling a bionic arm is capable of converting major commands to minor tasks.
[00058] Another advantage of the present invention is that the provided system is highly efficient and accurate.
[00059] The foregoing description of the specific embodiments will so fully
reveal the general nature of the embodiments herein that others can, by applying
current knowledge, readily modify and/or adapt for various applications such
specific embodiments without departing from the generic concept, and, therefore,
such adaptations and modifications should and are intended to be comprehended
within the meaning and range of equivalents of the disclosed embodiments. It is to
be understood that the phraseology or terminology employed herein is for the
purpose of description and not of limitation. Therefore, while the embodiments
herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.

We Claim:
1. A system for controlling a bionic arm using electroencephalogram (EEG) and electromyogram (EMG) data; wherein said system comprises: an electroencephalogram (EEG) and electromyogram (EMG) digital data acquisition 101 configured to provide said EEG and EMG data; characterized in that
wherein said system further includes an EEG and EMG data processor and said bionic arm;
wherein said bionic arm includes a wireless means 102, a microcontroller 103, plurality of linear actuators and servo motor 104; wherein said EEG and EMG data processor configured for processing and extracting features 105 from said EEG and EMG data; wherein said EEG and EMG data processor further configured for classifying said extracted features 105 using machine learning classifiers 106 and thereby generating major commands 107 for controlling bionic arm;
wherein said EEG and EMG data processor further configured for sending said major commands 107 to the microcontroller 103 in the bionic arm through the wireless means 102;
wherein said microcontroller 103 configured for converting the major commands 107 into predefined micro commands 108; and wherein said micro commands 108 drive and control said servomotors 104 of the bionic arm and thereby control the bionic arm to perform a required action.

2. The system as claimed in claim 1, wherein said processing and extracting features 105 from said EEG and EMG data is performed by independent component analysis.
3. The system as claimed in claim 2, wherein said extracted features 105 include spectral energy, prominent spectral amplitudes and frequencies, cumulative width, cumulative number of peaks, root mean square value, mean absolute value, slope sign change, zero crossing and waveform length.
4. The system as claimed in claim 3, wherein said machine learning classifiers 106 are selected from a group consisting of linear discriminant analysis (LDA), multiclass support vector machine (SVM) and K-Nearest Neighbor (KNN).
5. The system as claimed in claim 4, wherein said major commands 107 are ocular signals in the form of coded eye-blinks and signals for various gestures, wherein said coded eye-blinks include 2 eye-blinks, 3 eye-blinks and 4 eye-blinks.
6. The system as claimed in claim 1, wherein said wireless means 102 in the bionic arm includes Bluetooth or RFID or Wi-Fi.
7. The system as claimed in claim 1, wherein said minor commands 108 include two finger grip, three finger grasp, cylindrical grasp, lateral grasp, tripod grasp, handshake and power grasp.
8. A method for controlling a bionic arm using a system; wherein said method comprises:
acquiring electroencephalogram (EEG) and electromyogram (EMG) data from an EEG and EMG digital data acquisition 101, wherein said system includes the EEG and EMG digital data acquisition 101, an EEG and
EMG data processor and said bionic arm;
processing and extracting features 105 from said EEG and EMG data by
the EEG and EMG data processor;
classifying said extracted features 105 using machine learning classifiers
106 by the EEG and EMG data processor and thereby generating major
commands 107 for controlling bionic arm;
sending said major commands 107 to a microcontroller 103 in the bionic
arm through a wireless means 102 by the EEG and EMG data processor,
wherein said bionic arm includes said microcontroller 103, said wireless
means 102, plurality of linear actuators and servomotor 104;
converting the major commands 107 into predefined micro commands 108
by the microcontroller 103; and
driving and controlling said linear actuators and servomotor 104 of the
bionic arm using said minor commands 108 and thereby controlling the
bionic arm to perform a required action.

Documents

Application Documents

# Name Date
1 201841046127-STATEMENT OF UNDERTAKING (FORM 3) [06-12-2018(online)].pdf 2018-12-06
2 201841046127-POWER OF AUTHORITY [06-12-2018(online)].pdf 2018-12-06
3 201841046127-FORM 1 [06-12-2018(online)].pdf 2018-12-06
4 201841046127-DRAWINGS [06-12-2018(online)].pdf 2018-12-06
5 201841046127-COMPLETE SPECIFICATION [06-12-2018(online)].pdf 2018-12-06
6 201841046127-Proof of Right (MANDATORY) [21-01-2019(online)].pdf 2019-01-21
7 201841046127-FORM-26 [21-01-2019(online)].pdf 2019-01-21
8 201841046127-ENDORSEMENT BY INVENTORS [21-01-2019(online)].pdf 2019-01-21
9 Correspondence by Agent_Form-1, Fom-5, POA_24-01-2019.pdf 2019-01-24
10 201841046127-FORM 18 [20-01-2021(online)].pdf 2021-01-20
11 201841046127-FORM-26 [15-12-2021(online)].pdf 2021-12-15
12 201841046127-POA [01-01-2022(online)].pdf 2022-01-01
13 201841046127-MARKED COPIES OF AMENDEMENTS [01-01-2022(online)].pdf 2022-01-01
14 201841046127-FORM 13 [01-01-2022(online)].pdf 2022-01-01
15 201841046127-AMENDED DOCUMENTS [01-01-2022(online)].pdf 2022-01-01
16 201841046127-FER.pdf 2022-01-05
17 201841046127-EDUCATIONAL INSTITUTION(S) [05-01-2022(online)].pdf 2022-01-05
18 201841046127-FORM-26 [30-06-2022(online)].pdf 2022-06-30
19 201841046127-Proof of Right [04-07-2022(online)].pdf 2022-07-04
20 201841046127-MARKED COPIES OF AMENDEMENTS [04-07-2022(online)].pdf 2022-07-04
21 201841046127-FORM 13 [04-07-2022(online)].pdf 2022-07-04
22 201841046127-AMMENDED DOCUMENTS [04-07-2022(online)].pdf 2022-07-04
23 201841046127-FER_SER_REPLY [05-07-2022(online)].pdf 2022-07-05
24 201841046127-COMPLETE SPECIFICATION [05-07-2022(online)].pdf 2022-07-05
25 201841046127-CLAIMS [05-07-2022(online)].pdf 2022-07-05
26 201841046127-US(14)-HearingNotice-(HearingDate-03-06-2024).pdf 2024-05-03
27 201841046127-Response to office action [31-05-2024(online)].pdf 2024-05-31
28 201841046127-Correspondence to notify the Controller [01-06-2024(online)].pdf 2024-06-01
29 201841046127-Response to office action [18-06-2024(online)].pdf 2024-06-18
30 201841046127-MARKED COPY [18-06-2024(online)].pdf 2024-06-18
31 201841046127-MARKED COPIES OF AMENDEMENTS [18-06-2024(online)].pdf 2024-06-18
32 201841046127-FORM 13 [18-06-2024(online)].pdf 2024-06-18
33 201841046127-CORRECTED PAGES [18-06-2024(online)].pdf 2024-06-18
34 201841046127-AMMENDED DOCUMENTS [18-06-2024(online)].pdf 2024-06-18
35 201841046127-FORM-8 [19-06-2024(online)].pdf 2024-06-19
36 201841046127-US(14)-ExtendedHearingNotice-(HearingDate-03-02-2025)-1600.pdf 2025-01-30
37 201841046127-Proof of Right [03-02-2025(online)].pdf 2025-02-03
38 201841046127-Written submissions and relevant documents [07-02-2025(online)].pdf 2025-02-07
39 201841046127-Proof of Right [07-02-2025(online)].pdf 2025-02-07
40 201841046127-Annexure [07-02-2025(online)].pdf 2025-02-07
41 201841046127-FORM-24 [19-03-2025(online)].pdf 2025-03-19
42 201841046127-ReviewPetition-HearingNotice-(HearingDate-07-04-2025).pdf 2025-03-21
43 201841046127-Correspondence to notify the Controller [21-03-2025(online)].pdf 2025-03-21
44 201841046127-Written submissions and relevant documents [22-04-2025(online)].pdf 2025-04-22
45 201841046127-FORM-8 [22-04-2025(online)].pdf 2025-04-22
46 201841046127-Annexure [22-04-2025(online)].pdf 2025-04-22

Search Strategy

1 SearchHistory201841046127E_21-12-2021.pdf