
Relevance Score Assignment For Artificial Neural Network

Abstract: Relevance scores are assigned to a set of items onto which an artificial neural network is applied by redistributing an initial relevance score, derived from the network output, onto the set of items: the initial relevance score is reversely propagated through the artificial neural network so as to obtain a relevance score for each item. In particular, this reverse propagation becomes applicable to a broader set of artificial neural networks and/or at lower computational effort by performing it in a manner such that, for each neuron, preliminarily redistributed relevance scores of a set of downstream neighbor neurons of the respective neuron are distributed onto a set of upstream neighbor neurons of the respective neuron according to a distribution function.


Patent Information

Application #
Filing Date
18 September 2017
Publication Number
39/2017
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application
Patent Number
Legal Status
Grant Date
2024-03-22
Renewal Date

Applicants

FRAUNHOFER-GESELLSCHAFT ZUR FÖRDERUNG DER ANGEWANDTEN FORSCHUNG E.V.
Hansastrasse 27c 80686 München
TECHNISCHE UNIVERSITÄT BERLIN
Strasse des 17. Juni 135 10623 Berlin

Inventors

1. BACH, Sebastian
Klarenbachstr. 8 10553 Berlin
2. SAMEK, Wojciech
Mühlenstr. 66 A 12249 Berlin
3. MÜLLER, Klaus Robert
Arno Holz Str. 12 F 12165 Berlin
4. BINDER, Alexander
Novalisstr. 16 10115 Berlin
5. MONTAVON, Grégoire
Schöneberger Str. 9 10963 Berlin

Specification

WHAT IS CLAIMED IS:
1. Apparatus for assigning a relevance score to a set of items, the
relevance score indicating a relevance with respect to an application of an
artificial neural network (10) composed of neurons (12) onto the set (16) of
items (42) so as to map the set (16) of items (42) onto a network output (18),
the apparatus being configured to:
redistribute an initial relevance score (R) derived from the network output (18) onto the set (16) of items (42) by reversely propagating the initial relevance score through the artificial neural network (10) so as to obtain a relevance score for each item,
wherein the apparatus is configured to perform the reverse propagation in a manner so that for each neuron, preliminarily redistributed relevance scores of a set of downstream neighbor neurons of the respective neuron are distributed to a set of upstream neighbor neurons of the respective neuron using a distribution function.
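As an illustration of the reverse propagation of claim 1, the sketch below redistributes the output of a small fully connected ReLU network back to its inputs in proportion to the weighted activations z_ij = x_i·w_ij. This is only one admissible distribution function, not the full scope of the claim; biases are ignored, a tiny eps guards against zero denominators, and the function names are ours.

```python
import numpy as np

def forward(weights, biases, x):
    """Forward pass through a fully connected ReLU network,
    keeping every layer's activation for the backward pass."""
    activations = [x]
    for W, b in zip(weights, biases):
        x = np.maximum(0.0, W @ x + b)
        activations.append(x)
    return activations

def lrp(weights, activations, eps=1e-9):
    """Reverse-propagate the network output as relevance: each neuron's
    preliminarily redistributed relevance is distributed to its upstream
    neighbors in proportion to their weighted activations z_ij = x_i*w_ij."""
    R = activations[-1].copy()               # initial relevance from the output
    for W, x in zip(reversed(weights), reversed(activations[:-1])):
        z = W * x                            # z[j, i] = x_i * w_ij, shape (out, in)
        denom = z.sum(axis=1, keepdims=True) + eps
        R = ((z / denom) * R[:, None]).sum(axis=0)   # sum of incoming messages
    return R
```

With non-negative weights and inputs the redistribution is exactly conservative: the item relevances sum to the network output.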
2. Apparatus according to claim 1 wherein the apparatus is configured such that the distribution function has a relevance conservation property.
3. Apparatus according to claim 1 or 2, wherein the apparatus is configured to perform the reverse propagation using one and the same distribution function for all neurons of the artificial neural network.
4. Apparatus according to any of claims 1 to 3, wherein the apparatus is configured such that the distribution function is a function of

weights of the artificial neural network, determining a degree of influence of the respective neuron by the set of upstream neighbor neurons of the respective neuron,
neuron activations of the set of upstream neighbor neurons as manifesting themselves upon the application of the artificial neural network (10) onto the set (16) of items (42), and
a sum of preliminarily redistributed relevance scores of the set of downstream neighbor neurons of the respective neuron.
5. Apparatus according to any of claims 1 to 4, wherein the apparatus is configured such that, for each neuron j, the distribution function yielding how much relevance is redistributed as a relevance message R_ij from the respective neuron j to the upstream neighbor neuron i is
R_ij = q(i) · m({R_jk, k is downstream neighbor neuron of j})
where m({R_jk}), with K being the number of downstream neighbors of the respective neuron j, is a function monotonically increasing in all of its components and yields the preliminarily redistributed relevance score R_j = m({R_jk, k is downstream neuron of j}) of the respective neuron j, and q(i) is a function depending on weights w_ij connecting the upstream neighbor neuron i to the respective neuron j, an activation x_i of upstream neighbor neuron i of the respective neuron j as resulting from the application of the artificial neural network (10) onto the set (16) of items (42), and a possibly zero-valued bias term b_j of neuron j.
6. Apparatus according to claim 5, wherein
m({R_jk, k is downstream neuron of j}) = Σ_k R_jk.

7. Apparatus according to claim 5 or 6, wherein the apparatus is configured such that the function q(i) is a function p of weighted activations z_ij = s(x_i, w_ij, b_j), which are computed by a function s, so that q(i) = p({z_ij | i is upstream neighbor neuron of j}).
8. Apparatus according to claim 7, wherein the function s is selected such that the weighted activation z_ij is given as
z_ij = x_i · w_ij,
or
z_ij = x_i · w_ij + b_j / I,
with I being the number of upstream neighbor neurons i of neuron j.
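The two weighted-activation variants above can be written as a small helper; the name `weighted_activations` and the NumPy layout (rows index the neuron j, columns its upstream neighbors i) are our own conventions, not the patent's.

```python
import numpy as np

def weighted_activations(x, W, b=None):
    """z[j, i] = x_i * w_ij (first variant); if a bias vector is given,
    each neuron's bias is shared equally over its I upstream neighbors:
    z[j, i] = x_i * w_ij + b_j / I (second variant)."""
    z = W * x                         # broadcasting: (out, in) * (in,)
    if b is not None:
        z = z + b[:, None] / x.shape[0]
    return z
```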
9. Apparatus according to any of claims 5 to 8, wherein the apparatus is configured such that the function q(i) satisfies, for each neuron j for which R_j > 0, an ordering property,
the ordering property being satisfied if
a) if Σ_i z_ij > 0, then for all i1 and i2 being upstream neighbor neurons of neuron j for which z_i1j < z_i2j
it holds true that q(i1) ≤ q(i2),
b) OR for all i1 and i2 being upstream neighbor neurons of neuron j for which
z_i1j > 0 and z_i2j > 0 and z_i1j < z_i2j,
then it holds true that 0 ≤ q(i1) ≤ q(i2).
10. Apparatus according to any of claims 5 to 8, wherein the apparatus is configured such that the function q(i) satisfies an ordering property,
the ordering property being satisfied if for all i1 and i2 being upstream neighbor neurons of neuron j for which g(z_i1j) ≤ g(z_i2j)
it holds true that |q(i1)| ≤ |q(i2)| for a function g(·) that has its minimum at zero, and that is monotonically decreasing on the interval (−∞, 0) and monotonically increasing on the interval (0, +∞).
11. Apparatus according to claim 10, wherein the apparatus is configured such that the function g(·) is given as follows:
g(z) = α · max(0, z) − β · min(0, z), with α > 0, β > 0.
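The function g above weights positive and negative arguments differently; a direct transcription follows, where the parameter values are arbitrary examples satisfying α > 0, β > 0.

```python
def g(z, alpha=2.0, beta=1.0):
    """g(z) = alpha*max(0, z) - beta*min(0, z): zero at z = 0, decreasing
    on (-inf, 0) and increasing on (0, +inf) whenever alpha, beta > 0."""
    return alpha * max(0.0, z) - beta * min(0.0, z)
```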
12. Apparatus according to any of claims 5 to 11, wherein the apparatus is configured such that the function q(i) inherits or is proportional to the Taylor decomposition of a neural network function of the neurons.
13. Apparatus according to any of claims 5 to 11, wherein the apparatus is configured such that the relevance message R_ij is proportional to the Taylor decomposition of a function that is learnt from data and that maps activations x_i of upstream neighbors i of a neuron j to the value m({R_jk, k is downstream neuron of j}) up to an approximation error.
14. Apparatus according to any of claims 1 to 13, wherein the apparatus is configured such that the distribution function is
R_ij = (x_i · w_ij + b_j / n) / h(Σ_r x_r · w_rj + b_j) · m({R_jk, k is downstream neuron of j})
where n is the number of upstream neighbor neurons of the respective neuron j, R_ij is the relevance message redistributed from the respective neuron j to the upstream neighbor neuron i, R_jk is the relevance message redistributed from the downstream neighbor neuron k to the respective neuron j, x_i is the activation of upstream neighbor neuron i during the application of the neural network onto the set (16) of items (42), w_ij is the weight connecting the upstream neighbor neuron i to the respective neuron j, w_rj is likewise a weight connecting the upstream neighbor neuron r to the respective neuron j, b_j is a bias term of the respective neuron j, and h(·) is a scalar function, and where m({R_jk}), with K being the number of downstream neighbors of the respective neuron j, is a function monotonically increasing in all of its components and yields the preliminarily redistributed relevance score R_j = m({R_jk, k is downstream neuron of j}) of the respective neuron j.
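Combining a distribution function of this form with the stabilizer h(t) = t + ε·sign(t) of claim 17 gives the following sketch, under our assumption that m is the sum of claim 16; `eps_rule_messages` is an illustrative name, not the patent's.

```python
import numpy as np

def h(t, eps=1e-2):
    """Stabilizing function of claim 17: h(t) = t + eps * sign(t)."""
    return t + eps * np.sign(t)

def eps_rule_messages(x, W, b, R_j):
    """Relevance messages R_ij = (x_i*w_ij + b_j/n) / h(sum_r x_r*w_rj + b_j) * R_j.
    Rows of the result index the neuron j, columns its upstream neighbors i."""
    n = x.shape[0]
    z = W * x + b[:, None] / n          # numerator terms, shape (out, in)
    denom = h(x @ W.T + b)              # stabilized sum over upstream neighbors
    return z / denom[:, None] * R_j[:, None]
```

Because h perturbs the denominator by ε, the messages sum only approximately to R_j, which is why claim 22 tolerates a small deviation.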
15. Apparatus according to any of claims 1 to 13, wherein the apparatus is configured such that the distribution onto the set of upstream neighbor neurons i of the respective neuron j is performed using a distribution function, wherein the distribution function is
R_ij = ( α · (x_i · w_ij + b_j/n)_+ / h(Σ_r (x_r · w_rj + b_j/n)_+) − β · (x_i · w_ij + b_j/n)_− / h(Σ_r (x_r · w_rj + b_j/n)_−) ) · m({R_jk, k is downstream neuron of j})
where (z)_+ = max(0, z) and (z)_− = min(0, z), n is the number of upstream neighbor neurons of the respective neuron j, R_ij is the relevance message redistributed from the respective neuron j to the upstream neighbor neuron i, R_jk is the relevance message redistributed from the downstream neighbor neuron k to the respective neuron j, x_i is the activation of upstream neighbor neuron i during the application of the neural network onto the set (16) of items (42), w_ij is the weight connecting the upstream neighbor neuron i to the respective neuron j, w_rj is likewise a weight connecting the upstream neighbor neuron r to the respective neuron j, b_j is a bias term of the respective neuron j, h(·) is a scalar function, α > 0, β > 0, α − β = 1, and m({R_jk}), with K being the number of downstream neighbors of the respective neuron j, is a function monotonically increasing in all of its components and yields the preliminarily redistributed relevance score R_j = m({R_jk, k is downstream neuron of j}) of the respective neuron j.
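A sketch of this style of redistribution, normalizing positive and negative weighted activations separately with α − β = 1; here we replace the stabilizer h by a small additive eps, so this is an approximation of the claimed function, not a transcription.

```python
import numpy as np

def alpha_beta_messages(x, W, b, R_j, alpha=2.0, beta=1.0, eps=1e-9):
    """R_ij = (alpha * z+_ij / sum_r z+_rj - beta * z-_ij / sum_r z-_rj) * R_j
    with z_ij = x_i*w_ij + b_j/n, z+ = max(0, z), z- = min(0, z)."""
    n = x.shape[0]
    z = W * x + b[:, None] / n
    zp, zn = np.maximum(z, 0.0), np.minimum(z, 0.0)
    sp = zp.sum(axis=1, keepdims=True) + eps   # positive normalizer
    sn = zn.sum(axis=1, keepdims=True) - eps   # negative normalizer
    return (alpha * zp / sp - beta * zn / sn) * R_j[:, None]
```

Since the positive ratios sum to 1 and so do the negative ones, each neuron's messages sum to (α − β)·R_j = R_j, i.e. the relevance conservation property of claim 2.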
16. Apparatus according to claim 14 or 15, wherein m({R_jk, k is downstream neuron of j}) = Σ_k R_jk.
17. Apparatus according to any of claims 14 to 16, wherein h(·) is a stabilizing function h(t) = t + ε · sign(t).

18. Apparatus according to any of claims 1 to 17, wherein the apparatus is configured to compute, for each item i, the relevance score R_i of the respective item i by summing up the relevance messages, redistributed to the respective item, of neurons having the respective item as upstream neighbor neuron.
19. Apparatus according to any of claims 1 to 18, wherein the artificial neural network is directly applied onto the set of items so that the items of the set (16) of items (42) form upstream neighbors for a subset of the artificial neurons of the artificial neural network, and the network output corresponds to a neuron activation of a neuron at a downstream end of the artificial neural network.

19. Apparatus according to any of the previous claims wherein the network output (18) is a scalar with the initial relevance score derived therefrom equaling, or being derived by applying a monotonically increasing function onto, a value of the scalar, or the network output is a vector with the initial relevance value equaling, or being derived by applying a monotonically increasing function onto, a value of one or more components of the vector.
20. Apparatus according to any of claims 1 to 19, wherein the apparatus is configured to perform the reverse propagation so that 0.95·R ≤ f(Σ_i R_i) ≤ 1.05·R, with Σ_i R_i denoting the sum over the relevance scores of all items i of the set (16) of items (42) and f being a monotonic function solely depending on Σ_i R_i.

21. Apparatus according to claim 20, wherein the apparatus is configured such that f is an identity function.
22. Apparatus according to any of claims 1 to 21, wherein the apparatus is configured such that, for each neuron, a sum of relevance message values distributed to the set of upstream neighbor neurons of the respective neuron by the distribution function equals ξ(S_N) or deviates therefrom by no more than 5%, with S_N denoting the sum of relevance messages from the set of downstream neighbor neurons of the respective neuron to the respective neuron and ξ denoting a monotonic function solely depending on S_N.
23. Apparatus according to claim 22, wherein the apparatus is configured such that ξ is an identity function.
24. Apparatus according to any of claims 1 to 23, wherein the artificial neural network is layered so that each neuron (12) belongs to one of a sequence of layers and the apparatus is configured to perform the reverse propagation using one and the same distribution function for all neurons of the artificial neural network.
25. Apparatus according to any of claims 1 to 23, wherein the artificial neural network is layered so that each neuron (12) belongs to one of a sequence of layers and the apparatus is configured to perform the reverse propagation so that, for each layer, a sum of relevance message values distributed to the neurons of the respective layer equals ξ(S_L) or deviates therefrom by no more than 5%, with S_L denoting the sum of preliminarily redistributed relevance scores of the neurons of a layer downstream of the respective layer and ξ denoting a monotonic function solely depending on S_L.
26. Apparatus according to any of the previous claims wherein the set (16)
of items is, or is a combination of,
a picture with each of the items (42) of the set (16) of items (42) corresponding to one or more of the pixels or subpixels of the picture, and/or
a video with each of the items (42) of the set (16) of items (42) corresponding to one or more pixels or subpixels of pictures of the video, to pictures of the video, or to picture sequences of the video, and/or
an audio signal with each item (42) of the set (16) of items (42) corresponding to one or more audio samples of the audio signal, and/or
a feature map of local features or a transform locally or globally extracted from a picture, video or audio signal with the items (42) of the set (16) of items (42) corresponding to local features, and/or
a text with the items (42) of the set (16) of items (42) corresponding to words, sentences or paragraphs of the text, and/or
a graph such as a social network relations graph with the items (42) of the set (16) of items (42) corresponding to nodes or edges or sets of nodes or a set of edges or subgraphs.
27. System (100) for data processing, comprising
an apparatus (50) for assigning a relevance score to a set of items according to any of the previous claims, and
an apparatus (102) for processing of the set (16) of items or data to be processed (106) and derived from the set of items with adapting the processing depending on the relevance scores.

28. System according to claim 27, wherein the processing is a lossy processing and the apparatus for processing is configured to decrease a lossiness of the lossy processing for items having higher relevance scores assigned therewith than compared to items having lower relevance scores assigned therewith.
29. System according to claim 27, wherein the processing is a visualizing wherein the apparatus for adapting is configured to perform a highlighting in the visualization depending on the relevance scores.
30. System according to claim 27, wherein the processing is a data replenishment by reading from memory or performing a further measurement wherein the apparatus (102) for processing is configured to focus the data replenishment depending on the relevance scores.
31. System (110) for highlighting a region of interest, comprising
an apparatus (50) for assigning a relevance score to a set of items according to any of claims 1 to 25, and
an apparatus (112) for generating a relevance graph (114) depending on the relevance scores.
32. System (120) for optimizing a neural network, comprising
an apparatus (50) for assigning a relevance score to a set of items according to any of claims 1 to 26;
an apparatus (122) for applying the apparatus for assigning onto a plurality of different sets of items; and
an apparatus (124) for detecting a portion of increased relevance (128) within the neural network by accumulating relevances assigned to the neurons of the network during the application of the

apparatus for assigning onto the plurality of different sets of items, and optimizing the artificial neural network depending on the portion of increased relevance.
33. Method for assigning a relevance score to a set of items, the relevance score indicating a relevance with respect to an application of an artificial neural network (10) composed of neurons (12) onto the set (16) of items (42) so as to map the set (16) of items (42) onto a network output (18), the method comprising:
redistributing an initial relevance score (R) derived from the network output (18) onto the set (16) of items (42) by reversely propagating the initial relevance score through the artificial neural network (10) so as to obtain a relevance score for each item,
wherein the reverse propagation is performed in a manner so that for each neuron, preliminarily redistributed relevance scores of a set of downstream neighbor neurons of the respective neuron are distributed to a set of upstream neighbor neurons of the respective neuron using a distribution function.
34. Computer program having a program code for performing, when
running on a computer, a method according to claim 33.

Documents

Orders

Section Controller Decision Date
15 and 43(1) Shraddha Turkar 2024-03-22

Application Documents

# Name Date
1 201747033019-IntimationOfGrant22-03-2024.pdf 2024-03-22
2 201747033019-PatentCertificate22-03-2024.pdf 2024-03-22
3 201747033019-Written submissions and relevant documents [23-02-2024(online)].pdf 2024-02-23
4 201747033019-Correspondence to notify the Controller [08-02-2024(online)].pdf 2024-02-08
5 201747033019-FORM-26 [08-02-2024(online)].pdf 2024-02-08
6 201747033019-Information under section 8(2) [18-01-2024(online)].pdf 2024-01-18
7 201747033019-US(14)-HearingNotice-(HearingDate-09-02-2024).pdf 2024-01-10
8 201747033019-FORM 3 [08-11-2023(online)].pdf 2023-11-08
9 201747033019-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [02-11-2023(online)].pdf 2023-11-02
10 201747033019-US(14)-HearingNotice-(HearingDate-03-11-2023).pdf 2023-10-25
11 201747033019-FORM 3 [09-10-2023(online)].pdf 2023-10-09
12 201747033019-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [13-08-2023(online)].pdf 2023-08-13
13 201747033019-US(14)-HearingNotice-(HearingDate-14-08-2023).pdf 2023-07-12
14 201747033019-FORM 3 [18-04-2023(online)].pdf 2023-04-18
15 201747033019-FORM 3 [17-10-2022(online)].pdf 2022-10-17
16 201747033019-FORM 3 [01-04-2022(online)].pdf 2022-04-01
17 201747033019-Information under section 8(2) [09-03-2022(online)].pdf 2022-03-09
18 201747033019-Information under section 8(2) [21-01-2022(online)].pdf 2022-01-21
19 201747033019-FORM 3 [06-10-2021(online)].pdf 2021-10-06
20 201747033019-Information under section 8(2) [21-05-2021(online)].pdf 2021-05-21
21 201747033019-FORM 3 [02-04-2021(online)].pdf 2021-04-02
22 201747033019-2. Marked Copy under Rule 14(2) [14-01-2021(online)].pdf 2021-01-14
23 201747033019-AMMENDED DOCUMENTS [14-01-2021(online)].pdf 2021-01-14
24 201747033019-FORM 13 [14-01-2021(online)].pdf 2021-01-14
25 201747033019-MARKED COPIES OF AMENDEMENTS [14-01-2021(online)].pdf 2021-01-14
26 201747033019-Retyped Pages under Rule 14(1) [14-01-2021(online)].pdf 2021-01-14
27 201747033019-ABSTRACT [13-01-2021(online)].pdf 2021-01-13
28 201747033019-CLAIMS [13-01-2021(online)].pdf 2021-01-13
29 201747033019-COMPLETE SPECIFICATION [13-01-2021(online)].pdf 2021-01-13
30 201747033019-DRAWING [13-01-2021(online)].pdf 2021-01-13
31 201747033019-FER_SER_REPLY [13-01-2021(online)].pdf 2021-01-13
32 201747033019-FORM 3 [13-10-2020(online)].pdf 2020-10-13
33 201747033019-Information under section 8(2) [13-10-2020(online)].pdf 2020-10-13
34 201747033019-FER.pdf 2020-07-14
35 201747033019-Information under section 8(2) [02-07-2020(online)].pdf 2020-07-02
36 201747033019-FORM 3 [19-02-2020(online)].pdf 2020-02-19
37 201747033019-Information under section 8(2) (MANDATORY) [23-01-2020(online)].pdf 2020-01-23
38 201747033019-Information under section 8(2) (MANDATORY) [14-11-2019(online)].pdf 2019-11-14
39 201747033019-FORM 3 [01-10-2019(online)].pdf 2019-10-01
40 201747033019-FORM 3 [19-08-2019(online)].pdf 2019-08-19
41 201747033019-FORM 3 [04-07-2019(online)].pdf 2019-07-04
42 201747033019-Information under section 8(2) (MANDATORY) [04-07-2019(online)].pdf 2019-07-04
43 201747033019-FORM 3 [15-04-2019(online)].pdf 2019-04-15
44 201747033019-FORM 3 [21-02-2019(online)].pdf 2019-02-21
45 201747033019-FORM 3 [17-08-2018(online)].pdf 2018-08-17
46 Correspondence by Agent_Form26_16-04-2018.pdf 2018-04-16
47 201747033019-FORM-26 [04-04-2018(online)].pdf 2018-04-04
48 201747033019-FORM-26 [04-04-2018(online)]_52.pdf 2018-04-04
49 201747033019-Proof of Right (MANDATORY) [04-04-2018(online)].pdf 2018-04-04
50 201747033019-FORM 3 [16-02-2018(online)].pdf 2018-02-16
51 abstract 201747033019.jpg 2017-09-22
52 201747033019.pdf 2017-09-21
53 201747033019-COMPLETE SPECIFICATION [18-09-2017(online)].pdf 2017-09-18
54 201747033019-DECLARATION OF INVENTORSHIP (FORM 5) [18-09-2017(online)].pdf 2017-09-18
55 201747033019-DRAWINGS [18-09-2017(online)].pdf 2017-09-18
56 201747033019-FORM 1 [18-09-2017(online)].pdf 2017-09-18
57 201747033019-FORM 18 [18-09-2017(online)].pdf 2017-09-18
58 201747033019-REQUEST FOR EXAMINATION (FORM-18) [18-09-2017(online)].pdf 2017-09-18
59 201747033019-STATEMENT OF UNDERTAKING (FORM 3) [18-09-2017(online)].pdf 2017-09-18

Search Strategy

1 AMENDED201747033019AE_18-08-2021.pdf
2 Search201747033019E_14-07-2020.pdf

ERegister / Renewals

3rd: 11 Jun 2024

From 20/03/2017 - To 20/03/2018

4th: 11 Jun 2024

From 20/03/2018 - To 20/03/2019

5th: 11 Jun 2024

From 20/03/2019 - To 20/03/2020

6th: 11 Jun 2024

From 20/03/2020 - To 20/03/2021

7th: 11 Jun 2024

From 20/03/2021 - To 20/03/2022

8th: 11 Jun 2024

From 20/03/2022 - To 20/03/2023

9th: 11 Jun 2024

From 20/03/2023 - To 20/03/2024

10th: 11 Jun 2024

From 20/03/2024 - To 20/03/2025

11th: 25 Feb 2025

From 20/03/2025 - To 20/03/2026