
System And Method For Deep Network Based Glaucoma Prediction

Abstract: A deep-network based system and method to detect, segment and classify the optic disc and optic cup region in a retinal fundus image for glaucoma prediction based on the values of the vertical CDR, horizontal CDR, inferior thickness, superior thickness, nasal thickness, and temporal thickness. The deep network 100 comprises a feature encoder 101 to process the input fundus image and generate output feature maps; a region of interest (ROI) or proposals extractor 102 receiving the generated feature maps as input; and a region of interest (ROI) processor 103 consisting of three prediction heads: a ROI classification head, a ROI co-ordinate prediction head and a ROI mask prediction head.


Patent Information

Application #: 201941037889
Filing Date: 19 September 2019
Publication Number: 31/2020
Publication Type: INA
Invention Field: BIO-MEDICAL ENGINEERING
Status:
Email: kraji@artelus.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2022-03-29
Renewal Date:

Applicants

ARTIFICIAL LEARNING SYSTEMS INDIA PVT LTD
1665/A, 14th Main Rd, Sector 7, HSR Layout, Bengaluru, Karnataka 560102, India.

Inventors

1. Mrinal Haloi
C/O: Kanak Ch. Haloi, HN: 01, Pashim Barpit, Village Bhojkuchi, PO: Haribhanga, District Nalbari, Assam 781378

Specification

Claims:

We claim:
1. A deep network 100 for glaucoma prediction by processing an input retinal fundus image of a patient comprising:

a feature encoder 101, the said feature encoder 101 being a convolutional encoder to process the input fundus image to generate output feature maps;

a region of interest (ROI) or proposals extractor 102, the said proposals extractor 102 being one of a deep convolutional network or a computer vision-based image processing algorithm and receiving the generated feature maps as input; and

a region of interest (ROI) processor 103 further consisting of three prediction heads:
a ROI classification head;
a ROI co-ordinate prediction head; and
a ROI mask prediction head.

2. The feature encoder 101 of the deep network 100 as claimed in claim 1 being a convolutional encoder taking an input fundus image and processing it to get a rich representation of the input image;

the said convolutional encoder further consisting of N blocks of convolutional layers;

wherein a single convolutional block consists of one or more convolutional layers of filter size 3x3 with stride 1 or more.
3. The proposals extractor 102 of the deep network 100 as claimed in claim 1, receiving as input the output feature maps generated by the feature encoder 101 from the input fundus images, comprising:

a base convolutional layer of filter size 3x3 with stride 1;

an objectness prediction convolutional layer of filter size 3x3 with stride 1; and

a bounding box calculator convolutional layer with filter size 3x3 with stride 1;

wherein the output of the base convolutional layer is fed to the objectness prediction layer to get proposals objectness scores and to the bounding box calculator layer to get proposals coordinates with respect to the reference proposals.

4. The reference proposals generator 300 employed by the deep network 100 generates reference proposals (RP) from one encoded feature map using its width and height as input to the reference proposals generator 300;

wherein 9 different RPs are generated for a point in the feature map using 3 scales S1, S2 and S3 and 3 ratios AR1, AR2 and AR3.

5. The region of interest (ROI) processor 103 of the deep network 100 as claimed in claim 1 consists of three prediction heads;

wherein, for the ROI classification and ROI co-ordinate prediction heads, the input ROI is first processed using a base fully connected layer 402 and the output of this step is passed through two separate fully connected layers 402a and 402b to obtain the ROI’s class and co-ordinates; and

wherein, for the mask prediction head, ROI is passed through two transposed convolutional layers 403a and 403b of filter size 2x2 with stride 2 to get the output mask.

6. A deep-network based glaucoma prediction method for predicting probability of glaucoma by processing input retinal fundus image, said method comprising:
processing of input retinal fundus images using a suitable algorithm to resize them to one of 512x512x3, 1024x1024x3, and 2048x2048x3;

feeding the input retinal fundus image to the deep network 100 to get the region of the optic disc and the optic cup;

calculating the vertical cup to disc ratio (vertical CDR) and horizontal cup to disc ratio (horizontal CDR) using the output optic disc and cup regions;

generating a thickness map 501 of the region between the optic cup and the optic disc using an area calculation algorithm;

calculating the thickness of the inferior, superior, nasal and temporal region of the optic disc from the thickness map 501; and

predicting the probability of glaucoma based on the values of vertical CDR, horizontal CDR, inferior thickness, superior thickness, nasal thickness, and temporal thickness.

7. A method as claimed in claim 6 further comprising:

training the deep network 100, consisting of the feature encoder 101, proposals extractor 102 and ROI processor 103, on an annotated dataset for glaucoma prediction using, but not limited to, the Nesterov momentum optimizer;

the said dataset comprising retinal fundus images with the optic disc in the field of view, each input image annotated for the regions of interest.

Description:

Brief description of the drawings

[0009] The present invention is described with reference to the accompanying figures. The accompanying figures, which are incorporated herein, are given by way of illustration only and form part of the specification, together with the description, to explain how to make and use the invention, in which,

[0010] Figure 1 illustrates a deep network for glaucoma prediction by processing an input retinal fundus image of a patient in accordance with the invention;

[0011] Figure 2 exemplarily illustrates a feature encoder of the deep network to process the input retinal fundus image of the patient;

[0012] Figure 3 exemplarily illustrates a reference proposals generator employed by the deep network;

[0013] Figure 4 exemplarily illustrates a region of interest (ROI) processor employed by the deep network; and

[0014] Figure 5 exemplarily illustrates a deep-network glaucoma prediction method in accordance with the invention.

Detailed description of the invention

[0015] Figure 1 illustrates a deep network 100 for glaucoma prediction by processing an input retinal fundus image of a patient in accordance with the invention. The end-to-end deep network 100 is used to detect, segment and classify the optic disc and optic cup region in a retinal fundus image for glaucoma prediction. The deep network 100 comprises a feature encoder 101, the feature encoder 101 being a convolutional encoder to process the input fundus image and generate output feature maps; a region of interest (ROI) or proposals extractor 102, the proposals extractor 102 being one of a deep convolutional network or a computer vision-based image processing algorithm and receiving the generated feature maps as input; and a region of interest (ROI) processor 103, the ROI processor consisting of three prediction heads: a ROI classification head, a ROI co-ordinate prediction head and a ROI mask prediction head.

[0016] Training of the deep network
The deep network 100, consisting of the feature encoder 101, proposals extractor 102 and ROI processor 103, is trained end-to-end on an annotated dataset for glaucoma prediction using, but not limited to, the Nesterov momentum optimizer.
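
For illustration only, a minimal training-loop sketch is given below, assuming PyTorch as the framework; the deep_network model, the detection_loss combining the objectness, co-ordinate and mask objectives, and the data_loader of annotated fundus images are hypothetical placeholders, and the learning-rate and momentum values are not taken from the specification.

import torch

def train(deep_network, detection_loss, data_loader, epochs=10):
    # Nesterov momentum optimizer, as mentioned in the specification.
    optimizer = torch.optim.SGD(deep_network.parameters(),
                                lr=0.01, momentum=0.9, nesterov=True)
    deep_network.train()
    for _ in range(epochs):
        for images, annotations in data_loader:
            optimizer.zero_grad()
            outputs = deep_network(images)            # proposals, classes, boxes, masks
            loss = detection_loss(outputs, annotations)
            loss.backward()
            optimizer.step()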

[0017] Dataset for training of deep network
Retinal fundus images with the optic disc in the field of view are included in the dataset. Each of the input images to the deep network 100 is annotated for the regions of interest. The optic disc and the optic cup region are marked separately to create regions of interest for training the deep network 100.

[0018] Processing of input retinal fundus images
The input retinal fundus images are processed using an algorithm such as the one disclosed in patent application no. PCT/IN2018/050682, titled A SYSTEM AND METHOD FOR DETECTION AND CLASSIFICATION OF RETINAL DISEASE. Input images are resized to a size of 1024 x 1024 x 3.
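
A minimal sketch of the resizing step only, assuming OpenCV; the preprocessing of PCT/IN2018/050682 itself is not reproduced here.

import cv2

def preprocess_fundus_image(path, size=1024):
    """Load a retinal fundus image and resize it to size x size x 3."""
    image = cv2.imread(path)                  # H x W x 3 (BGR)
    image = cv2.resize(image, (size, size), interpolation=cv2.INTER_LINEAR)
    return image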

[0019] Figure 2 exemplarily illustrates a feature encoder 101 of the deep network 100 to process the input retinal fundus image of the patient. In an embodiment, the feature encoder 101 may be a convolutional encoder as shown in the figure. The feature encoder 101 takes an input fundus image and processes it using a convolutional encoder to get a rich representation of the input image.

[0020] Initial parameters of the feature encoder 101 are initialized from a classification network of the same architecture. The classification network is trained using a classification loss function with an input dataset of the same domain.
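
A hedged sketch of one way such initialization could be done, assuming PyTorch state_dicts; the checkpoint layout is an assumption rather than something fixed by the specification.

import torch

def init_encoder_from_classifier(feature_encoder, classifier_checkpoint_path):
    """Copy matching weights from a pre-trained classification network of the
    same architecture into the feature encoder 101."""
    classifier_state = torch.load(classifier_checkpoint_path, map_location="cpu")
    encoder_state = feature_encoder.state_dict()
    # Keep only parameters whose names and shapes match the encoder.
    matched = {k: v for k, v in classifier_state.items()
               if k in encoder_state and v.shape == encoder_state[k].shape}
    encoder_state.update(matched)
    feature_encoder.load_state_dict(encoder_state)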

[0021] The convolutional feature encoder consists of many blocks of convolutional layers CONV BLOCK n, where n can be a number from 1 to N as shown in the figure. A single convolutional block CONV BLOCK ‘n’, where n can be any number from 1 to N, can have one or more convolutional layers of filter size 3x3 with stride 1 or more.
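
As an illustration of such a block structure, a minimal PyTorch sketch follows; the number of blocks, channel widths, downsampling strides and the ReLU activations are assumptions, since the specification only fixes the 3x3 filter size and a stride of 1 or more.

import torch.nn as nn

class ConvBlock(nn.Module):
    """One CONV BLOCK n: a stack of 3x3 convolutions with stride 1 or more."""
    def __init__(self, in_channels, out_channels, num_layers=2, stride=1):
        super().__init__()
        layers = []
        for i in range(num_layers):
            layers.append(nn.Conv2d(in_channels if i == 0 else out_channels,
                                    out_channels, kernel_size=3,
                                    stride=stride if i == 0 else 1, padding=1))
            layers.append(nn.ReLU(inplace=True))
        self.block = nn.Sequential(*layers)

    def forward(self, x):
        return self.block(x)

class FeatureEncoder(nn.Module):
    """Feature encoder 101 as N stacked convolutional blocks."""
    def __init__(self, num_blocks=4, base_channels=64):
        super().__init__()
        blocks, in_ch = [], 3
        for n in range(num_blocks):
            out_ch = base_channels * (2 ** n)
            blocks.append(ConvBlock(in_ch, out_ch, stride=2))  # downsample per block
            in_ch = out_ch
        self.blocks = nn.Sequential(*blocks)

    def forward(self, x):
        return self.blocks(x)   # output feature maps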

[0022] The output feature maps generated by the feature encoder 101 by processing the input fundus images are used as input to the proposals extractor 102. The proposals extractor 102 can be one of a deep convolutional network or computer vision-based image processing algorithm. The proposals extractor 102 consists of a base convolutional layer of filter size 3x3 with stride 1, an objectness prediction convolutional layer of filter size 3x3 with stride 1 and a bounding box calculator convolutional layer with filter size 3x3 with stride 1.

[0023] The output of the base convolutional layer is fed to the objectness prediction layer to get proposals objectness scores and to the bounding box calculator layer to get proposals coordinates with respect to the reference proposals.
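
A minimal PyTorch sketch of this three-layer arrangement is given below; the channel count and the ReLU activation are assumptions, while the 9 reference proposals per location follow paragraph [0025].

import torch.nn as nn

class ProposalsExtractor(nn.Module):
    """Proposals extractor 102: a base 3x3 convolution whose output feeds an
    objectness prediction layer and a bounding-box calculator layer, all 3x3
    convolutions with stride 1."""
    def __init__(self, in_channels, num_reference_proposals=9):
        super().__init__()
        self.base = nn.Conv2d(in_channels, in_channels, 3, stride=1, padding=1)
        self.relu = nn.ReLU(inplace=True)
        # One objectness score per reference proposal at each feature-map point.
        self.objectness = nn.Conv2d(in_channels, num_reference_proposals,
                                    3, stride=1, padding=1)
        # Four co-ordinate offsets per reference proposal.
        self.bbox = nn.Conv2d(in_channels, 4 * num_reference_proposals,
                              3, stride=1, padding=1)

    def forward(self, feature_map):
        base = self.relu(self.base(feature_map))
        return self.objectness(base), self.bbox(base)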

[0024] Figure 3 exemplarily illustrates a reference proposals generator 300 employed by the deep network 100. Reference proposals (RP) are generated from one encoded feature map using its width and height as input to the reference proposals generator 300.

[0025] For a point in the feature map, 9 different RPs with that point as a center are generated using 3 scales S1, S2 and S3 and 3 ratios AR1, AR2 and AR3 as shown in the figure. Center points for RPs are chosen at an interval of 16 points along the x-axis and y-axis. Proposals passing the objectness threshold are processed using a non-maximal suppression algorithm to remove duplicate and overlapping proposals.
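
A small NumPy sketch of the reference-proposal generation follows; the numeric values chosen for the scales S1-S3 and ratios AR1-AR3 are placeholders, since the specification does not fix them, and the non-maximal suppression step is omitted.

import numpy as np

def generate_reference_proposals(feat_width, feat_height, stride=16,
                                 scales=(64, 128, 256), ratios=(0.5, 1.0, 2.0)):
    """Generate 9 reference proposals (3 scales x 3 aspect ratios) centred on
    feature-map points spaced `stride` pixels apart."""
    proposals = []
    for cy in range(0, feat_height * stride, stride):
        for cx in range(0, feat_width * stride, stride):
            for s in scales:
                for ar in ratios:
                    w, h = s * np.sqrt(ar), s / np.sqrt(ar)
                    proposals.append([cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2])
    return np.array(proposals)   # (feat_w * feat_h * 9, 4) boxes as (x1, y1, x2, y2)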

[0026] Figure 4 exemplarily illustrates a region of interest (ROI) processor 103 employed by the deep network 100. The ROI processor 103 consists of three prediction heads: ROI classification head; ROI co-ordinate prediction head and ROI mask prediction head.

[0027] All ROIs are aligned using an ROI align layer 401. The ROI align layer 401 uses bilinear interpolation and max-pooling to resize all ROIs to a specified size.

[0028] For ROI classification and ROI co-ordinate prediction, the input ROI is first processed using a base fully connected layer 402; the output of this step is passed through two separate fully connected layers 402a and 402b to obtain the ROI’s class and co-ordinates.

[0029] For the mask prediction, the ROI is passed through two transposed convolutional layers 403a and 403b of filter size 2x2 with stride 2 to get the output mask.
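
A minimal PyTorch sketch of the ROI processor heads is shown below; torchvision's roi_align is used as a stand-in for the ROI align layer 401, and the ROI output size, hidden width and class count are assumptions.

import torch
import torch.nn as nn
from torchvision.ops import roi_align

class ROIProcessor(nn.Module):
    """ROI processor 103 with classification, co-ordinate and mask heads."""
    def __init__(self, in_channels=256, roi_size=7, num_classes=3, hidden=1024):
        super().__init__()
        self.roi_size = roi_size
        # Base fully connected layer 402, then heads 402a (class) and 402b (box).
        self.base_fc = nn.Linear(in_channels * roi_size * roi_size, hidden)
        self.cls_fc = nn.Linear(hidden, num_classes)
        self.box_fc = nn.Linear(hidden, 4 * num_classes)
        # Mask head: two 2x2 transposed convolutions with stride 2 (403a, 403b).
        self.mask_head = nn.Sequential(
            nn.ConvTranspose2d(in_channels, in_channels, 2, stride=2),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(in_channels, num_classes, 2, stride=2),
        )

    def forward(self, feature_map, boxes, spatial_scale=1.0 / 16):
        # boxes: list of (N, 4) tensors in image co-ordinates, one per batch image.
        rois = roi_align(feature_map, boxes,
                         (self.roi_size, self.roi_size), spatial_scale=spatial_scale)
        flat = torch.relu(self.base_fc(rois.flatten(1)))
        return self.cls_fc(flat), self.box_fc(flat), self.mask_head(rois)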

[0030] Figure 5 exemplarily illustrates a deep-network glaucoma prediction method 500 in accordance with the invention. Each input image is processed by an input processing method using an algorithm such as the one disclosed in patent application no. PCT/IN2018/050682, titled A SYSTEM AND METHOD FOR DETECTION AND CLASSIFICATION OF RETINAL DISEASE.

[0031] During inference, a multi-scale input is used. A single image is converted to 3 images of different sizes: 512x512x3, 1024x1024x3, and 2048x2048x3. Each of the input images is fed to the deep network 100 separately and the outputs are combined using a voting mechanism to make the final prediction.
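
A hedged sketch of the multi-scale voting, assuming the network returns a single discrete prediction per scale; the exact voting rule is not detailed in the specification.

from collections import Counter

import cv2

def predict_multiscale(deep_network, image, sizes=(512, 1024, 2048)):
    """Feed 3 resized copies of the image to the network and combine the
    per-scale predictions by majority vote."""
    per_scale = []
    for s in sizes:
        resized = cv2.resize(image, (s, s))        # s x s x 3 input
        per_scale.append(deep_network(resized))    # assumed hashable prediction
    return Counter(per_scale).most_common(1)[0][0]  # majority vote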

[0032] Glaucoma prediction
The input retinal fundus image is fed to the deep network 100 to get the regions of the optic disc and the optic cup. Using the output optic disc and cup regions, the vertical cup to disc ratio (vertical CDR) and the horizontal cup to disc ratio (horizontal CDR) are calculated.
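
The CDR calculation can be sketched directly from binary disc and cup masks, taking CDR as the ratio of cup to disc extent along each axis; this is a reasonable reading of the specification, though the exact measurement convention is not stated.

import numpy as np

def cup_to_disc_ratios(disc_mask, cup_mask):
    """Vertical and horizontal CDR from aligned, same-sized binary masks."""
    def extents(mask):
        ys, xs = np.nonzero(mask)
        return ys.max() - ys.min() + 1, xs.max() - xs.min() + 1   # height, width
    disc_h, disc_w = extents(disc_mask)
    cup_h, cup_w = extents(cup_mask)
    return cup_h / disc_h, cup_w / disc_w   # vertical CDR, horizontal CDR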

[0033] A thickness map 501 of the region between the optic cup and the optic disc is generated using an area calculation algorithm. From the thickness map 501, we calculate the thickness of the inferior, superior, nasal and temporal region of the optic disc.
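
The area calculation algorithm is not specified; the following NumPy sketch shows one possible radial measurement of rim thickness per quadrant, with the quadrant-to-angle mapping assumed for a right-eye image.

import numpy as np

def rim_thickness_by_quadrant(disc_mask, cup_mask, angle_step=1):
    """Average disc-cup rim thickness in the inferior, superior, nasal and
    temporal quadrants, measured radially from the disc centre.

    Assumes a right-eye orientation (nasal side toward the image right);
    swap 'nasal' and 'temporal' for a left-eye image.
    """
    ys, xs = np.nonzero(disc_mask)
    cy, cx = ys.mean(), xs.mean()

    def boundary_radius(mask, theta):
        # Step outward along the ray from the disc centre until leaving the mask.
        r = 0.0
        while True:
            y = int(round(cy + r * np.sin(theta)))
            x = int(round(cx + r * np.cos(theta)))
            if not (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]) or not mask[y, x]:
                return r
            r += 1.0

    thickness = {"nasal": [], "inferior": [], "temporal": [], "superior": []}
    for deg in range(0, 360, angle_step):
        theta = np.deg2rad(deg)
        rim = boundary_radius(disc_mask, theta) - boundary_radius(cup_mask, theta)
        if deg < 45 or deg >= 315:
            quadrant = "nasal"        # +x direction, image right
        elif deg < 135:
            quadrant = "inferior"     # +y direction, image bottom
        elif deg < 225:
            quadrant = "temporal"
        else:
            quadrant = "superior"
        thickness[quadrant].append(rim)
    return {q: float(np.mean(v)) for q, v in thickness.items()}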

[0034] Based on the values of vertical CDR, horizontal CDR, inferior thickness, superior thickness, nasal thickness, and temporal thickness, the probability of glaucoma is predicted.
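
The specification does not state how the six values are combined into a probability; purely for illustration, the sketch below uses a logistic model with placeholder weights.

import numpy as np

def glaucoma_probability(features, weights=None, bias=0.0):
    """Combine vertical CDR, horizontal CDR and the four quadrant thicknesses
    into a probability. The weights here are placeholders, not values from
    the specification."""
    x = np.array([features["vertical_cdr"], features["horizontal_cdr"],
                  features["inferior"], features["superior"],
                  features["nasal"], features["temporal"]], dtype=float)
    w = np.array(weights if weights is not None
                 else [2.0, 2.0, -0.01, -0.01, -0.01, -0.01])
    return 1.0 / (1.0 + np.exp(-(w @ x + bias)))   # sigmoid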

[0035] The foregoing examples have been provided merely for the purpose of explanation and do not limit the present invention disclosed herein. While the invention has been described with reference to various embodiments, it is understood that the words are used for illustration and are not limiting. Those skilled in the art may effect numerous modifications thereto, and changes may be made, without departing from the scope and spirit of the invention in its aspects.

Documents

Application Documents

# Name Date
1 201941037889-FORM 13 [27-02-2025(online)].pdf 2025-02-27
2 201941037889-FORM-26 [09-09-2024(online)].pdf 2024-09-09
3 201941037889-FORM-27 [19-10-2024(online)].pdf 2024-10-19
4 201941037889-STATEMENT OF UNDERTAKING (FORM 3) [19-09-2019(online)].pdf 2019-09-19
5 201941037889-Correspondence to notify the Controller [21-02-2025(online)].pdf 2025-02-21
6 201941037889-FORM FOR SMALL ENTITY(FORM-28) [19-09-2019(online)].pdf 2019-09-19
7 201941037889-RELEVANT DOCUMENTS [13-03-2023(online)].pdf 2023-03-13
8 201941037889-FORM 1 [19-09-2019(online)].pdf 2019-09-19
9 201941037889-RELEVANT DOCUMENTS [24-05-2022(online)].pdf 2022-05-24
10 201941037889-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [19-09-2019(online)].pdf 2019-09-19
11 201941037889-IntimationOfGrant29-03-2022.pdf 2022-03-29
12 201941037889-PatentCertificate29-03-2022.pdf 2022-03-29
13 201941037889-EVIDENCE FOR REGISTRATION UNDER SSI [19-09-2019(online)].pdf 2019-09-19
14 201941037889-Written submissions and relevant documents [23-12-2021(online)].pdf 2021-12-23
15 201941037889-DRAWINGS [19-09-2019(online)].pdf 2019-09-19
16 201941037889-DECLARATION OF INVENTORSHIP (FORM 5) [19-09-2019(online)].pdf 2019-09-19
17 201941037889-US(14)-HearingNotice-(HearingDate-08-12-2021).pdf 2021-11-12
18 201941037889-ABSTRACT [05-02-2021(online)]-1.pdf 2021-02-05
19 201941037889-COMPLETE SPECIFICATION [19-09-2019(online)].pdf 2019-09-19
20 201941037889-ABSTRACT [05-02-2021(online)]-2.pdf 2021-02-05
21 201941037889-FORM-9 [17-07-2020(online)].pdf 2020-07-17
22 201941037889-ABSTRACT [05-02-2021(online)].pdf 2021-02-05
23 201941037889-STARTUP [18-07-2020(online)].pdf 2020-07-18
24 201941037889-CLAIMS [05-02-2021(online)]-1.pdf 2021-02-05
25 201941037889-FORM28 [18-07-2020(online)].pdf 2020-07-18
26 201941037889-CLAIMS [05-02-2021(online)]-2.pdf 2021-02-05
27 201941037889-FORM 18A [18-07-2020(online)].pdf 2020-07-18
28 201941037889-FER.pdf 2020-08-04
29 201941037889-CLAIMS [05-02-2021(online)].pdf 2021-02-05
30 201941037889-CORRESPONDENCE [05-02-2021(online)].pdf 2021-02-05
31 201941037889-FORM-26 [11-12-2020(online)].pdf 2020-12-11
32 201941037889-Covering Letter [05-02-2021(online)].pdf 2021-02-05
33 201941037889-FORM 3 [24-12-2020(online)].pdf 2020-12-24
34 201941037889-FER_SER_REPLY [05-02-2021(online)]-1.pdf 2021-02-05
35 201941037889-Request Letter-Correspondence [02-01-2021(online)].pdf 2021-01-02
36 201941037889-FER_SER_REPLY [05-02-2021(online)]-2.pdf 2021-02-05
37 201941037889-Power of Attorney [02-01-2021(online)].pdf 2021-01-02
38 201941037889-FER_SER_REPLY [05-02-2021(online)].pdf 2021-02-05
39 201941037889-FORM28 [02-01-2021(online)].pdf 2021-01-02
40 201941037889-OTHERS [05-02-2021(online)]-1.pdf 2021-02-05
41 201941037889-Power of Authority [05-02-2021(online)].pdf 2021-02-05
42 201941037889-OTHERS [05-02-2021(online)].pdf 2021-02-05
43 201941037889-PETITION u-r 6(6) [05-02-2021(online)].pdf 2021-02-05
44 201941037889-FORM-27 [18-09-2025(online)].pdf 2025-09-18

Search Strategy

1 2021-02-0511-47-48AE_05-02-2021.pdf
2 2020-08-0409-50-06E_04-08-2020.pdf

ERegister / Renewals

3rd: 25 Apr 2022 (From 19/09/2021 To 19/09/2022)
4th: 25 Apr 2022 (From 19/09/2022 To 19/09/2023)
5th: 25 Apr 2022 (From 19/09/2023 To 19/09/2024)
6th: 13 Sep 2024 (From 19/09/2024 To 19/09/2025)
7th: 13 Sep 2024 (From 19/09/2025 To 19/09/2026)
8th: 13 Sep 2024 (From 19/09/2026 To 19/09/2027)
9th: 13 Sep 2024 (From 19/09/2027 To 19/09/2028)