Abstract: A SYSTEM AND METHOD FOR ASSESSING TEA LEAVES. The present disclosure describes a system and method for assessing tea leaves. The system includes an enclosure having a top surface (101a) and a bottom surface (101b), wherein a transparent bed (103) is positioned on the top surface (101a) for placing of tea leaves. A lid (109) is connected to the enclosure (101) and movable relative to the transparent bed (103). A plurality of light sources is mounted on a bottom surface of the lid (109) for illuminating the tea leaves, and an image capturing device (105) is provided for capturing an image of the tea leaves. A microcontroller (201) is connected to the image capturing device to control it and to transmit the captured image to a user device (203). A server (211) is connected to the user device (203) for processing the captured image to classify the individual leaves into grades and identify diseases in the individual leaves. Fig. 1, 2
Description: A SYSTEM AND METHOD FOR ASSESSING TEA LEAVES
FIELD
[0001] The embodiments herein generally relate to assessing tea leaves. More particularly, the disclosure relates to an apparatus, system and method for assessing tea leaves and classifying tea leaves.
BACKGROUND AND PRIOR ART
[0002] The process for manufacturing tea involves segregating tea leaves based on their size, shape, quality and multiple other visual features. Tea leaves are classified into different grades like A, B+, B, and C. These grades directly represent the quality of the individual leaf.
[0003] Generally, the tea leaves are manually inspected for determining the grades and identifying presence of any kind of disease such as blister blight, red rust, brown blight, grey blight, twig die back, stem canker, brown root rot disease and red root rot disease. However, the manual process is tedious, time consuming and prone to errors.
[0004] Currently, various devices exist for classifying tea leaves; however, these devices primarily inspect a large quantity of tea leaves at once, leading to inaccuracies and to individual leaves being overlooked. Moreover, the presence of even a few diseased tea leaves due to improper inspection can reduce the overall quality of a given batch of tea leaves. Also, there are no devices for specifically identifying the disease in the tea leaves.
[0005] Therefore, there is a need for an efficient apparatus, system and method of assessing each leaf in a batch of tea leaves. Moreover, there is a need for a system and method of individually classifying tea leaves and identifying diseases in the tea leaves.
OBJECTS
[0006] Some of the objects of the present disclosure are described herein below:
[0007] The main objective of the present disclosure is to provide a system and method for assessing tea leaves by classifying them into A, B+, B and C quality grades and determining the percentage of fine leaf present in a given batch of tea leaves.
[0008] Another objective of the present disclosure is to provide a system and method for automated identification of diseased tea leaves and determination of a percentage of the diseased tea leaves in a batch.
[0009] Still another objective of the present disclosure is to provide a system and method for efficient classification of tea leaves based on individual assessment of each tea leaf.
[00010] Yet another objective of the present disclosure is to provide an economical, compact and automated system and method for assessing quality of tea leaves.
[00011] The other objectives and advantages of the present disclosure will be apparent from the following description when read in conjunction with the accompanying drawings, which are incorporated for illustration of preferred embodiments of the present disclosure and are not intended to limit the scope thereof.
SUMMARY
[00012] In view of the foregoing, an embodiment herein provides a system and method for assessing tea leaves.
[00013] In accordance with an embodiment, the system comprises the following. An enclosure includes a top surface and a bottom surface, wherein a transparent bed is positioned on the top surface for placing of tea leaves. A lid is connected to the enclosure at the top surface and is movable relative to the transparent bed. A plurality of light sources is mounted on a bottom surface of the lid for illuminating the tea leaves. An image capturing device is provided at the bottom surface of the enclosure for capturing an image of the tea leaves. A microcontroller is connected to the image capturing device for controlling the image capturing device and transmitting the captured image to a user device, wherein the microcontroller is configured to control the image capturing device based on input received from the user device. A server is connected to the user device for processing the captured image to classify the individual leaves into grades and identify diseases in the individual leaves.
[00015] In an embodiment, an image processing engine is hosted in the server for analyzing individual pixels of individual leaves to classify them into grades including A, B+, B and C, and to identify diseases including blister blight, red rust, brown blight, grey blight, twig die back, stem canker, brown root rot disease and red root rot disease.
[00015] In an embodiment, a processor is configured in the server for determining a percentage of leaves of different grades, percentage of fine leaves and coarse leaves, percentage of disease in each leaf and percentage of diseased leaves based on the processing.
[00016] In an embodiment, the processor of the server is configured for identifying diseases of each tea leaf by magnifying the individual tea leaf in the captured image to a microscopic level.
[00017] In an embodiment, the microcontroller is connected to the user device through a first communication network including Bluetooth for receiving the input and transmitting the captured images; and the user device is connected to the server through a second communication network including Wi-Fi for transmitting the captured images and receiving processed data including percentage of tea leaves of different grades, percentage of fine leaves and coarse leaves, percentage of disease in each leaf, percentage of tea leaves with diseases and identified diseases of the tea leaves.
[00018] In an embodiment, a user interface is provided in the user device, configured for receiving the input for controlling operation of the image capturing device from a user and displaying percentage of tea leaves of different grades, percentage of fine leaves and coarse leaves, percentage of disease in each leaf, percentage of tea leaves with diseases and identified diseases of the tea leaves.
[00019] In accordance with an embodiment, the method for assessing tea leaves comprises the steps of: positioning tea leaves on a transparent bed of an enclosure; illuminating the tea leaves, by a light source; controlling an image capturing device for capturing an image of the illuminated tea leaves, by a microcontroller; capturing an image of the illuminated tea leaves, by the image capturing device; processing individual tea leaves in the captured image, by a server; and classifying the individual tea leaves into different grades and identifying diseases in each of the tea leaves based on the processing, by the server.
[00020] In an embodiment, the microcontroller is connected to a user device through a first communication network for receiving input to control the image capturing device and transmitting captured images to the user device and the user device is connected to the server through a second communication network for transmitting the captured images to the server and receiving processed data including percentage of tea leaves of different grades, percentage of fine leaves and coarse leaves, percentage of disease in each leaf, percentage of tea leaves with diseases and identified diseases of the tea leaves.
[00021] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF DRAWINGS
[00022] The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
[00023] Fig.1 illustrates a schematic of an apparatus for assessing tea leaves, according to an embodiment herein;
[00024] Fig.2 illustrates a block diagram of a system for assessing tea leaves, according to an embodiment herein; and
[00025] Fig.3 illustrates a flow chart of a method for assessing tea leaves, according to an embodiment herein.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[00026] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[00027] As mentioned above, there is a need for an efficient apparatus, system and method of assessing each leaf in a batch of tea leaves. In particular, there is a need for a system and method of individually classifying tea leaves and identifying diseases in the tea leaves. The embodiments herein achieve this by providing "A system and method for assessing tea leaves". Referring now to the drawings and more particularly to Fig.1 to Fig.3, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[00028] Fig. 1a illustrates a schematic of graded tea leaves on a tea leaf branch. The tea leaves are graded into A, B+, B and C based on their position on the stem and their stage, including bud and/or leaf.
[00029] Fig.1b illustrates a schematic of an apparatus for assessing tea leaves. The apparatus includes an enclosure 101, a top surface 101a, a bottom surface 101b, a transparent bed 103, an image capturing device 105, a control unit 107, a light source and a lid 109. The enclosure 101 includes a top surface 101a and a bottom surface 101b.
[00030] In an embodiment, the image capturing device 105 and the control unit 107 are placed at the bottom surface 101b inside the enclosure. In an embodiment, the transparent bed 103 is placed on the top surface 101a of the enclosure 101. The transparent bed 103 is provided for placing of tea leaves. The tea leaves are spread on the transparent bed without overlap.
[00031] The transparent bed 103 is made of a material including but not limited to glass, wherein the transparent bed 103 facilitates a clear view of the tea leaves from both the top and the bottom.
[00032] In an embodiment, the lid 109 is connected to the enclosure 101 at the top surface 101a. The lid 109 is hinged to the top surface 101a of the enclosure 101, wherein the lid 109 is movable relative to the transparent bed 103. In an embodiment, the light sources are mounted on a bottom surface of the lid 109 for illuminating tea leaves positioned on the transparent bed 103. A diffusion sheet is fixed over the light sources at the bottom surface of the lid 109 for diffusing the light emitted from the light sources and illuminating the tea leaves evenly. In an embodiment, the diffusion sheet provides a clear background to the tea leaves, thereby eliminating noise.
[00033] In an embodiment, the light sources are switched on when the lid 109 is in a closed position, placed over the transparent bed 103. The light sources include but are not limited to LED lights.
[00034] In an embodiment, the image capturing device 105 is provided at a bottom surface 101b of the enclosure 101 for capturing an image of the illuminated tea leaves through the transparent bed 103. The illumination of the tea leaves enhances details of the tea leaves in the image captured by the image capturing device 105.
[00035] The image capturing device 105 includes but is not limited to a camera. In an embodiment, the image capturing device 105 is connected to the control unit 107. The control unit 107 includes but is not limited to a microcontroller. The image capturing device 105 transmits the captured image of the illuminated tea leaves to the microcontroller of the control unit 107.
[00036] In an embodiment, dimensions of the enclosure 101 are based on parameters of the lens, focal length and field of view of the image capturing device 105, for capturing a clear image of the tea leaves spread on the transparent bed 103.
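The relation between bed size, field of view and enclosure dimensions described above can be sketched with simple pinhole geometry: the camera must sit far enough from the bed that half the bed width falls within half the horizontal field-of-view angle. The bed width and field-of-view angle below are illustrative assumptions, not values from the disclosure:

```python
import math

def camera_distance(bed_width_m: float, fov_deg: float) -> float:
    """Minimum lens-to-bed distance so the field of view spans the bed width.

    Simple pinhole-geometry estimate: d = (W / 2) / tan(theta / 2), where W
    is the bed width and theta the horizontal field-of-view angle.
    """
    return (bed_width_m / 2) / math.tan(math.radians(fov_deg) / 2)

# Illustrative values (assumptions): a 30 cm wide transparent bed and a
# 62.2 degree horizontal field of view give a distance of roughly 25 cm.
distance = camera_distance(0.30, 62.2)
```

This mirrors the testing of different focal lengths mentioned later in paragraph [00038]: a wider bed or a narrower field of view pushes the camera further from the transparent bed, which in turn sets the enclosure height.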
[00037] In an embodiment, a power source is connected to the control unit 107 for supplying power to components of the apparatus, including the light sources and the image capturing device. The power source includes but is not limited to a battery and an external power supply connected through a wire.
[00038] In an exemplary embodiment, the image capturing device 105 is a Pi camera. In an exemplary embodiment, the control unit is a Raspberry Pi board. The Pi camera can be interfaced to the Raspberry Pi board using a CSI cable. The Pi camera is positioned inside the enclosure 101 based on an optical focal length, wherein the optical focal length is determined by testing different focal lengths based on the field of view of the lens of the Pi camera. The transparent bed 103 can accommodate around 100 leaves, facilitating the capture of a clear image of each of the tea leaves.
[00039] Fig. 2 illustrates a block diagram of a system for assessing tea leaves. The system comprises the apparatus 100 including the control unit 107 and the microcontroller 201, a user device 203 and a server 211. In an embodiment, the microcontroller 201 is connected to the user device 203 through a first communication network. The first communication network includes but is not limited to Bluetooth. In an embodiment, the user device 203 is connected to the server 211 through a second communication network. The second communication network includes but is not limited to Wi-Fi.
[00040] In an embodiment, the user device 203 includes a processor 205, a memory 207 and a user interface 209. In an embodiment, the memory 207 is provided for storing instructions and the processor 205 is provided for executing the instructions stored in the memory. The instructions include connecting to the microcontroller 201 through the first communication network, connecting to the server 211 through the second communication network, receiving captured images of the tea leaves from the microcontroller 201, saving the captured images, transmitting input received from a user to the microcontroller 201 and transmitting the captured images to the server 211.
[00041] In an embodiment, the user interface 209 can be configured for receiving input from a user and displaying processed data received from the server 211. The input can be commands including but not limited to controlling operation of the image capturing device 105 for capturing an image of the tea leaves.
[00042] In an embodiment, the user device 203 includes but is not limited to a smartphone, a laptop or any electronic communication device.
[00043] In an embodiment, the server 211 includes a processor 213, a memory 215 and a database 217. In an embodiment, the memory can be configured for storing instructions and the processor 213 can be configured for executing the stored instructions. In an embodiment, the instructions can include but are not limited to connecting to the user device 203 through the second communication network and receiving captured images from the user device, processing the captured images and transmitting processed data based on the processing to the user device.
[00044] In an embodiment, an image processing engine 217 is hosted on the server 211. The image processing engine 217 includes a model trained for identifying grades and diseases in tea leaves. The model is trained using a plurality of images of identified and annotated tea leaves. The image processing engine 217 uses the trained model for identifying grades and diseases in the tea leaves. The trained model, including weights and configurations associated with the grades and diseases of the tea leaves, is hosted by the image processing engine on the server 211 using an API.
[00045] In a preferred embodiment, the trained model is a Mask R-CNN model, wherein weights and configurations of the model are locally saved as torch data files. A Flask API loads the model, processes the incoming requests, and returns the percentage of tea leaves in different grades.
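A minimal sketch of such a serving endpoint is given below, assuming Flask is available. The Mask R-CNN inference itself is replaced by a stub (`run_model_stub`), and the route name `/assess` is a hypothetical choice, not taken from the disclosure:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_model_stub(image_bytes):
    # Stand-in for the trained Mask R-CNN: a real implementation would load
    # the locally saved torch weight/configuration files and return one
    # predicted grade per leaf detected in the incoming image.
    return ["A", "B+", "B", "A", "C"]

@app.route("/assess", methods=["POST"])
def assess():
    # Receive an image, classify each leaf, and return the percentage of
    # tea leaves in each grade, as described for the Flask API above.
    grades = run_model_stub(request.data)
    total = len(grades)
    return jsonify({g: round(100 * grades.count(g) / total, 1)
                    for g in ("A", "B+", "B", "C")})
```

With the stubbed detections, a POST to `/assess` would report 40% grade A and 20% each for B+, B and C.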
[00046] In operation, the tea leaves are spread and placed on the transparent bed with minimal overlap, by a user. The microcontroller 201 of the apparatus 100 is connected to the processor 205 of the user device, through the first communication network. Input received from the user through the user interface 209 is transmitted by the user device 203 to the microcontroller 201 for operating the image capturing device 105. In an embodiment, the plurality of light sources provided in the apparatus is switched ON for illuminating the tea leaves placed on the transparent bed.
[00047] The microcontroller 201 controls the image capturing device 105 for capturing an image of illuminated tea leaves placed on the transparent bed 103 of the apparatus. The captured image is transmitted by the microcontroller 201 to the user device 203, through the first communication network. The processor 205 of the user device 203 is connected to the server 211 through the second communication network. The processor 205 of the user device 203 transmits the captured image to the server 211 for processing.
[00048] The image processing engine 217 in the server 211 processes the captured image using the processor 213. In an embodiment, the image processing engine 217 analyses each individual pixel of the captured image for identifying the grades and classifying each leaf based on the trained model. Then, the image processing engine 217 detects and identifies the disease in each tea leaf in the captured image. In an embodiment, the image processing engine 217 magnifies each tea leaf to a microscopic level for determining the percentage of the disease in an individual tea leaf.
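The per-leaf disease percentage described above can be illustrated with a small helper. The 0/1 mask representation and the function name are assumptions for illustration, not the disclosed implementation:

```python
def disease_percentage(leaf_mask, disease_mask):
    """Percentage of a leaf's pixels that are flagged as diseased.

    Both arguments are equal-sized 2-D grids of 0/1 values: leaf_mask marks
    pixels belonging to the leaf, disease_mask marks pixels the model flags
    as diseased (an illustrative representation of per-pixel analysis).
    """
    leaf_pixels = sum(v for row in leaf_mask for v in row)
    diseased = sum(l and d
                   for leaf_row, dis_row in zip(leaf_mask, disease_mask)
                   for l, d in zip(leaf_row, dis_row))
    # Guard against an empty mask to avoid division by zero.
    return 100 * diseased / leaf_pixels if leaf_pixels else 0.0
```

For example, a leaf covering four pixels with one diseased pixel yields a 25% disease percentage for that leaf.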
[00049] Finally, the processor 213 determines the percentage of tea leaves of different grades, percentage of fine leaves and coarse leaves, percentage of disease in each leaf, percentage of tea leaves with diseases and identified diseases of the tea leaves. In an embodiment, the percentage of fine and coarse tea leaves is determined based on the identified percentages of grades of the tea leaves. A first set of selected grades of the tea leaves corresponds to fine leaves and a second set of selected grades corresponds to coarse leaves. The first selected grades and the second selected grades are set by the user in the server 211 for determining the percentage of fine leaves and coarse leaves.
[00050] The processor 213 determines the percentage of tea leaves in different grades based on a number of tea leaves in different grades. The processed data including percentage of different grades of tea leaves, percentage of fine and coarse tea leaves, percentage of leaves with diseases, percentage of disease in each leaf and identified diseases in the tea leaves are transmitted by the processor 213 of the server 211 to the user device 203. The user interface 209 of the user device 203 displays the processed data.
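The grade, fine and coarse percentages determined by the processor can be sketched as follows. The function name and the default fine-grade set are illustrative assumptions; the disclosure leaves the selected grades to be set by the user:

```python
def summarize_grades(grades, fine_grades=frozenset({"A", "B+"})):
    """Per-grade percentages plus the fine/coarse split.

    `grades` holds one grade per assessed leaf; `fine_grades` is the
    user-selected set of grades counted as fine leaves (all remaining
    grades count as coarse leaves).
    """
    total = len(grades)
    per_grade = {g: round(100 * grades.count(g) / total, 1)
                 for g in ("A", "B+", "B", "C")}
    fine = round(100 * sum(1 for g in grades if g in fine_grades) / total, 1)
    return per_grade, fine, round(100 - fine, 1)
```

For five leaves graded A, B+, B, C, A with fine grades {A, B+}, this yields 40% grade A, a 60% fine-leaf share and a 40% coarse-leaf share.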
[00051] Fig. 3 illustrates a method of assessing tea leaves. The method includes the following steps. First, positioning 301 tea leaves on a transparent bed of an enclosure. Then, illuminating 302 the tea leaves, by a light source. Next, controlling 303 an image capturing device for capturing an image of the illuminated tea leaves, by a microcontroller connected to the image capturing device. Then, capturing 304 an image of the illuminated tea leaves, by the image capturing device. Next, transmitting the captured image to a user device, by the microcontroller connected to the image capturing device, and transmitting the captured image to a server, by the user device. Then, processing 305 individual tea leaves in the captured image, by the server, and classifying 306 the individual tea leaves into different grades and identifying diseases in each of the tea leaves based on the processing, by the server.
[00052] Experimental data:
The percentage of fine leaves in a given batch of tea leaves was determined at 34 centres, both manually and using the apparatus for assessing tea leaves, according to an embodiment herein. The percentages are provided in Table 1 herein below.
Testing Centres MANUAL % APPARATUS %
Centre 1 46 58
Centre 2 35 56
Centre 3 47 49
Centre 4 48 75
Centre 5 35 45
Centre 6 43 64
Centre 7 40 70
Centre 8 40 48
Centre 9 48 51
Centre 10 36 68
Centre 11 30 66
Centre 12 42 82
Centre 13 48 60
Centre 14 30 56
Centre 15 50 74
Centre 16 50 60
Centre 17 57 68
Centre 18 43 75
Centre 19 52 78
Centre 20 47 54
Centre 21 46 63
Centre 22 39 48
Centre 23 35 77
Centre 24 58 62
Centre 25 35 75
Centre 26 45 65
Centre 27 40 71
Centre 28 41 58
Centre 29 30 62
Centre 30 57 67
Centre 31 45 55
Centre 32 40 76
Centre 33 59 61
Centre 34 57 70
Table 1
[00053] As illustrated in Table 1, the percentage of fine tea leaves identified from a given batch is consistently higher when using the apparatus compared to the manual method. Thereby, the apparatus facilitates accurate and faster assessment of tea leaves.
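The gap visible in Table 1 can be summarized numerically. The snippet below reproduces the table's two columns and computes their sample means:

```python
# Fine-leaf percentages from Table 1 (34 testing centres).
manual = [46, 35, 47, 48, 35, 43, 40, 40, 48, 36, 30, 42, 48, 30, 50, 50,
          57, 43, 52, 47, 46, 39, 35, 58, 35, 45, 40, 41, 30, 57, 45, 40,
          59, 57]
apparatus = [58, 56, 49, 75, 45, 64, 70, 48, 51, 68, 66, 82, 60, 56, 74,
             60, 68, 75, 78, 54, 63, 48, 77, 62, 75, 65, 71, 58, 62, 67,
             55, 76, 61, 70]

mean_manual = sum(manual) / len(manual)          # about 43.9 %
mean_apparatus = sum(apparatus) / len(apparatus)  # about 63.7 %
```

On average the apparatus reports roughly twenty percentage points more fine leaf per batch than manual inspection across the 34 centres.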
[00054] A main advantage of the present disclosure is that the system and method provide for assessing tea leaves.
[00055] Another advantage of the present disclosure is that the system and method provide for classifying individual tea leaves and identifying diseases in them.
[00056] Still another advantage of the present disclosure is that the system and method provide for assessing tea leaves and classifying them based on quality as fine and coarse leaves.
[00057] Still another advantage of the present disclosure is that the system and method provide an economical, compact and accurate means of classifying tea leaves and identifying diseases in them.
[00058] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
Claims: We Claim:
1. A system for assessing tea leaves, comprising:
an enclosure including a top surface (101a) and a bottom surface (101b);
wherein a transparent bed (103) is positioned on the top surface (101a) and provided for placing of tea leaves;
a lid (109) connected to the enclosure (101) at the top surface (101a) and movable relative to the transparent bed (103);
a plurality of light sources mounted on a bottom surface of the lid (109) for illuminating the tea leaves;
an image capturing device (105) provided at the bottom surface (101b) of the enclosure (101) for capturing an image of the tea leaves;
a microcontroller (201) connected to the image capturing device (105), provided for controlling the image capturing device (105) and transmitting the captured image to a user device (203);
wherein the microcontroller (201) configured for controlling the image capturing device (105) based on input received from the user device (203); and
a server (211) connected to the user device (203) for processing the captured image to classify the individual leaves into grades and identify diseases in the individual leaves.
2. The system as claimed in claim 1, wherein an image processing engine (217) is hosted in the server (211) for analyzing individual pixels of individual leaves to classify them into grades including A, B+, B, C and identify diseases including blister blight, red rust, brown blight, grey blight, twig die back, stem canker, brown root rot disease and red root rot.
3. The system as claimed in claim 1, wherein a processor (213) is configured in the server (211) for determining a percentage of leaves of different grades, percentage of fine leaves and coarse leaves, percentage of diseased leaves and percentage of disease in the individual leaf based on the processing.
4. The system as claimed in claim 1, wherein the processor (213) of the server (211) is configured for identifying percentage of disease in each tea leaf by magnifying the individual tea leaf in the captured image to a microscopic level.
5. The system as claimed in claim 1, wherein the microcontroller (201) is connected to the user device (203) through a first communication network including Bluetooth for receiving the input and transmitting the captured images; and
wherein the user device (203) is connected to the server (211) through a second communication network including Wi-Fi for transmitting the captured images and receiving processed data including percentage of tea leaves of different grades, percentage of disease in each leaf, percentage of leaves with diseases, percentage of fine leaves and coarse leaves and identified diseases of the tea leaves.
6. The system as claimed in claim 1, wherein a user interface (209) is provided in the user device (203), configured for:
receiving the input for controlling operation of the image capturing device (105) from a user; and
displaying percentage of tea leaves of different grades, percentage of fine leaves and coarse leaves, percentage of tea leaves with diseases, percentage of disease in each leaf and identified diseases of the tea leaves.
7. A method for assessing tea leaves, comprising the steps of:
positioning (301) tea leaves on a transparent bed of an enclosure;
illuminating (302) the tea leaves, by a light source;
controlling (303) an image capturing device for capturing an image of the illuminated tea leaves, by a microcontroller;
capturing (304) an image of the illuminated tea leaves, by the image capturing device;
processing (305) individual tea leaves in the captured image, by a server; and
classifying (306) the individual tea leaves into different grades and identifying diseases in each of the tea leaves based on the processing, by the server.
8. The method as claimed in claim 7, wherein the microcontroller is connected to a user device through a first communication network for receiving input to control the image capturing device and transmitting captured images to the user device; and
wherein the user device is connected to the server through a second communication network for transmitting the captured images to the server and receiving processed data including percentage of tea leaves of different grades, percentage of fine leaves and coarse leaves, percentage of tea leaves with diseases, percentage of disease in each leaf and identified diseases of the tea leaves.
| # | Name | Date |
|---|---|---|
| 1 | 202241049965-COMPLETE SPECIFICATION [01-09-2022(online)].pdf | 2022-09-01 |
| 2 | 202241049965-DRAWINGS [01-09-2022(online)].pdf | 2022-09-01 |
| 3 | 202241049965-FORM 1 [01-09-2022(online)].pdf | 2022-09-01 |
| 4 | 202241049965-DECLARATION OF INVENTORSHIP (FORM 5) [01-09-2022(online)].pdf | 2022-09-01 |
| 5 | 202241049965-POWER OF AUTHORITY [01-09-2022(online)].pdf | 2022-09-01 |
| 6 | 202241049965-STATEMENT OF UNDERTAKING (FORM 3) [01-09-2022(online)].pdf | 2022-09-01 |
| 7 | 202241049965-FORM-9 [06-09-2022(online)].pdf | 2022-09-06 |
| 8 | 202241049965-FORM 18A [07-09-2022(online)].pdf | 2022-09-07 |
| 9 | 202241049965-FORM28 [07-09-2022(online)].pdf | 2022-09-07 |
| 10 | 202241049965-MSME CERTIFICATE [07-09-2022(online)].pdf | 2022-09-07 |
| 11 | 202241049965E_12-09-2022.pdf | |
| 12 | 202241049965-FER.pdf | 2022-09-13 |
| 13 | 202241049965-CLAIMS [29-11-2022(online)].pdf | 2022-11-29 |
| 14 | 202241049965-FER_SER_REPLY [29-11-2022(online)].pdf | 2022-11-29 |
| 15 | 202241049965-OTHERS [29-11-2022(online)].pdf | 2022-11-29 |
| 16 | 202241049965-US(14)-HearingNotice-(HearingDate-09-03-2023).pdf | 2022-12-01 |
| 17 | 202241049965-Correspondence to notify the Controller [28-02-2023(online)].pdf | 2023-02-28 |
| 18 | 202241049965-Annexure [09-03-2023(online)].pdf | 2023-03-09 |
| 19 | 202241049965-Response to office action [09-03-2023(online)].pdf | 2023-03-09 |
| 20 | 202241049965-PatentCertificate15-03-2023.pdf | 2023-03-15 |
| 21 | 202241049965-IntimationOfGrant15-03-2023.pdf | 2023-03-15 |