
Grains And Spices Quality Assessment System And Method

Abstract: A grain quality assessment system (102) for grade assessment of grains is provided herein. The grain quality assessment system (102) includes an image acquisition module (202) configured to capture an image of an agriculture produce lying on a predetermined background cloth with a reference object, and a lighting compensation module (204) configured to compensate for variation in lighting during image capturing of the grains, based on an image of the reference object. The grain quality assessment system (102) further includes an image processing module (206) configured to process the captured image to perform object detection, object segmentation, and feature extraction for the grains. The grain quality assessment system (102) further includes an output module (208) configured to generate a grade assessment report about accuracy of size, color, and defect detection of the agriculture produce. Figure of Abstract: FIG. 1


Patent Information

Application #
Filing Date
16 April 2020
Publication Number
43/2021
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
avbreddy9@gmail.com
Parent Application

Applicants

AGRICXLAB PRIVATE LIMITED
B/1403, TWINKLE TOWER DHOKALI BALKUM ROAD DHOKALI, THANE, MAHARASHTRA- 400607

Inventors

1. Dishant Arora
B/1403, TWINKLE TOWER DHOKALI BALKUM ROAD DHOKALI, THANE, MAHARASHTRA- 400607
2. Sri Harsha Konuru
B/1403, TWINKLE TOWER DHOKALI BALKUM ROAD DHOKALI, THANE, MAHARASHTRA- 400607
3. Veebhor Jain
B/1403, TWINKLE TOWER DHOKALI BALKUM ROAD DHOKALI, THANE, MAHARASHTRA- 400607
4. Ritesh Dhoot
B/1403, TWINKLE TOWER DHOKALI BALKUM ROAD DHOKALI, THANE, MAHARASHTRA- 400607
5. Rahul Kumar Chaurasia
B/1403, TWINKLE TOWER DHOKALI BALKUM ROAD DHOKALI, THANE, MAHARASHTRA- 400607

Specification

DESC:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
The Patents Rules, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)

1. TITLE OF THE INVENTION
GRAINS AND SPICES QUALITY ASSESSMENT SYSTEM AND METHOD

2. APPLICANT
(a) NAME: AGRICXLAB PRIVATE LIMITED
(b) NATIONALITY: INDIAN
(c) ADDRESS: B/1403, Twinkle Tower,
Dhokali Balkum Road,
Thane, Maharashtra, India – 400607

3. PREAMBLE TO THE DESCRIPTION
COMPLETE
The following specification particularly describes the invention and the manner in which it is to be performed.
4. DESCRIPTION
FIELD OF THE INVENTION
Embodiments of the present invention generally relate to grading of grains, and in particular to grade assessment of grains by image processing.

BACKGROUND OF THE INVENTION
Optimum value realization for agriculture produce is necessary for farmers and other stakeholders in the agriculture value chain to increase their revenue and profits. Conventional practices of quality inspection, grading, and safety control for agriculture produce rely on manual inspections by buyers and sellers.

However, these manual inspections for grading and quality assessment are labor intensive and time consuming. Further, the accuracy of grading may be jeopardized due to subjective human judgments. Furthermore, it is not always easy for humans to perform fast and accurate grading and quality assessment of the agriculture produce.

Hence, the traditional method of agricultural produce quality assessment is tedious and costly, and is easily influenced by subjective and inconsistent evaluation results. Agricultural produce quality plays a critical role in all food industry quality assessments. Thus, in some conventional methods, computer technologies such as image capturing and transmission have been utilized to construct new machines for agricultural produce quality assessment. For value determination of the agriculture produce, images of the produce are captured at a warehouse and sent to buyers at different locations to determine the price and quality of the produce.

However, even these conventional methods suffer from many disadvantages. First, due to variation in lighting conditions in the warehouse, the image of the grains is also affected and does not present the exact quality of the produce. Further, the agricultural produce is characterized by a wide variety of physical characteristics and features, which are associated with various aspects relating to agriculture, horticulture, environment, geography, climate, and ecology of the crop from which the agriculture produce is derived. These factors result in wide variation in quality of the agriculture produce, which necessitates changes in the associated values of the agriculture produce.

Therefore, there is a need for an improved system and method for grading of agriculture produce which addresses the above disadvantages associated with the conventional methods.

SUMMARY OF THE INVENTION
According to an aspect of the present disclosure, a grain quality assessment system (102) for quality assessment of grains is provided herein. The grain quality assessment system (102) includes an image acquisition module (202) configured to capture an image of the grains lying on a predetermined tray with a reference object, and a lighting compensation module (204) configured to compensate for variation in lighting during image capturing of the grains, based on an image of the reference object. The grain quality assessment system (102) further includes an image processing module (206) configured to process the captured image to perform object detection, object segmentation, and feature extraction for the grains. The grain quality assessment system (102) further includes an output module (208) configured to output a grade assessment report about accuracy of size, color, and defect detection of the agriculture produce.
According to another aspect of the present disclosure, a computer-implemented method for grade assessment of agriculture produce is provided herein. The computer-implemented method includes capturing an image of the grains lying on a predetermined tray with a grid affixed on the top and a reference object, and compensating for variation in lighting during image acquisition of the agriculture produce, based on an image of the reference object. The computer-implemented method further includes processing the captured image to perform object detection, object segmentation, and feature extraction for the agriculture produce, and providing a grade assessment report about accuracy of size, color, and defect detection of the agriculture produce.
The preceding is a simplified summary to provide an understanding of some aspects of embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
Other features of the embodiments will be apparent from the accompanying drawings and from the detailed description that follows.

BRIEF DESCRIPTION OF THE DRAWINGS
The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
FIG. 1 is a block diagram depicting a network environment according to an embodiment of the present invention;
FIG. 2 is a block diagram of modules stored in memory, according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a captured image of agriculture produce (rice grains), according to an embodiment of the present invention;

FIG. 4 is a schematic diagram of a masked image created by using a U-NET model during image processing, according to an embodiment of the present invention;

FIG. 5 is a schematic diagram of a captured image of agriculture produce (Bengal gram), according to an embodiment of the present invention;

FIG. 6 is a schematic diagram of a masked image created by the U-NET model for the Bengal gram during image processing, according to an embodiment of the present invention;

FIG. 7 is a schematic diagram of the captured image after applying distance transform during image processing, according to an embodiment of the present invention;

FIG. 8 is a schematic diagram of the labelling peaks of image during image processing, according to an embodiment of the present invention;

FIG. 9 is a schematic diagram of image after applying watershed algorithm during image processing, according to an embodiment of the present invention; and

FIG. 10 depicts an exemplary flowchart illustrating a grade assessment method of agriculture produce, according to an embodiment of the present invention.

To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures.
The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present invention in any way.

DETAILED DESCRIPTION OF THE INVENTION
As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to.
The phrases “at least one”, “one or more”, and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising”, “including”, and “having” can be used interchangeably.
The term “automatic” and variations thereof, as used herein, refers to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material”.
FIG. 1 illustrates an exemplary network environment (100) where various embodiments of the present invention may be implemented. The network environment (100) includes a grain quality assessment system (102) connected to various electronic devices 104a (mobile), 104b (tablet),...104n, (hereinafter referred to as 104) via a network (106). The network (106) may include, but is not restricted to, a communication network such as the Internet, PSTN, Local Area Network (LAN), Wide Area Network (WAN), Metropolitan Area Network (MAN), and so forth. In an embodiment, the network (106) can be a data network such as the Internet. Further, the messages exchanged between the grain quality assessment system (102) and the mobile devices (104) can comprise any suitable message format and protocol capable of communicating the information necessary for the grain quality assessment system (102) to provide grade assessment of the sample. The mobile devices (104) may be utilized to capture images of the grains and provide the captured images to the grain quality assessment system (102).
In an embodiment of the present invention, the grain quality assessment system (102) may be a computing device. In operation, a user of the mobile device (104) may access the grain quality assessment system (102) to capture an image of the agriculture grains spread on a grid with a reference object. The grain quality assessment system (102) includes a processor (110) and a memory (112). In one embodiment, the processor (110) includes a single processor and resides at the grain quality assessment system (102). In another embodiment, the processor (110) may include multiple sub-processors and may reside at the grain quality assessment system (102) as well as at the mobile device (104).
Further, the memory (112) includes one or more instructions that may be executed by the processor (110) to capture an image of the grains lying on a predetermined tray with a grid affixed on top of the tray and a reference object, process the captured image to perform object detection, object segmentation, and feature extraction for the grains, and generate a report about accuracy of size, color, and defect detection of the agriculture produce. In one embodiment, the memory (112) includes the modules (114), a database (116), and other data (not shown in figure). The other data may include various data generated during processing of the captured images. In one embodiment, the database (116) is stored internal to the grain quality assessment system (102). In another embodiment, the database (116) may be stored external to the grain quality assessment system (102), and may be accessed via the network (106). Furthermore, the memory (112) of the grain quality assessment system (102) is coupled to the processor (110).
Referring to FIG. 2, the modules (114) include an image acquisition module (202), a lighting compensation module (204), an image processing module (206), an output module (208), and a training module (210). The modules (114) are instructions stored in the memory and may process a captured image to facilitate grading of the agriculture produce.
In an embodiment of the present invention, the grains lying on the predetermined tray may include, but are not restricted to, cereal grains such as rice, paddy, wheat, or oats; pulses such as Bengal gram, green gram, or soybean; or spices such as cardamom, pepper, or cumin. The shape of the grains may be cylindrical (rice), ellipsoidal (green gram), spherical (toor dal), kidney-shaped (kidney beans, cashew), or irregular (wheat).

The image acquisition module (202) is configured to capture an image of the agriculture produce lying on a predetermined tray with a reference object. In an embodiment, before capturing the image of the agriculture produce, a sample of the agriculture produce is prepared. The sample is spread on the predetermined tray having a different or contrasting color (for example, black, blue, or red) than the grains. A predetermined grid is affixed on the tray, having a different or contrasting color than the background tray and the agricultural produce, and the size of the grid may be 2 cm X 2 cm.
FIG. 3 illustrates a grid affixed on the tray on which the grains are spread. The shape of the grid may include, but is not restricted to, a geometrical structure such as a square, rectangle, rhombus, circle, or any polygon.
Further, the reference object is kept on the tray. In an embodiment, the reference object facilitates minimizing the distortion due to varied lighting conditions. The reference object has a bright color such as red, blue, or green, or any color contrasting with the grains and the background color. Further, the texture of the reference object is non-reflective to avoid glare, its shape is square, circular, or any symmetrical geometric shape, and its size may be from 10 mm to 50 mm.

In an embodiment, the sample of agriculture produce such as grains is spread on the tray away from the reference object. The sample is spread so that there is no piling of grains. Further, the sample may have a count of around 50-200 units of agriculture produce such as grains. Those skilled in the art will appreciate that the location for image acquisition may be a cold storage, farm, or plant, and does not require a laboratory. After spreading the grains well on the background tray having the reference object, an image of the grains is captured using a camera of a mobile device, as shown in FIG. 3.

Further, the lighting compensation module (204) is configured to compensate for variation in lighting based on the reference object. According to an embodiment of the present invention, the lighting variation is compensated by calculating the degree of variation between the captured color of the reference object and the already known color of the reference object, and this variation in the lighting is applied to all the pixels of the captured image by the lighting compensation module (204).
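The lighting-compensation step described above can be sketched as follows. This is an illustrative NumPy sketch, not the patented implementation: the function name, the use of a per-channel mean, and the multiplicative correction are all assumptions about one reasonable way to apply the reference object's known-versus-observed color variation to every pixel.

```python
import numpy as np

def compensate_lighting(image, ref_region, ref_known_color):
    """Scale every pixel so the reference object's observed color matches
    its known color (illustrative sketch of the lighting compensation).

    image: HxWx3 float array (RGB values 0-255)
    ref_region: the pixels belonging to the reference object in `image`
    ref_known_color: length-3 known RGB color of the reference object
    """
    # Degree of variation: observed mean color of the reference object.
    observed = ref_region.reshape(-1, 3).mean(axis=0)
    # Per-channel correction factor from known vs. observed color.
    gain = np.asarray(ref_known_color, dtype=float) / observed
    # Apply the same correction to all pixels of the captured image.
    return np.clip(image * gain, 0, 255)
```

For example, if dim warehouse lighting makes a known mid-gray reference appear at half its true intensity, every pixel in the image is brightened by the same factor.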

Further, an image processing module (206) is configured to process the captured image and detect objects of the grains. According to an embodiment of the present invention, the image processing module (206) includes various sub-modules to process the captured image and perform object detection for accurate count, color determination, detection of exact boundaries of each object, classification of each object for the defect type and defect variant, and detection of presence of certain chemicals and pathogens for food safety. In an embodiment, the sub-modules of the image processing module (206) include, but are not limited to, an object detection sub-module (206-A), an object segmentation sub-module (206-B), a feature extraction sub-module (206-C), and an object classification sub-module (206-D).

In an embodiment, the object detection sub-module (206-A) may first determine the pixel-to-cm ratio using a marker of known dimension, and then the colorspace is modified by converting BGR (blue, green, and red) into RGB (red, green, and blue).
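The two preparatory steps above can be sketched minimally. The function names are illustrative assumptions; the channel reversal reflects the common case where an image library loads pixels in BGR order and downstream processing expects RGB.

```python
import numpy as np

def pixels_per_cm(marker_px_length, marker_cm_length):
    # A marker of known physical size (e.g. a 2 cm grid cell) measured
    # in pixels gives the scale factor for all later size calculations.
    return marker_px_length / marker_cm_length

def bgr_to_rgb(image):
    # Reverse the channel axis to convert BGR pixel order to RGB.
    return image[..., ::-1]
```

With the scale factor in hand, any pixel measurement (length, width, area) can later be converted to centimeters by dividing by it.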

Further, in an embodiment, the object segmentation sub-module (206-B) may apply a U-NET model to get a masked image, as shown in FIG. 4, and then may modify the micro images produced by the object detection sub-module into masked micro-images. Further, the object segmentation sub-module (206-B) may apply contouring on the masked micro-images. During contouring, the object segmentation sub-module is configured to first apply a distance transform to the mask, as shown in FIG. 7.
Further, the object segmentation sub-module is configured to label each of the peaks with unique values, as shown in FIG. 8. Further, the object segmentation sub-module is configured to apply a watershed algorithm, as shown in FIG. 9. Further, the object segmentation sub-module is configured to use the output of the watershed algorithm to get a different mask having micro-images where the center of a grain is present, and then contouring is applied on the mask.
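The contouring pipeline above (distance transform, peak labelling, watershed) can be sketched roughly as follows. This uses SciPy's `distance_transform_edt`, `label`, and `watershed_ift` as stand-ins for whatever implementation the system uses; the function name, the peak threshold, and the cost construction are illustrative assumptions about the general technique.

```python
import numpy as np
from scipy import ndimage

def split_touching_grains(mask):
    """Separate touching grains in a binary mask (sketch, not the
    patented code): distance transform -> label peaks -> watershed."""
    # 1. Distance transform: each foreground pixel -> distance to background.
    dist = ndimage.distance_transform_edt(mask)
    # 2. Label the peaks (grain centers) with unique values: keep only
    #    the central blob of each grain, then label the blobs.
    peaks = dist > 0.5 * dist.max()
    markers, n_grains = ndimage.label(peaks)
    # 3. Watershed from the labelled peaks over the inverted distance
    #    map, so basins grow outward from each grain center.
    cost = (dist.max() - dist).astype(np.uint16)
    labels = ndimage.watershed_ift(cost, markers.astype(np.int16))
    labels[~mask.astype(bool)] = 0  # keep background as label 0
    return labels, n_grains
```

Contours can then be extracted per label to get one boundary per grain even where grains touch.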

In an embodiment, the feature extraction sub-module (206-C) of the image processing module (206) is configured to calculate height, width, length, breadth, and volume of the agriculture produce. The feature extraction sub-module of the image processing module (206) may first fit a minimum-area rectangle to the contours of the grains image, find the length and breadth using coordinates of the edges, and multiply the obtained length and breadth by the cm-per-pixel ratio to calculate the height and width of each individual object (such as a grain) in the agriculture produce.
Further, the feature extraction sub-module may determine length by calculating the maximum distance between any two points on the contour, and may determine width by calculating the maximum distance between any two points on the contour along the axis perpendicular to the axis of the length. Further, the feature extraction sub-module may calculate volume by using the formula volume = (4/3)*(3.14*length*width*breadth), or according to the shape defined for the produce.
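The length/width definitions above can be sketched directly from contour points; the function names are illustrative, and the brute-force pairwise distance is an assumption (a real implementation would likely use a convex hull or a fitted minimum-area rectangle).

```python
import numpy as np

def grain_dimensions(contour_px, cm_per_px):
    """Length and width of one grain from its contour points (sketch).

    contour_px: (N, 2) array of pixel coordinates on the grain boundary
    cm_per_px: scale factor from the known-size marker
    """
    pts = np.asarray(contour_px, dtype=float)
    # Length: maximum distance between any two points on the contour.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    length_px = d[i, j]
    # Width: maximum extent along the axis perpendicular to the length axis.
    axis = (pts[j] - pts[i]) / length_px
    perp = np.array([-axis[1], axis[0]])
    proj = pts @ perp
    width_px = proj.max() - proj.min()
    return length_px * cm_per_px, width_px * cm_per_px

def ellipsoid_volume(length, width, breadth):
    # Volume formula quoted in the text: (4/3)*(3.14*length*width*breadth).
    return (4.0 / 3.0) * 3.14 * length * width * breadth
```

Note that the quoted formula uses the full length, width, and breadth; the classical ellipsoid formula uses semi-axes, so the two differ by a constant factor depending on the shape model chosen for the produce.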

Further, an object classification sub-module (206-D) of the image processing module (206) may perform classification of each object for the defect type and defect variant, and detection of presence of certain chemicals and pathogens for food safety, by using deep learning classification and an auto-encoder.
Finally, an output module (208) is configured to output a quality assessment report of the grains, having accuracy of size, distribution, and defects of the grain sample. In an embodiment, the output module (208) may provide output in the form of a report about the quality of the agriculture produce, such as size distribution, color distribution, external defect detection, internal defect detection, and chemical and microbial profiling.

In an exemplary report provided by the output module (208) of the grain quality assessment system (102), size distribution may be 99% accurate, color distribution may be 95% accurate, external defect detection may be 95% accurate, internal defect detection may be 95% accurate, chemical and microbial profiling may be 95% accurate. Those skilled in the art will appreciate that images of the grains may be captured in a warehouse, and the grade assessment may be performed at a remote location, and a buyer may utilize the grade assessment to determine the price and quality of the produce.
Further, in an embodiment, the training module (210) is configured to train the image processing module (206) for various analyses of the image. In an embodiment, a standard U-Net architecture may be used by the training module (210). Further, the training images may consist of color photographs of grains spread across the tray. The grains may be well spread or touching each other.
In an embodiment, for each training image, the training module (210) may utilize a mask image to train the U-Net. In an embodiment, the mask image is an image where the grain regions may be marked red, the rest of the objects and the background may be blue, and the border may be green. Before feeding the images into the U-Net algorithm, the training module (210) is configured to split each image into 2X2 equal segments.
In an embodiment, the training module (210) is configured to use the standard U-Net architecture with the training images as input and the masks as target. Further, for training, each image segment may be resized to 1024X1024 pixels. Further, the masks may also be rescaled accordingly. Further, during training, to avoid overfitting, image augmentation may be done by the training module (210) using random horizontal flips, random shift-scale-rotate, and random hue-saturation-value changes.
Further, in an embodiment, each image may be cut into four equal segments, and masks may be predicted for each segment. Finally, the masks may be stitched together by the training module (210) to get the whole mask. In an embodiment, the training module (210) may feed the mask into computer vision (CV) algorithms to calculate the dimensions of each grain.
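The split-into-quadrants and stitch-back steps above can be sketched as follows; the function names are illustrative, and the sketch assumes even image dimensions.

```python
import numpy as np

def split_quadrants(image):
    """Split an image into 2X2 equal segments, as done before feeding
    segments to the U-Net (sketch; assumes even height and width)."""
    h, w = image.shape[:2]
    return [image[:h // 2, :w // 2], image[:h // 2, w // 2:],
            image[h // 2:, :w // 2], image[h // 2:, w // 2:]]

def stitch_quadrants(segs):
    """Stitch four per-segment predicted masks back into the full mask."""
    top = np.concatenate([segs[0], segs[1]], axis=1)
    bottom = np.concatenate([segs[2], segs[3]], axis=1)
    return np.concatenate([top, bottom], axis=0)
```

Stitching the four predicted masks in the same order they were split guarantees the reassembled mask aligns pixel-for-pixel with the original image.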

FIG. 10 illustrates an exemplary flowchart of a grade assessment method of the grains, according to an embodiment of the present invention.
Initially, at step 1202, an image of grains on a tray with a reference object is captured. In an embodiment, a sample of the grains may be spread on a predetermined background tray having a different or contrasting color (for example, black, blue, or red) than the grains. Further, the reference object may have a bright, contrasting color such as red, blue, or green, or any color contrasting with the grains, background tray, and grid.
At step 1204, lighting variation for the captured image is compensated based on the reference object color variation. In an embodiment, the lighting variation may be compensated by calculating a degree of variation in color of the reference object, as captured from the already known color of the reference object, and applying this variation in the lighting to all the pixels of the captured image.
At step 1206, image processing is performed on the captured image to detect objects with boundaries and classification. According to an embodiment of the present invention, the image processing may include object detection, object segmentation, and feature extraction, as explained above.
At step 1208, it is determined whether image processing has been done on all objects in the captured image of the grain sample. If yes, the method proceeds to step 1210. Otherwise, the method returns to step 1206.
At step 1210, an output is provided as a grade assessment report of the grains, having accuracy of size, distribution, color, and defects of the grains. In an embodiment, the generated report may provide size distribution as 99% accurate, color distribution as 95% accurate, external defect detection as 95% accurate, internal defect detection as 95% accurate, and chemical and microbial profiling as 95% accurate. Those skilled in the art will appreciate that buyers, even sitting at remote locations, may utilize the grading to determine the price and quality of the grains.
The grain quality assessment system (102) and the method (1200) performed by the grain quality assessment system (102) advantageously provide quality assessment of the grains for size distribution, color distribution, external defect detection, internal defect detection, and chemical and microbial profiling. Such grade assessment of the grains may be utilized by the buyers to determine the price and quality of the produce. Further, the grain quality assessment system (102) advantageously provides for compensating for variation in lighting during image capturing by utilizing a reference object. Those skilled in the art will appreciate that such compensation of lighting facilitates uniform image capturing of the grains, helps the buyers to determine the quality of the grain sample, and results in better value realization of the agriculture produce for farmers.

The foregoing discussion of the present invention has been presented for purposes of illustration and description. It is not intended to limit the present invention to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the present invention are grouped together in one or more embodiments, configurations, or aspects for the purpose of streamlining the disclosure. The features of the embodiments, configurations, or aspects may be combined in alternate embodiments, configurations, or aspects other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the present invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment, configuration, or aspect.

Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of the present invention.

Moreover, though the description of the present invention has included description of one or more embodiments, configurations, or aspects and certain variations and modifications, other variations, combinations, and modifications are within the scope of the present invention, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative embodiments, configurations, or aspects to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.
CLAIMS:
WE CLAIM

1. A grain quality assessment system (102) comprising:
a processor (110) and a memory (112), the memory (112) storing:
an image acquisition module (202) configured to capture an image of the grains lying on a predetermined background tray with a reference object;
a lighting compensation module (204) configured to compensate for variation in lighting during image capturing of the grains, based on image of the reference object;
an image processing module (206) configured to process the captured image to perform object detection, object segmentation, and feature extraction for the grains;
and an output module (208) configured to output a grade assessment report about accuracy of size, color, and defect detection of the grain sample.

2. The grain quality assessment system (102) as claimed in claim 1, wherein a color of the background tray and grid is different than the color of the grain sample.

3. The grain quality assessment system (102) as claimed in claim 1, wherein a color of the reference object is different than the color of the grains and the color of the background tray and grid.

4. The grain quality assessment system (102) as claimed in claim 1, wherein the shape of the grid includes, but is not restricted to, a geometrical structure such as a square, rectangle, rhombus, circle, or any polygon.

5. The grain quality assessment system (102) as claimed in claim 1, wherein the lighting compensation module (204) is configured to calculate a degree of variation in color of the reference object, based on a known color of the reference object.

6. The grain quality assessment system (102) as claimed in claim 1, wherein the lighting compensation module (204) is configured to apply the degree of variation in color of the reference object to all pixels of the captured image of the grains.

7. The grain quality assessment system (102) as claimed in claim 1, wherein the image processing module (206) further comprises an object detection sub-module to detect objects of the grain sample.

8. The grain quality assessment system (102) as claimed in claim 1, wherein the image processing module (206) further comprises a feature extraction sub-module to calculate height, width, length, breadth, and volume of the grain sample.

9. The grain quality assessment system (102) as claimed in claim 1, wherein the image processing module (206) further comprises an object classification sub-module to perform classification of each object in the agriculture produce for a defect type, a defect variant, detection of chemicals, and pathogens for food safety.

10. The grain quality assessment system (102) as claimed in claim 1, further comprising a training module (210) configured to train the image processing using a U-Net architecture.

11. A computer-implemented method for grade assessment of grains, the computer-implemented method comprising:
capturing an image of the grains lying on a predetermined tray with a grid affixed on top of the tray and a reference object;
compensating for variation in lighting during image acquisition of the grains, based on an image of the reference object;
processing the captured image to perform object detection, object segmentation, and feature extraction for the grains; and
providing a grade assessment report about accuracy of size, color, and defect detection of the grain sample.

Dated this the 16th day of April 2020.

ANUGU VIJAYA BHASKAR REDDY
Agent for the Applicant (IN/PA-2420)

Documents

Application Documents

# Name Date
1 202021016456-PROVISIONAL SPECIFICATION [16-04-2020(online)].pdf 2020-04-16
2 202021016456-FORM FOR STARTUP [16-04-2020(online)].pdf 2020-04-16
3 202021016456-FORM FOR SMALL ENTITY(FORM-28) [16-04-2020(online)].pdf 2020-04-16
4 202021016456-FORM 1 [16-04-2020(online)].pdf 2020-04-16
5 202021016456-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [16-04-2020(online)].pdf 2020-04-16
6 202021016456-EVIDENCE FOR REGISTRATION UNDER SSI [16-04-2020(online)].pdf 2020-04-16
7 202021016456-DRAWINGS [16-04-2020(online)].pdf 2020-04-16
8 202021016456-FORM-26 [07-04-2021(online)].pdf 2021-04-07
9 202021016456-Proof of Right [10-04-2021(online)].pdf 2021-04-10
10 202021016456-FORM 3 [10-04-2021(online)].pdf 2021-04-10
11 202021016456-ENDORSEMENT BY INVENTORS [10-04-2021(online)].pdf 2021-04-10
12 202021016456-DRAWING [10-04-2021(online)].pdf 2021-04-10
13 202021016456-COMPLETE SPECIFICATION [10-04-2021(online)].pdf 2021-04-10
14 202021016456-FORM-26 [13-07-2021(online)].pdf 2021-07-13
15 202021016456-FORM 18 [20-07-2021(online)].pdf 2021-07-20
16 Abstract1.jpg 2021-10-19
17 202021016456-FER.pdf 2022-03-04
18 202021016456-Proof of Right [30-08-2022(online)].pdf 2022-08-30
19 202021016456-OTHERS [30-08-2022(online)].pdf 2022-08-30
20 202021016456-FORM 3 [30-08-2022(online)].pdf 2022-08-30
21 202021016456-FER_SER_REPLY [30-08-2022(online)].pdf 2022-08-30
22 202021016456-DRAWING [30-08-2022(online)].pdf 2022-08-30
23 202021016456-COMPLETE SPECIFICATION [30-08-2022(online)].pdf 2022-08-30
24 202021016456-CLAIMS [30-08-2022(online)].pdf 2022-08-30
25 202021016456-ABSTRACT [30-08-2022(online)].pdf 2022-08-30

Search Strategy

1 0105E_01-03-2022.pdf