
Seed Germination Evaluation System And Method Thereof

Abstract: The present invention provides an automated seedling evaluation system (100) that evaluates germination percentage for seeds/seedlings and categorizes the seeds/seedlings into normal, abnormal, and other types of seeds/seedlings. The automated seedling evaluation system (100) includes a seed germination evaluation system (102) that receives an image of the seeds/seedlings, identifies features in the image, identifies one or more seeds/seedlings based on the identified features, labels the regions of interest, determines one or more bounding boxes around the identified seeds/seedlings based on the labels, calculates confidence scores for the bounding boxes, and determines a germination percentage based on the seed/seedling featured inside the bounding boxes and the corresponding confidence scores. The identified features also facilitate estimation of vigor index (a combination of average seedling length and germination percentage) and uniformity of the seeds/seedlings. The seed germination evaluation system (102) operates independently without requiring any manual intervention. The seed germination evaluation system (102) provides consistent and reliable results with improved accuracy. Reference Figure: FIG. 1


Patent Information

Filing Date: 11 May 2023
Publication Number: 46/2024
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

Mahyco Private Limited
Raj Mahal, 84 Veer Nariman Road, Churchgate, Mumbai – 400 020, Maharashtra, India

Inventors

1. Smitha V. Kurup
Flat No. 2BHK-B Mahyco Housing Colony, Dawalwadi, Jalna- 431203, Maharashtra, India
2. S. D. Magadi
#126, Behind Hanuman Temple, Shirahatti Taluk, Gadag District, Bellatti Post, Karnataka 582112, India
3. Abhijeet B. Shillak
Swarajya, Plot No. 38, Tornagadnagar, CIDCO N-2, opp Dhoot Hospital, Aurangabad-431006, Maharashtra, India
4. Bharat R. Char
RH -2D, Mahyco Housing Colony, Dawalwadi, Jalna- 431203, Maharashtra, India
5. Prasad Kumar H.M.
1BHK -C , Mahyco Housing Colony, Dawalwadi, Jalna- 431203, Maharashtra, India

Specification

DESC:FIELD OF INVENTION
[0001] The present invention relates generally to intelligent image processing techniques and specifically to automated seed/seedling evaluation.

BACKGROUND
[0002] In agriculture and gardening, seeds are sown in soil to grow crops, but not all the seeds grow into crops. The seeds that fail to germinate can impact the final yield of the crop. To assess seed quality or viability and to predict performance of the seed and seedling in the field, it is essential to carry out a seed germination test. The ultimate aim of testing germination is to obtain information about the planting value of the seed sample and, by inference, the quality of the seed lot. A germination test determines the percentage of seeds that are alive in any seed lot and is the most widely used index of seed quality. The level of germination, in association with seed/seedling vigor, provides a very good estimate of potential field performance.
[0003] In the most commonly performed seed germination test, experts are appointed to manually estimate the germination percentage. However, this is a tedious and time-consuming process. The reliability of the method depends on the skill of the individuals and is prone to human error; hence, the estimated germination percentage may deviate by a few percentage points from the actual germination percentage of the seeds. Testing skills differ from person to person, and since multiple persons are employed to test the seeds and determine germination percentages over the course of time, the testing process lacks consistency and reliability.
[0004] Seed manufacturers often print the germination percentage on bags of seeds. For this, the germination tests are performed on a large scale on the entire lot of seeds. For such large scale testing, the amount of manual labor required for examination of the seeds is enormous and the process of testing is very time consuming.
[0005] In another conventional seed germination test, a batch of seeds is chosen from a bigger lot of seeds and is tested manually for germination. The result is then extrapolated to estimate the germination percentage of the entire lot.
[0006] Further, seed vigor assessed through seedling length measurement holds significant importance in the seed industry. Seedling length is a direct indicator of the seed's ability to germinate and establish healthy seedlings quickly. Longer seedlings suggest higher energy reserves and better physiological vigor, enabling them to emerge from the soil and establish a strong root system early in the growing process. Conventional approaches for seedling length measurement are widely used in the seed industry due to their simplicity and accessibility; however, they have limitations related to subjectivity, small and non-random sampling, human error, labor intensity, and throughput capacity. As a result, there is growing interest in integrating automated systems and digital imaging technologies to overcome these limitations and improve the efficiency and accuracy of seed vigor assessment.
[0007] Thus, there is a need for a reliable germination test that provides quick and accurate germination results including seedling vigor and uniformity to facilitate vital decision making for seed dispatch based on market needs.

SUMMARY
[0008] This summary is provided to introduce concepts related to a seed germination evaluation system and a seed germination evaluation method.
[0009] This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.
[0010] In an embodiment of the present invention, a seed germination evaluation system is provided. The seed germination evaluation system includes an image capturing unit, a light source, and an evaluation unit. The image capturing unit is coupled to the light source and in combination with the light source is configured to capture images of a plurality of seed/seedlings placed on one or more germination sheets which are placed on a conveyor belt. The evaluation unit includes an image processing unit, an image labelling unit, a feature extraction unit, and a seed classification unit. The image processing unit is configured to receive an image indicative of the plurality of seeds/seedlings. The image processing unit identifies one or more features in the received image. The image processing unit generates an enhanced image including the identified features. The image labelling unit is configured to receive the enhanced image. The image labelling unit identifies one or more seeds/seedlings based on the identified features. The image labelling unit determines one or more regions of interest in the enhanced image. The image labelling unit labels the regions of interest to generate a labelled image. The feature extraction unit is configured to receive the labelled image. The feature extraction unit determines one or more bounding boxes around the identified seeds/seedlings in the labelled image. The feature extraction unit calculates confidence scores for the bounding boxes. The seed classification unit is configured to determine a germination percentage based on the seed/seedling featured inside the bounding boxes and the corresponding confidence scores.
[0011] In another embodiment of the present invention, a seed germination evaluation method is provided. The seed germination evaluation method includes capturing images of a plurality of seed/seedlings placed on one or more germination sheets, wherein the one or more germination sheets are placed on a conveyor belt, identifying one or more features in a received image indicative of a plurality of seeds/seedlings by an image processing unit. The method includes generating an enhanced image including the identified features by the image processing unit. The method includes identifying one or more seeds/seedlings in the enhanced image based on the identified features by an image labelling unit. The method includes determining one or more regions of interest in the enhanced image by the image labelling unit. The method includes labelling the regions of interest to generate a labelled image by the image labelling unit. The method includes determining one or more bounding boxes around the identified seeds/seedlings in the labelled image by a feature extraction unit. The method includes calculating confidence scores for the bounding boxes by the feature extraction unit. The method includes determining a germination percentage based on the seed/seedling featured inside bounding boxes and the corresponding confidence scores by a seed classification unit.
[0012] In an exemplary embodiment, the seed germination evaluation system is in communication with a seed database, and wherein the seed database stores features of different classes of seeds/seedlings.
[0013] In another exemplary embodiment, the feature extraction unit, in a training phase, is configured to identify the features from the labelled image using Convolution Neural Network (CNN) techniques. The feature extraction unit trains a machine learning model based on the identified features and the corresponding labels, and the features of the different classes of seeds/seedlings. The feature extraction unit stores the features, the labels, and the corresponding classes of seeds/seedlings in a memory.
[0014] In another exemplary embodiment, the seed germination evaluation system includes a probability calculation unit, and the probability calculation unit along with the seed classification unit, in training phase, are configured to calculate an error in the identified features and classes of seeds/seedlings. The probability calculation unit minimizes the error using the CNN techniques. The probability calculation unit compares the error with a predetermined threshold error. The seed classification unit provides the identified features, classes, bounding boxes, and the confidence scores for the bounding boxes when the error is less than the predetermined threshold error.
[0015] In another exemplary embodiment, the features identified by the feature extraction unit also facilitate estimation of vigor index and uniformity of the seed/seedlings.
[0016] In another exemplary embodiment, the seed germination evaluation system includes a weight calculation unit configured to receive the identified features, classes, bounding boxes, and the confidence scores for the bounding boxes from the feature extraction unit. The weight calculation unit calculates one or more weights for the identified features and classes. The weight calculation unit stores the weights in the memory.
[0017] In another exemplary embodiment, the image labelling unit is configured to sample the enhanced image into NxN grid.
[0018] In another exemplary embodiment, the image processing unit is configured to filter noise from the image.

BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0019] The detailed description is described with reference to the accompanying figures.
[0020] FIG. 1 illustrates a schematic architectural diagram of an automated seedling evaluation system in accordance with an embodiment of the present invention.
[0021] FIG. 2 illustrates a schematic block diagram of a seed germination evaluation system in accordance with an embodiment of the present invention.
[0022] FIG. 3 illustrates a flowchart for a method of automated seedling evaluation in accordance with an embodiment of the present invention.
[0023] FIG. 4 illustrates a flowchart for a method of automated seedling evaluation in accordance with an embodiment of the present invention.
[0024] FIG. 5 illustrates a flow diagram for training and testing methods of automated seedling evaluation in accordance with an embodiment of the present invention.
[0025] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present invention. Similarly, it will be appreciated that any flow chart, flow diagram, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.

DETAILED DESCRIPTION
[0026] The various embodiments of the present invention provide a seed germination evaluation system and a seed germination evaluation method.
[0027] In the following description, for purpose of explanation, specific details are set forth in order to provide an understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these details.
[0028] One skilled in the art will recognize that embodiments of the present invention, some of which are described below, may be incorporated into a number of systems.
[0029] However, the systems and methods are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present invention and are meant to avoid obscuring of the present invention.
[0030] Furthermore, connections between components and/or modules within the figures are not intended to be limited to direct connections. Rather, these components and modules may be modified, re-formatted or otherwise changed by intermediary components and modules.
[0031] The appearances of the phrase “in an embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
[0032] Referring now to FIG. 1, a schematic architectural diagram of an automated seedling evaluation system (100) is shown in accordance with an embodiment of the present invention. The automated seedling evaluation system (100) includes a seed germination evaluation system (102), a communication network (104), a seed database (106), and a cloud server (108). The seed germination evaluation system (102) includes an evaluation unit (110) connected to a light source (112) and an image capturing unit such as a camera (114). Seeds/Seedlings are placed on one or more germination sheets (116). Optionally, the germination sheets (116) may be placed on a conveyor belt (118).
[0033] The automated seedling evaluation system (100) evaluates the germination percentage for the seeds and categorizes the seeds into normal, abnormal, and other types of seedlings.
[0034] The conveyor belt (118) may be operated suitably such that the camera (114) obtains sufficient number of images of each germination sheet (116). The germination sheet (116) includes one or more seeds/seedlings that may be at any stage of the germination process or may even be non-germinated seeds/seedlings. The germination sheet may be a conventional germination paper or any advanced germination base that is used for germination purposes. The camera (114) may include Charge-Coupled Device (CCD) or Complementary Metal-Oxide-Semiconductor (CMOS) sensors for capturing the images of the seedlings in the germination sheets (116). The light source (112) may be Light Emitting Diode (LED). The camera (114) may be connected to the evaluation unit (110) by wired connection and/or wireless connection.
[0035] In an example, the camera (114) may be an industrial machine vision camera with a lens mounted at approximately 60 cm from the germination sheet (116) having the seedlings. The camera (114) may have a resolution of 20 MP in RGB color space. The camera (114) is coupled with the light source (112) to provide uniform illumination across the germination sheet (116), thereby avoiding the influence of any external lighting effects while capturing the images of the seeds/seedlings. The camera (114) may also be scaled up to an array of machine vision cameras for high-throughput purposes, increasing the throughput from 50 to 100 seeds/seedlings per germination sheet. In an example, the distance between the camera (114) and the germination sheet (116) varies based on the type of camera used. For imaging a larger number of seeds/seedlings, the distance may be increased to provide a wider field of view. In another example, an array of wide-angle cameras may be used to image the seeds/seedlings.
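The relationship between camera distance and field of view mentioned above can be sketched with simple pinhole-camera geometry. This is only an illustration: the sheet width and lens angle below are hypothetical values, not figures from the specification.

```python
import math

def camera_distance_for_sheet(sheet_width_cm, horizontal_fov_deg):
    """Minimum camera-to-sheet distance (cm) so that the full sheet width
    fits inside the horizontal field of view, assuming a pinhole model."""
    half_fov = math.radians(horizontal_fov_deg) / 2.0
    return sheet_width_cm / (2.0 * math.tan(half_fov))

# Hypothetical numbers: a 45 cm wide germination sheet and a 40-degree lens.
d = camera_distance_for_sheet(45.0, 40.0)
```

As the sketch shows, a wider lens angle permits a shorter mounting distance, which is consistent with the passage's point that the distance is increased when a wider field of view is needed.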
[0036] The seed germination evaluation system (102) is in communication with the seed database (106) and the cloud server (108) by way of the communication network (104). The communication network (104) may be a wired network or a wireless network. Examples of the communication network (104) include, but are not limited to, Local Area Network (LAN), Ethernet, Long-Term Evolution (LTE), LTE-A, Wi-Fi, optical fiber, etc. The seed database (106) may store information about the seeds/seedlings, such as, but not limited to, types of seeds/seedlings, number of seeds/seedlings, number of batches/lots of seeds/seedlings, number of seeds/seedlings per batch/lot, germination percentage for each batch/lot, overall germination percentage for all the seeds/seedlings, etc.
[0037] In an embodiment, the calculated germination percentage is provided by the seed germination evaluation system (102) to the cloud server (108). That is, in this embodiment, the automated seedling evaluation is performed by the seed germination evaluation system (102).
[0038] In another embodiment, the seed germination evaluation system (102) provides the images of the seedling to the cloud server (108). The cloud server (108) processes the images to determine the germination percentage of the seeds/seedlings. That is, in this embodiment, the automated seedling evaluation is performed by the cloud server (108).
[0039] Referring now to FIG. 2, a schematic block diagram of the evaluation unit (110) is shown in accordance with an embodiment of the present invention. The evaluation unit (110) includes a processor (202), a memory (204), an Input/Output unit (206), an image processing unit (208), an image labelling unit (210), and a Convolution Neural Network (CNN) architecture (211). The CNN architecture (211) includes a feature extraction unit (212), a weight calculation unit (214), a probability calculation unit (216), and a seed germination unit (218).
[0040] The memory (204) is configured to store one or more computer readable instructions that, when executed by the processor (202), cause the seed germination evaluation system (102) to perform the automated seedling evaluation technique of the present invention. The processor (202) may be implemented by logic devices, including logic circuits such as a programmable gate array (PGA), a field programmable gate array (FPGA), or a programmable logic device (PLD). The I/O unit (206) is configured to interface with the camera (114) and the communication network (104). The units (208-218) are configured to perform the automated seedling evaluation technique of the present invention. In an example, the automated seedling evaluation technique is fully automated and requires no manual intervention.
[0041] Referring now to FIG. 3, a flowchart for a method of automated seedling evaluation is shown in accordance with an embodiment of the present invention.
[0042] At step 302, the seed germination evaluation system (102) obtains an image of the seedlings in the germination sheet (116) from the camera (114). Here, the obtained image may be in JPG, PNG, or RAW formats. In an example, the image is in RGB color space.
[0043] At step 304, the image processing unit (208) processes the received image. Here, the image processing unit (208) filters noise from the image to generate an enhanced image.
[0044] At step 306, the image labelling unit (210) detects presence of one or more seedlings in the enhanced image and determines regions of interests in the enhanced image. The image labeling unit (210) labels the regions of interest and stores the regions of interest and the corresponding labels in the memory (204).
[0045] In an example, labelling for the first image obtained by the camera (114) is initially done manually by annotating the seedlings with appropriate labels. Thereafter, the evaluation unit (110) is trained to perform image labelling. Optionally, manual intervention may be used to fine-tune the image labelling performed by the evaluation unit (110) by providing feedback.
[0046] At step 308, the feature extraction unit (212) extracts the features from the enhanced images using Convolution Neural Network (CNN) techniques. The feature extraction unit (212) also trains a machine learning model based on the features extracted on labeled regions of interest, and the corresponding classes of the seedlings. In an example, the feature extraction unit (212) stores the features, regions of interest, and the corresponding classes of the seedlings as a training dataset in the memory (204).
[0047] At step 310, the probability calculation unit (216) along with the seed classification unit (218) calculates errors in the identified features and classes and minimizes the identified errors.
[0048] At step 312, the weight calculation unit (214) calculates or updates one or more weights to minimize the errors for the identified features and classes.
[0049] At step 314, the probability calculation unit (216) along with the seed classification unit (218) determines whether the error is higher or lower than a minimum, i.e., a predetermined threshold error, and recurrently updates the weights until the error is minimal for the given set of training images.
[0050] If the probability calculation unit (216) along with the seed classification unit (218) determines that the error has reached the minimum, the weight calculation unit (214) executes step 316.
[0051] At step 316, the weight calculation unit (214) stores the calculated or updated weights in the memory (204) for use in next run.
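The error-minimization loop of steps 310-316 can be illustrated with a toy gradient-descent sketch. This is an assumption-laden stand-in: the patent trains a CNN, whereas the single-weight linear model, learning rate, and threshold below are purely illustrative.

```python
# Toy sketch of the train-until-minimum loop (steps 310-316): one weight is
# updated by gradient descent on a mean squared error until the error falls
# below a predetermined threshold, then the weight is "stored" for the next
# run. Model, data, and hyperparameters are illustrative, not from the patent.

def train(samples, lr=0.1, threshold=1e-4, max_iters=1000):
    w = 0.0  # initial weight
    error = float("inf")
    for _ in range(max_iters):
        # step 310: error of the linear model y = w * x over the training set
        error = sum((w * x - y) ** 2 for x, y in samples) / len(samples)
        if error < threshold:  # step 314: error has reached the minimum
            break
        grad = sum(2 * (w * x - y) * x for x, y in samples) / len(samples)
        w -= lr * grad         # step 312: update the weight to reduce the error
    return w, error

memory = {}
w, err = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])  # targets follow y = 2x
memory["weights"] = w          # step 316: store the learned weight
```

The loop converges to the weight that minimizes the error, mirroring how the weight calculation unit recurrently updates weights until the error drops below the threshold.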
[0052] In an exemplary embodiment, the features identified by the feature extraction unit (212) also facilitate estimation of vigor index and uniformity of the seed/seedlings. The feature extraction unit (212) identifies the features of the seedlings based on edge detection techniques and fits contours for all the seedlings based on the identified edge features.
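The edge detection that paragraph [0052] alludes to can be sketched with a Sobel operator, a common choice for this step; the patent does not name a specific edge detector, so the operator and the tiny synthetic "image" below are assumptions. A production system would use a vision library rather than these pure-Python loops.

```python
# Minimal Sobel edge-detection sketch: convolve 3x3 gradient kernels over a
# grayscale image (nested lists) and report the gradient magnitude. The 4x5
# synthetic image has a vertical dark/bright boundary, so the response is
# strong near the middle columns and zero in the uniform region.

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(1, h - 1):          # skip the one-pixel border
        for c in range(1, w - 1):
            gx = sum(SOBEL_X[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            gy = sum(SOBEL_Y[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            out[r][c] = (gx * gx + gy * gy) ** 0.5
    return out

image = [[0, 0, 255, 255, 255] for _ in range(4)]  # vertical edge
edges = sobel_magnitude(image)
```

Contours could then be fitted along the high-magnitude pixels, which is the step the specification describes next.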
[0053] The feature extraction unit (212) matches the labelled regions of interest stored in the memory by the image labelling unit (210) with the fitted contours and labels the contours, and also measures the length of each labelled contour.
[0054] Further, calibration information of the camera (114) can be used to determine the actual length of each seed/seedling. Based on the length of each labelled contour and the actual length of each seed/seedling corresponding to the labelled contour, and the germination percentage, the vigor index and uniformity of the seed/seedling can be estimated.
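The vigor-index estimate described above can be sketched as follows. A common formulation in the seed-testing literature (often called Vigor Index I) multiplies germination percentage by mean seedling length; the patent says only that the index combines the two quantities, so the exact formula here is an assumption, and the lengths and labels are illustrative.

```python
# Hedged sketch of the vigor-index estimate: germination percentage times
# mean seedling length. Formula choice and all values are assumptions.

def germination_percentage(classes):
    normal = sum(1 for c in classes if c == "normal")
    return 100.0 * normal / len(classes)

def vigor_index(seedling_lengths_cm, classes):
    mean_len = sum(seedling_lengths_cm) / len(seedling_lengths_cm)
    return germination_percentage(classes) * mean_len

lengths = [4.2, 3.8, 0.0, 4.5]                      # illustrative lengths, cm
labels = ["normal", "normal", "abnormal", "normal"]  # illustrative classes
vi = vigor_index(lengths, labels)
```

Uniformity could similarly be summarized from the spread of the contour lengths (for instance as a coefficient of variation), though the specification does not fix a formula for it either.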
[0055] Referring now to FIG. 4, a flowchart for a method of automated seedling evaluation is shown in accordance with an embodiment of the present invention.
[0056] At step 402, the seed germination evaluation system (102) obtains the image of the seedlings on the germination sheet (116) from the camera (114).
[0057] At step 404, the image processing unit (208) processes the image to filter noise from the image and generate the enhanced image.
[0058] At step 406, the image labelling unit (210) samples the image into N x N grid to generate a sampled image.
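The NxN grid sampling of step 406 can be sketched by splitting an image (here a nested list of pixel values) into N*N equal cells. The value of N and the 4x4 image are illustrative, and the sketch assumes the image dimensions are divisible by N.

```python
# Sketch of step 406: divide an H x W image into an N x N grid of cells,
# returned in row-major order. Assumes H and W are divisible by N.

def sample_grid(img, n):
    h, w = len(img), len(img[0])
    ch, cw = h // n, w // n
    cells = []
    for gr in range(n):
        for gc in range(n):
            cell = [row[gc * cw:(gc + 1) * cw]
                    for row in img[gr * ch:(gr + 1) * ch]]
            cells.append(cell)
    return cells

image = [[r * 4 + c for c in range(4)] for r in range(4)]  # 4x4 test pattern
cells = sample_grid(image, 2)  # four 2x2 cells
```

Each cell can then be passed independently to the feature extraction unit, as step 408 describes.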
[0059] At step 408, the image labelling unit (210) provides the sampled image to the feature extraction unit (212).
[0060] At steps 410 and 412, the probability calculation unit (216) along with the seed classification unit (218) determines bounding boxes around the seedlings depicted in the sampled image and calculates confidence scores for the bounding boxes.
[0061] At step 414, the probability calculation unit (216) along with the seed classification unit (218) selects the bounding box with highest confidence score.
[0062] At step 416, the seed classification unit (218) compares the overlapped bounding boxes.
[0063] At step 418, the seed classification unit (218) removes the bounding boxes having more than 50% intersection.
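Steps 414-418 describe a non-maximum suppression pass: keep the highest-confidence box, then drop overlapping boxes. The sketch below measures overlap as intersection-over-union, an assumption since the patent says only "more than 50% intersection"; the box coordinates and scores are illustrative.

```python
# Sketch of steps 414-418 (non-maximum suppression). Boxes are tuples of
# (x1, y1, x2, y2, confidence); overlap is intersection-over-union (IoU).

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def nms(boxes, thresh=0.5):
    boxes = sorted(boxes, key=lambda b: b[4], reverse=True)
    kept = []
    while boxes:
        best = boxes.pop(0)   # step 414: select the highest-confidence box
        kept.append(best)
        # step 418: remove boxes overlapping the selected box by more than 50%
        boxes = [b for b in boxes if iou(best, b) <= thresh]
    return kept

detections = [(0, 0, 10, 10, 0.9), (1, 1, 11, 11, 0.8), (20, 20, 30, 30, 0.7)]
kept = nms(detections)
```

Here the second box overlaps the first by well over 50% and is suppressed, while the disjoint third box survives, leaving one box per detected seedling.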
[0064] At step 420, the seed classification unit (218) determines if a predetermined number (K) of iterations are over.
[0065] If the seed classification unit (218) determines that the predetermined number of iterations are not over, the feature extraction unit (212) executes step 412.
[0066] If the seed classification unit (218) determines that the predetermined number of iterations is over, the flow proceeds to step 422.
[0067] At step 422, the seed classification unit (218) determines final confidence scores for the bounding boxes.
[0068] At step 424, the seed classification unit (218) determines the germination percentage. The seed classification unit (218) also classifies the seeds/seedlings into at least two classes, viz., normal seeds/seedlings and abnormal seeds/seedlings.
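Step 424 can be sketched as classifying each detection from its final confidence score and then taking the share of normal seedlings as the germination percentage. The 0.5 cut-off and the score values are assumed examples, not values stated in the patent.

```python
# Sketch of step 424: threshold the final confidence scores into
# normal/abnormal classes, then compute the germination percentage as the
# proportion of normal seedlings. The cut-off is an assumed example value.

def classify_and_score(confidences, cutoff=0.5):
    classes = ["normal" if c >= cutoff else "abnormal" for c in confidences]
    pct = 100.0 * classes.count("normal") / len(classes)
    return classes, pct

scores = [0.92, 0.81, 0.30, 0.77]        # illustrative final confidence scores
classes, pct = classify_and_score(scores)
```

The resulting percentage would then be stored in the memory (204) and the seed database (106), as step 426 describes.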
[0069] At step 426, the seed germination evaluation system (102) stores the germination percentage in the memory (204) and the seed database (106).
[0070] FIG. 5 illustrates a flow diagram for training and testing methods of automated seedling evaluation in accordance with an embodiment of the present invention.
[0071] In an embodiment, the seed germination evaluation system (102) operates in training mode or inference mode. The seed germination evaluation system (102) may be operated in the training mode once at the time of initialization and thereafter, the seed germination evaluation system (102) may operate in the inference mode.
[0072] In the training mode, multiple labeled images of seeds/seedlings are used for training a machine learning model. These images may include multiple seedling images of normal, abnormal, and other classes. One or more feature maps are extracted and are labelled based on these images. These feature maps and labels are then used to train the machine learning model. In an example, the machine learning model is trained to detect and distinguish between fully germinated seeds/seedlings and partially germinated seeds/seedlings. In an example, the machine learning model is trained to classify the seeds/seedlings into classes and sub-classes based on one or more predetermined key features.
[0073] The training mode of the seed germination evaluation system (102) is shown in 500A. In the training mode, the seed germination evaluation system (102) receives the image of the germination sheet (116) from the camera (114) using the I/O unit (206). At 502, the image processing unit (208) processes the image to enhance one or more features of the image and generate the enhanced image. At 504, the image labelling unit (210) extracts one or more regions of interest and labels the regions of interest to generate the labeled image. At 506, the probability calculation unit (216) along with seed classification unit (218) uses CNN to extract features for each class and trains the machine learning model based on the features extracted for each class. At 508, the weight calculation unit (214) determines the weights learned in the training process and stores the learned weights and the corresponding classes in the memory (204) for use in the inference mode.
[0074] In the inference mode, the image is divided into ‘N’ number of overlapping rectangular grids of varying sizes. Each rectangular grid is passed through the CNN for label prediction. The CNN obtains bounding boxes around the seedlings. Based on grid scores, each seedling is mapped to one or more labels. Each seedling has multiple grid predictions, which are then filtered using a non-maximum suppression technique to obtain the single best prediction. In an example, if there is an overlap of two or more different classes for a single seedling, specific scores are used to decide the class of the seedling. In an example, the machine learning model classifies the seedlings into normal, abnormal, and other classes and/or sub-classes. In an example, the germination percentage is calculated considering all replicates. In an example, multiple countings may be scheduled for the same lot of seeds/seedlings. In another example, only one counting is scheduled for each lot of seeds/seedlings. In an example, a user may set an acceptance parameter for passing of a lot of seeds/seedlings. In an example, the germination percentage information is seamlessly synchronized with the seed database (106).
[0075] In one approach to calculating the germination percentage, the machine learning model keeps the seedlings intact across multiple counting schedules. In another approach, the machine learning model removes normal seedlings during multiple counting schedules.
[0076] The inference mode of the seed germination evaluation system (102) is shown in 500B. In the inference mode, the seed germination evaluation system (102) receives the image of the germination sheet (116) from the camera (114) using the I/O unit (206). At 510, the image processing unit (208) enhances the image and samples the image into an NxN grid to generate the enhanced image. At 512, the feature extraction unit (212) uses the CNN with the weights stored in the memory (204) to determine the bounding boxes around the seedlings depicted in the enhanced image and to calculate the confidence score for each bounding box. At 514, the probability calculation unit (216) maps the classes for the seedlings in the bounding boxes based on comparison between the determined confidence scores and the weights stored in the memory (204). At 516, the seed classification unit (218) classifies the seedlings as germinated or non-germinated and thereafter calculates the germination percentage for the seeds/seedlings.
[0077] Advantageously, the seed germination evaluation system (102) operates independently without requiring any manual intervention. The seed germination evaluation system (102) provides consistent and reliable results with improved accuracy. The seed germination evaluation system (102) can be used continuously without interruption. In an example, the seed germination evaluation system (102) reduces the time required for seed evaluation and classification by half, when compared to manual methods of seed evaluation and classification.
[0078] The foregoing description of the invention has been presented merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the invention.

CLAIMS:

1. A seed germination evaluation system (102) comprising:
an image capturing unit (114) coupled to a light source (112), the image capturing unit (114) in combination with the light source (112) configured to capture images of a plurality of seed/seedlings placed on one or more germination sheets (116), wherein the one or more germination sheets are placed on a conveyor belt (118);
an evaluation unit (110) connected to the image capturing unit (114) and the light source (112), the evaluation unit comprising:
an image processing unit (208) configured to:
receive an image indicative of the plurality of seeds/seedlings,
identify one or more features in the received image, and
generate an enhanced image including the identified features;
an image labelling unit (210) configured to:
receive the enhanced image,
identify one or more seeds/seedlings based on the identified features,
determine one or more regions of interest in the enhanced image, and
label the regions of interest to generate a labelled image;
a feature extraction unit (212) configured to:
receive the labelled image,
determine one or more bounding boxes around the identified seeds/seedlings in the labelled image, and
calculate confidence scores for the bounding boxes; and
a seed classification unit (218) configured to determine a germination percentage based on the seeds/seedlings featured inside the bounding boxes and the corresponding confidence scores.
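By way of illustration only, the final step of claim 1 can be sketched in code. This is a hypothetical sketch, not the claimed implementation: the class labels ("normal", "abnormal", "ungerminated"), the 0.5 confidence threshold, and the function name are assumptions introduced for the example.

```python
# Hypothetical sketch: computing a germination percentage from detector
# output, assuming each detection is a (class_label, confidence) pair.
# Bounding-box coordinates are omitted for brevity.

def germination_percentage(detections, threshold=0.5):
    """Percentage of retained detections classified as germinated.

    detections: list of (label, confidence) tuples; detections whose
    confidence score falls below the threshold are discarded, as the
    claim ties the percentage to the confidence scores of the boxes.
    """
    kept = [label for label, conf in detections if conf >= threshold]
    if not kept:
        return 0.0
    germinated = sum(1 for label in kept if label == "normal")
    return 100.0 * germinated / len(kept)

sample = [("normal", 0.92), ("abnormal", 0.81),
          ("normal", 0.77), ("ungerminated", 0.40)]
pct = germination_percentage(sample)  # the 0.40 detection is discarded
```

In this sketch, 2 of the 3 retained detections are "normal", giving roughly 66.7%.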

2. The seed germination evaluation system (102) as claimed in claim 1, wherein the seed germination evaluation system (102) is in communication with a seed database (106), and wherein the seed database (106) stores features of different classes of seeds/seedlings.

3. The seed germination evaluation system (102) as claimed in claim 2, wherein the feature extraction unit (212), in a training phase, is configured to:
identify the features from the labelled image using Convolutional Neural Network (CNN) techniques,
train a machine learning model based on the identified features and the corresponding labels, and the features of the different classes of seeds/seedlings, and
store the features, the labels, and the corresponding classes of seeds/seedlings in a memory (204).

4. The seed germination evaluation system (102) as claimed in claim 3, wherein the seed germination evaluation system (102) includes a probability calculation unit (216), and wherein the probability calculation unit (216) along with the seed classification unit (218), in the training phase, are configured to:
calculate an error in the identified features and classes of seeds/seedlings,
minimize the error using the CNN techniques,
compare the error with a predetermined threshold error, and
provide the identified features, classes, bounding boxes, and the confidence scores for the bounding boxes when the error is less than the predetermined threshold error.
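The training-phase behaviour of claims 3 and 4 (minimize an error, compare it with a predetermined threshold, and release the outputs once the error is small enough) can be illustrated with a toy example. This is a sketch only: a one-parameter least-squares fit by gradient descent stands in for the CNN, and all names and constants below are assumptions, not part of the claims.

```python
# Illustrative stand-in for the claimed training loop: minimize an error
# iteratively and stop once it falls below a predetermined threshold.

def train_until_threshold(samples, threshold=1e-3, lr=0.1, max_steps=1000):
    """Fit y = w*x by gradient descent on mean squared error; stop when
    the error drops below the predetermined threshold (cf. claim 4)."""
    w = 0.0
    error = float("inf")
    for _ in range(max_steps):
        # error between predicted and true values (the "identified" output)
        error = sum((w * x - y) ** 2 for x, y in samples) / len(samples)
        if error < threshold:
            break  # error is below the predetermined threshold
        grad = sum(2 * (w * x - y) * x for x, y in samples) / len(samples)
        w -= lr * grad  # minimize the error
    return w, error

w, err = train_until_threshold([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
```

The loop returns the fitted parameter together with the final error, mirroring how the claimed units provide features, classes, boxes, and confidence scores only once the error is below the threshold.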

5. The seed germination evaluation system (102) as claimed in claim 3, wherein the features facilitate estimation of vigor index and uniformity of the seeds/seedlings.

6. The seed germination evaluation system (102) as claimed in claim 4, wherein the seed germination evaluation system (102) includes a weight calculation unit (214) configured to:
receive the identified features, classes, bounding boxes, and the confidence scores for the bounding boxes from the feature extraction unit (212),
calculate one or more weights for the identified features and classes, and
store the weights in the memory (204).

7. The seed germination evaluation system (102) as claimed in claim 1, wherein the image labelling unit (210) is configured to sample the enhanced image into an NxN grid.
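Sampling an image into an NxN grid, as in claim 7, is the usual first step of grid-based detectors, where each cell is then made responsible for objects whose centres fall inside it. A minimal sketch, assuming the image is modelled as a plain list of pixel rows (a real system would use an image array) and trimming any remainder pixels for simplicity:

```python
# Hypothetical sketch: split an H x W image (list of rows) into an
# n x n list of cells, each cell itself a list of pixel rows.

def to_grid(image, n):
    h, w = len(image), len(image[0])
    ch, cw = h // n, w // n  # cell height and width (remainder trimmed)
    return [[[row[cx * cw:(cx + 1) * cw]
              for row in image[cy * ch:(cy + 1) * ch]]
             for cx in range(n)]
            for cy in range(n)]

# 8 x 8 test image whose pixel at (x, y) has value x + 10*y
image = [[x + 10 * y for x in range(8)] for y in range(8)]
cells = to_grid(image, 4)  # 4 x 4 grid of 2 x 2 cells
```

Each of the 16 cells here is a 2x2 patch; for example, the top-left cell contains the four pixels at coordinates (0,0) through (1,1).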

8. The seed germination evaluation system (102) as claimed in claim 1, wherein the image processing unit (208) is configured to filter noise from the image.
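The noise-filtering step of claim 8 is unspecified in the claims; one common choice for removing salt-and-pepper noise before feature extraction is a 3x3 median filter. The sketch below is an assumed illustration of that technique, not the claimed filter, and leaves border pixels unchanged for simplicity.

```python
# Hypothetical sketch: 3x3 median filter over a grayscale image
# modelled as a list of pixel rows; border pixels are copied as-is.

def median_filter_3x3(image):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(image[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of the nine neighbours
    return out

noisy = [[10, 10, 10], [10, 255, 10], [10, 10, 10]]  # one noisy pixel
clean = median_filter_3x3(noisy)  # the 255 outlier is replaced by 10
```

A median filter suits this step because it removes isolated outlier pixels without blurring seedling edges the way an averaging filter would.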

9. A seed germination evaluation method, comprising:
capturing, by an image capturing unit (114) in combination with a light source (112), images of a plurality of seeds/seedlings placed on one or more germination sheets (116), wherein the one or more germination sheets are placed on a conveyor belt (118);
identifying, by an image processing unit (208), one or more features in a received image indicative of a plurality of seeds/seedlings;
generating, by the image processing unit (208), an enhanced image including the identified features;
identifying, by an image labelling unit (210), one or more seeds/seedlings in the enhanced image based on the identified features;
determining, by the image labelling unit (210), one or more regions of interest in the enhanced image;
labelling, by the image labelling unit (210), the regions of interest to generate a labelled image;
determining, by a feature extraction unit (212), one or more bounding boxes around the identified seeds/seedlings in the labelled image;
calculating, by the feature extraction unit (212), confidence scores for the bounding boxes; and
determining, by a seed classification unit (218), a germination percentage based on the seeds/seedlings featured inside the bounding boxes and the corresponding confidence scores.

10. The seed germination evaluation method as claimed in claim 9, comprising storing, in a seed database (106), features of different classes of seeds/seedlings.

11. The seed germination evaluation method as claimed in claim 10, in the training phase, comprising:
identifying, by the feature extraction unit (212), the features from the labelled image using Convolutional Neural Network (CNN) techniques;
training, by the feature extraction unit (212), a machine learning model based on the identified features and the corresponding labels, and the features of the different classes of seeds/seedlings; and
storing, by the feature extraction unit (212), the features, the labels, and the corresponding classes of seeds in a memory (204).

12. The seed germination evaluation method as claimed in claim 11, in the training phase, comprising:
calculating, by a probability calculation unit (216) along with the seed classification unit (218), an error in the identified features and classes of seeds;
minimizing, by the probability calculation unit (216) along with the seed classification unit (218), the error using the CNN techniques; and
comparing, by the probability calculation unit (216) along with the seed classification unit (218), the error with a predetermined threshold error.

13. The seed germination evaluation method as claimed in claim 11, wherein the features also facilitate estimation of vigor index and uniformity of the seeds/seedlings.

14. The seed germination evaluation method as claimed in claim 12, comprising:
receiving, by a weight calculation unit (214), the identified features, classes, bounding boxes, and the confidence scores for the bounding boxes from the feature extraction unit (212);
calculating, by the weight calculation unit (214), one or more weights for the identified features and classes; and
storing, by the weight calculation unit (214), the weights in the memory (204).

15. The seed germination evaluation method as claimed in claim 9, comprising sampling, by the image labelling unit (210), the enhanced image into an NxN grid.

16. The seed germination evaluation method as claimed in claim 9, comprising filtering noise from the image by the image processing unit (208).

Documents

Application Documents

# Name Date
1 202321033309-PROVISIONAL SPECIFICATION [11-05-2023(online)].pdf 2023-05-11
2 202321033309-POWER OF AUTHORITY [11-05-2023(online)].pdf 2023-05-11
3 202321033309-FORM 1 [11-05-2023(online)].pdf 2023-05-11
4 202321033309-DRAWINGS [11-05-2023(online)].pdf 2023-05-11
5 202321033309-FORM-26 [10-08-2023(online)].pdf 2023-08-10
6 202321033309-Proof of Right [10-11-2023(online)].pdf 2023-11-10
7 202321033309-FORM 3 [10-05-2024(online)].pdf 2024-05-10
8 202321033309-ENDORSEMENT BY INVENTORS [10-05-2024(online)].pdf 2024-05-10
9 202321033309-DRAWING [10-05-2024(online)].pdf 2024-05-10
10 202321033309-CORRESPONDENCE-OTHERS [10-05-2024(online)].pdf 2024-05-10
11 202321033309-COMPLETE SPECIFICATION [10-05-2024(online)].pdf 2024-05-10
12 202321033309-REQUEST FOR CERTIFIED COPY [20-05-2024(online)].pdf 2024-05-20
13 202321033309-CORRESPONDENCE(IPO)-(CERTIFIED LETTER)-28-05-2024.pdf 2024-05-28
14 202321033309-Form 1 (Submitted on date of filing) [13-06-2024(online)].pdf 2024-06-13
15 202321033309-Covering Letter [13-06-2024(online)].pdf 2024-06-13
16 202321033309-CERTIFIED COPIES TRANSMISSION TO IB [13-06-2024(online)].pdf 2024-06-13
17 202321033309-FORM 3 [18-06-2024(online)].pdf 2024-06-18
18 Abstract.1.jpg 2024-06-22
19 202321033309-FORM 18 [10-12-2024(online)].pdf 2024-12-10