
A System For Classifying Coconuts Based On Maturity And Estimating Flesh And Water Content

Abstract: The present invention describes a system for classifying coconuts based on maturity and estimating flesh and water content. The system integrates an image capture device, a deep learning module, an RF analysis module, and a processor, all underpinned by machine learning algorithms for ongoing improvement. It captures coconut images, extracts critical visual features, and simultaneously measures dielectric properties through RF analysis. These data sources are merged to provide precise classifications and content estimates. The present invention enhances accuracy over time through iterative learning, making it suitable for applications such as quality control, product development, and coconut research.


Patent Information

Application #
Filing Date
02 September 2023
Publication Number
14/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

SEVENTH HORSE AGRI TECH PRIVATE LIMITED
MATIC, 3/458, K N Pudur Post, Kadayampatti, Salem – 636354 (Tamil Nadu, India)

Inventors

1. Dr. Shanmugakumar Murugesan
MATIC, 3/458, K N Pudur Post, Kadayampatti, Salem – 636354 (Tamil Nadu, India)
2. B. A. Naveen Kumar
MATIC, 3/458, K N Pudur Post, Kadayampatti, Salem – 636354 (Tamil Nadu, India)
3. Manohar Reddy T M
MATIC, 3/458, K N Pudur Post, Kadayampatti, Salem – 636354 (Tamil Nadu, India)
4. D. Naga Siva Sai Ravikanth
MATIC, 3/458, K N Pudur Post, Kadayampatti, Salem – 636354 (Tamil Nadu, India)
5. A. V. Sai Kumar Reddy
MATIC, 3/458, K N Pudur Post, Kadayampatti, Salem – 636354 (Tamil Nadu, India)

Specification

DESCRIPTION OF INVENTION
FIELD OF INVENTION
The present invention generally relates to identification of the maturity stage of coconuts and estimation of their flesh and water content;
Particularly, the present invention relates to a system and a method for classifying coconuts by fusing visual perception analysis, using deep learning in computer vision, with RF analysis (dielectric characterization) in a classification model for estimating the maturity, flesh content, and water content of a coconut.
BACKGROUND OF THE INVENTION
Coconuts are a crucial agricultural commodity with a protracted maturation period, making the timing of their harvest a vital factor in determining their quality and yield. The maturity stage of a coconut is influenced by several complex factors, including the sweetness of the coconut water, the quantity of coconut meat, the development of the Testa (the inner skin of the coconut kernel), and the physical characteristics of the husk and shell, including hardness.
The value-added products derived from coconuts encompass a wide array of items, totaling more than sixty distinct by-products. The production of these coconut value-added products necessitates coconuts at precise maturity stages. However, discerning the exact stage of coconut maturity poses a formidable challenge for human operators, contributing to significant wastages, estimated to be around 30% in coconut processing industries. This not only impacts production efficiency but also disrupts the entire coconut supply chain.
Conventionally, coconut maturity assessment relies on manual inspection conducted by skilled and trained workers. However, this method is inherently time-consuming, labor-intensive, and prone to human error. Harvesters have traditionally judged maturity from visual cues, tapping sounds, harvest timeframe, and other growth characteristics observed as the coconuts mature.
Presently, the coconut industry is grappling with the need for more reliable and efficient methods of maturity assessment. While image-processing techniques have been explored, they face substantial challenges in achieving accurate and consistent identification of coconut maturity stages.
The present invention describes a system for classifying coconuts based on maturity and estimating flesh and water content.
OBJECT OF THE INVENTION
The primary objective of the present invention is to develop a method that can accurately and consistently identify the maturity stage of a coconut;
A further objective of the present invention is to improve the efficiency and reliability of the coconut maturity stage identification process, and to provide useful information for optimizing the harvest and improving coconut yield.
SUMMARY OF THE INVENTION
Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventor in conventional practices and the existing state of the art.
The present disclosure seeks to provide a system for classifying coconuts based on maturity and estimating flesh and water content.
In accordance with an aspect of the present invention, the present system combines visual perception analysis through deep learning models with Radio Frequency (RF) analysis, revolutionizing the coconut classification process. An image capture device captures coconut images, which are then processed to extract critical visual features such as shape, size, color, and texture. Simultaneously, the RF analysis module measures dielectric properties, including permittivity, loss tangent, and reflection coefficient, providing insights into coconut maturity.
According to a further aspect of the present invention, a central processor integrates the results from both analyses to yield precise classifications based on maturity and accurate estimations of flesh and water content. The system is equipped with machine learning algorithms that continuously enhance accuracy through iterative learning, ensuring reliability over time.
The potential applications of this invention span a wide range, including quality control, the development of value-added coconut products, reduced wastage, diversified product offerings, and advancements in coconut research. Ultimately, this innovation promises to revolutionize the coconut industry by enhancing efficiency, sustainability, and product quality, meeting market demands for precisely classified coconuts.
The objects and the advantages of the invention are achieved by the process elaborated in the present disclosure.
DETAILED DESCRIPTION OF THE INVENTION
The following detailed description illustrates embodiments of the present disclosure and ways in which the disclosed embodiments can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
The present disclosure provides a system for classifying coconuts based on maturity and estimating flesh and water content in a non-destructive manner.
The present invention discusses a system and a method for classifying coconuts using a combination of visual perception analysis with a machine vision technique using a deep learning model, and RF analysis (dielectric characterization). The method involves testing coconuts using these techniques to collect detailed information about the fruit, such as its variety and dielectric properties, and using this information to make a cumulative decision on the classification of the coconut. The method achieves high accuracy, exceeding human-level performance in coconut classification. The method also includes the use of edge devices to perform predictive analytics on the collected data, and the use of machine learning algorithms to improve model accuracy through a feedback [MLOps (Machine Learning Operations) or a data-centric approach] mechanism.
The present invention relates to a system for classifying coconuts based on their maturity and estimating their flesh and water content. The system comprises a device for capturing images of the coconuts, a deep learning module for performing visual perception analysis on the captured images, and an RF analysis module (dielectric characterization) for measuring the dielectric properties of the coconut.
The system also includes a processor for combining the results of the visual perception analysis, and RF analysis to classify the coconuts based on their maturity and estimate the flesh and water content. The system is characterized by edge devices performing predictive analytics on the collected data, and machine learning algorithms for improving model accuracy through a feedback [MLOps (Machine Learning Operations) or a data-centric approach] mechanism.
In operation, the device captures images of the coconuts and sends them to the deep learning module for analysis. The deep learning module uses a trained model to identify key features of the coconuts, such as their shape, size, color, and texture, and to classify them based on their maturity and variety.
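The specification does not fix a particular feature set or network architecture, so as a minimal, hypothetical sketch, the visual side of the pipeline can be pictured as a function that maps an RGB image to a small feature vector summarizing size, color, and texture (a trained deep learning model would learn such features rather than hand-compute them):

```python
import numpy as np

def extract_visual_features(image: np.ndarray) -> np.ndarray:
    """Illustrative stand-in for the deep learning module's feature
    extraction: summarizes size, color, and texture of an RGB image
    (H x W x 3, values 0..255) as a 5-element feature vector."""
    h, w, _ = image.shape
    size = float(h * w)                            # crude proxy for coconut size
    mean_color = image.reshape(-1, 3).mean(axis=0)  # average R, G, B
    gray = image.mean(axis=2)
    texture = float(gray.std())                    # crude texture measure
    return np.concatenate([[size], mean_color, [texture]])

# Example: a synthetic 64x64 "image" of a uniform brownish husk
img = (np.ones((64, 64, 1)) * np.array([120.0, 80.0, 40.0])).astype(np.uint8)
features = extract_visual_features(img)
print(features.shape)  # (5,)
```

In practice the trained model would replace these hand-crafted statistics, but the downstream interface is the same: one feature vector per coconut image.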
The RF (radio frequency) analysis module measures the dielectric properties of the coconut, such as its permittivity, loss tangent, and reflection coefficient to provide additional information on its maturity and flesh and water content. Permittivity is a measure of a material's ability to store electrical energy in an electric field. It can vary with the composition and moisture content of the material. The loss tangent is a dimensionless number that describes the amount of energy lost as heat when electromagnetic waves pass through a material. It is often related to the material's moisture content and can provide insights into the internal structure of the coconut. The reflection coefficient measures how much of the incident RF signal is reflected when it encounters the surface of the coconut. Changes in the reflection coefficient can be indicative of variations in the coconut's dielectric properties.
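The dielectric quantities above have standard closed forms. As an illustration (the permittivity values here are hypothetical, and the reflection coefficient is given for the simplest case of a plane wave at normal incidence from air onto a low-loss dielectric):

```python
import math

def loss_tangent(eps_real: float, eps_imag: float) -> float:
    """tan(delta) = eps'' / eps': fraction of stored field energy
    dissipated as heat per cycle."""
    return eps_imag / eps_real

def reflection_coefficient(eps_r: float) -> float:
    """|Gamma| for normal incidence from air onto a low-loss dielectric
    with relative permittivity eps_r:
    |Gamma| = |(1 - sqrt(eps_r)) / (1 + sqrt(eps_r))|."""
    n = math.sqrt(eps_r)
    return abs((1 - n) / (1 + n))

# Water-rich tissue has high permittivity, so a tender (water-heavy)
# coconut reflects more of the incident RF signal than a drier one.
print(round(reflection_coefficient(40.0), 3))
print(round(reflection_coefficient(5.0), 3))
```

This is why changes in the measured reflection coefficient track changes in internal water content as the coconut matures.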
By analyzing these dielectric properties using RF analysis, the present system gathers information about the coconut’s internal characteristics, which is used for classifying coconuts based on their maturity and estimating their flesh and water content. The processor then combines the results of the visual perception analysis, and RF analysis to classify the coconuts based on their maturity and estimate their flesh and water content and contributes to accurate coconut classification without damaging the fruit.
The machine learning model in the system can be trained and improved over time through a feedback [MLOps (Machine Learning Operations) or a data-centric approach] mechanism, ensuring that the system becomes more accurate and reliable with use.
In one embodiment, the system further comprises a memory for storing the image processing model and the classification model. The system may also include a user interface for allowing a user to view the classification result and adjust the image processing and classification models.
In another embodiment, the system may include a regression model for estimating the water and flesh content of the coconut based on the extracted features. The regression model may be adjusted based on previously obtained and labeled coconut data to improve the accuracy of the estimation.
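The patent does not specify the form of the regression model, so the sketch below uses ordinary least squares as a minimal stand-in; the feature names and all numeric values are invented for illustration only:

```python
import numpy as np

# Hypothetical labeled data: each row is a fused feature vector
# [mean permittivity, loss tangent, husk color index], and y is the
# measured water content in millilitres (illustrative values only).
X = np.array([
    [45.0, 0.30, 0.62],
    [38.0, 0.24, 0.55],
    [30.0, 0.18, 0.47],
    [22.0, 0.12, 0.40],
])
y = np.array([310.0, 250.0, 180.0, 110.0])

# Ordinary least squares with an intercept column.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def estimate_water_ml(features: np.ndarray) -> float:
    """Predict water content (ml) from a fused feature vector."""
    return float(np.append(features, 1.0) @ coef)

print(round(estimate_water_ml(np.array([34.0, 0.21, 0.51])), 1))  # ~215 for this synthetic fit
```

Re-fitting `coef` as new labeled coconuts arrive is one concrete way the "adjusted based on previously obtained and labeled coconut data" step could be realized.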
In operation, the imaging device captures an image of a coconut and the processor executes the deep learning model to extract visual features from the image. Next, the RF module measures the dielectric properties of the coconut. The extracted features are fused and then input into the classification model, which classifies the maturity stage of the coconut. The classification result is then output to the user via the user interface. The system may also estimate the water and flesh content of the coconut using the extracted features and the regression model.
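One simple way to realize the fuse-then-classify step is late fusion by concatenation followed by a nearest-centroid decision; this is a hypothetical stand-in for the trained classification model, with invented class names and prototype values:

```python
import numpy as np

# Hypothetical per-class prototypes of fused features: a 2-element visual
# summary (color, texture) concatenated with a 3-element RF summary
# (permittivity, loss tangent, |reflection coefficient|).
PROTOTYPES = {
    "tender": np.array([0.35, 0.20, 45.0, 0.30, 0.73]),
    "mature": np.array([0.55, 0.40, 30.0, 0.18, 0.63]),
    "dry":    np.array([0.70, 0.55, 18.0, 0.10, 0.52]),
}

def fuse(visual: np.ndarray, rf: np.ndarray) -> np.ndarray:
    """Late fusion: concatenate the visual and RF feature vectors."""
    return np.concatenate([visual, rf])

def classify(fused: np.ndarray) -> str:
    """Nearest-centroid decision over standardized features, standing in
    for the patent's trained classification model."""
    protos = np.stack(list(PROTOTYPES.values()))
    mean, std = protos.mean(axis=0), protos.std(axis=0)
    z = (fused - mean) / std
    dists = {k: np.linalg.norm(z - (v - mean) / std)
             for k, v in PROTOTYPES.items()}
    return min(dists, key=dists.get)

label = classify(fuse(np.array([0.36, 0.22]), np.array([44.0, 0.29, 0.72])))
print(label)  # -> "tender" (closest prototype)
```

Standardizing before the distance computation keeps the large-magnitude permittivity value from dominating the small color and texture values, which is the essential point of combining heterogeneous visual and RF measurements.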
The present invention relates to a method for classifying coconuts based on their maturity and estimating their flesh and water content in a non-destructive manner. The method comprises the following steps:
- capturing images of the coconuts using a device, such as a camera or smartphone;
- processing the images using image processing algorithms to extract visual features from the images, such as the shape, size, color, and texture of the coconuts;
- applying deep learning algorithms to the processed images to perform visual perception analysis of the coconuts;
- using RF analysis (dielectric characterization) to collect detailed information about the dielectric property of the coconut;
- combining the results of the visual perception analysis, and RF analysis to classify the coconuts based on their maturity and estimate their flesh and water content.
In the present invention, the classification and regression models are adjusted based on previously obtained and labeled data to improve the accuracy of the classification and estimation.
Further, the step of analyzing the collected data includes utilizing deep learning techniques and RF Analysis (Dielectric Characterization) to classify the coconut and estimate its flesh and water content.
The method can be performed using a variety of devices, and can be improved and refined over time through a feedback (MLOps or data-centric approach) mechanism. The resulting system provides a reliable and accurate method for classifying coconuts based on their maturity and estimating their flesh and water content. This can be useful for various applications such as improving the quality of coconut value-added products, increasing the quantity of coconut value-added products with minimal wastage, increasing the variety of coconut value-added products, quality control in the coconut export industry, and research on coconut growth and development.

CLAIMS:
We claim:
1. A system for classifying coconuts based on maturity and estimating flesh and water content, the said system comprising:
- a device for capturing images of the coconuts;
- a deep learning module for performing visual perception analysis on the captured images;
- an RF (radio frequency) analysis module (dielectric characterization) for measuring the dielectric properties of the coconut;
- a processor for integrating the results of the visual perception analysis, and RF analysis to classify the coconuts based on their maturity and estimate the flesh and water content;
characterized by edge devices performing predictive analytics on the collected data, and machine learning model for improving model accuracy through a feedback [MLOps (Machine Learning Operations) or a data-centric approach] mechanism.
2. The system as claimed in Claim 1, wherein the system further consists of a memory for storing image processing models and classification models and a user interface for displaying classification results.
3. The system as claimed in Claim 1, wherein the RF analysis module measures permittivity, loss tangent, and reflection coefficient of the coconuts.
4. A method for classifying coconuts based on maturity and estimating flesh and water content, comprising the steps of:
- capturing images of the coconuts using a device;
- processing the image using an image processing model to extract visual features from the image;
- applying deep learning algorithms to the captured images to perform visual perception analysis of the coconuts;
- feeding the extracted visual features into a regression model to estimate the water and flesh content of the coconut;
- using RF analysis (dielectric characterization) to collect detailed information about the dielectric properties of the coconut;
- combining the results of the visual perception analysis, and RF analysis to classify the coconuts based on their maturity and estimate the flesh and water content.
5. The method as claimed in Claim 4, wherein the classification and regression models are adjusted based on previously obtained and labeled coconut images to improve the accuracy of the classification and estimation.
6. The method as claimed in Claim 4, wherein the step of analyzing the collected data includes utilizing deep learning techniques to classify the coconut and estimate its flesh and water content.

Documents

Application Documents

# Name Date
1 202341058952-POWER OF AUTHORITY [02-09-2023(online)].pdf 2023-09-02
2 202341058952-FORM FOR STARTUP [02-09-2023(online)].pdf 2023-09-02
3 202341058952-FORM FOR SMALL ENTITY(FORM-28) [02-09-2023(online)].pdf 2023-09-02
4 202341058952-FORM 1 [02-09-2023(online)].pdf 2023-09-02
5 202341058952-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [02-09-2023(online)].pdf 2023-09-02
6 202341058952-EVIDENCE FOR REGISTRATION UNDER SSI [02-09-2023(online)].pdf 2023-09-02
7 202341058952-COMPLETE SPECIFICATION [02-09-2023(online)].pdf 2023-09-02
8 202341058952-FORM-9 [01-04-2024(online)].pdf 2024-04-01
9 202341058952-FORM 18 [01-04-2024(online)].pdf 2024-04-01
10 202341058952-FER.pdf 2025-06-17

Search Strategy

1 202341058952E_06-01-2025.pdf