
Integrated Smart Plant Care System

Abstract: An integrated smart plant care system (100) is disclosed. The system (100) comprises environmental sensors (102) to measure plant parameters in real-time and an imaging unit (104) configured to capture an image of a plant for visual analysis. A microcontroller (108) is configured to: receive the plant parameters from the environmental sensors (102); analyze the plant parameters corresponding to standard plant parameters pre-stored in a memory (106); actuate the imaging unit (104) to capture the image of the plant upon detecting a deviation in the analyzed plant parameters from any of the standard plant parameters; integrate the plant parameters and the captured image to assess a plant health; command a machine learning model (110) to analyze the integrated data to classify the plant health in classes; and generate recommendations for plant care based on the classifications. The system (100) features comprehensive multi-sensor monitoring. Claims: 10, Figures: 2


Patent Information

Application #
Filing Date
06 March 2025
Publication Number
12/2025
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Email
Parent Application

Applicants

SR University
SR University, Ananthasagar, Warangal, Telangana, India 506371; patent@sru.edu.in; 08702818333

Inventors

1. Mr. G. Ashok
SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.
2. D. Varsha
SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.
3. P. Keerthi
SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.
4. Ch. Keerthi
SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.
5. N. Chandana
SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.

Specification

BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a plant care system and particularly to an integrated smart plant care system.
Description of Related Art
[002] Effective plant care requires continuous monitoring of environmental factors such as soil moisture, temperature, humidity, and light exposure. Traditional methods rely on manual inspection and routine watering, which prove to be inconsistent and inefficient. Over time, sensor-based solutions have emerged to assist plant care, including soil moisture detectors, light sensors, and humidity monitors. While these technologies help maintain specific conditions, they typically operate in isolation, lacking a comprehensive approach to assessing overall plant health. More advanced systems, such as automated greenhouse solutions, integrate multiple sensors and, in some cases, imaging technologies, but these are often expensive and designed for commercial agriculture rather than home use.
[003] Despite these advancements, existing solutions have limitations in providing holistic and adaptive plant care. Most systems focus on isolated parameters without integrating environmental and visual assessments to detect early signs of stress, disease, or nutrient deficiencies. Furthermore, many solutions rely on static recommendations rather than real-time, data-driven insights that adapt to changing conditions. As a result, there is a need for a more accessible and intelligent plant care system that combines environmental monitoring, visual analysis, and adaptive recommendations to improve plant health management.
[004] There is thus a need for an improved and advanced integrated smart plant care system that addresses the aforementioned limitations in a more efficient manner.
SUMMARY
[005] Embodiments in accordance with the present invention provide an integrated smart plant care system. The system comprising environmental sensors configured to measure plant parameters in real-time. The plant parameters are selected from an oxygen level, a humidity, an atmospheric pressure, a light intensity, or a combination thereof. The system further comprising an imaging unit configured to capture an image of a plant for visual analysis. The visual analysis enables identification of a sign of stress, a discoloration, a disease infestation, a pest infestation, a nutrient deficiency, or a combination thereof. The system further comprising a microcontroller communicatively connected to the environmental sensors and to the imaging unit. The microcontroller is configured to receive the plant parameters from the environmental sensors; analyze the plant parameters corresponding to standard plant parameters pre-stored in a memory; actuate the imaging unit to capture the image of the plant upon detecting a deviation in the analyzed plant parameters from any of the standard plant parameters; integrate the plant parameters and the captured image to assess a plant health; command a machine learning model to analyze the integrated data to classify the plant health in classes; and generate recommendations for plant care based on the classifications.
[006] Embodiments in accordance with the present invention further provide a method for fostering plants using an integrated smart plant care system. The method comprising steps of receiving plant parameters from environmental sensors; analyzing the plant parameters corresponding to standard plant parameters pre-stored in a memory; actuating an imaging unit to capture an image of a plant upon detecting a deviation in the analyzed plant parameters from any of the standard plant parameters; integrating the plant parameters and the captured image to assess a plant health; commanding a machine learning model to analyze the integrated data to classify the plant health in classes; and generating recommendations for plant care based on the classification.
[007] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide an integrated smart plant care system.
[008] Next, embodiments of the present application may provide a plant care system that features comprehensive multi-sensor monitoring
[009] Next, embodiments of the present application may provide a plant care system that generates AI-driven adaptive recommendations.
[0010] Next, embodiments of the present application may provide a plant care system that is eligible for automation and smart integration.
[0011] These and other advantages will be apparent from the present application of the embodiments described herein.
[0012] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0014] FIG. 1 illustrates a block diagram of an integrated smart plant care system, according to an embodiment of the present invention; and
[0015] FIG. 2 depicts a flowchart of a method for fostering plants using an integrated smart plant care system, according to an embodiment of the present invention.
[0016] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0017] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0018] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0019] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0020] FIG. 1 illustrates a block diagram of an integrated smart plant care system 100 (hereinafter referred to as the system 100), according to an embodiment of the present invention. The system 100 may be adapted to monitor the health of a plant. The system 100 may further be adapted to generate and transmit recommendations for fostering the plant and maintaining its good health. The plant may be, but not limited to, a flowering plant, a crop, a shrub, a herb, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the plant, including known, related art, and/or later developed technologies. The system 100 may be installed at locations such as, but not limited to, a greenhouse, a farmhouse, a testing lab, and so forth. Embodiments of the present invention are intended to include or otherwise cover any location, including known, related art, and/or later developed technologies, for installation of the system 100.
[0021] The system 100 may comprise environmental sensors 102, an imaging unit 104, a memory 106, a microcontroller 108, a machine learning model 110, external accessories 112, and a computing device 114.
[0022] In an embodiment of the present invention, the environmental sensors 102 may be installed in a vicinity of the plant. The environmental sensors 102 may be adapted to measure plant parameters in real-time. The plant parameters may be, but not limited to, an oxygen level, a humidity, an atmospheric pressure, a light intensity, and so forth. The environmental sensors 102 may comprise sensors such as, but not limited to, an oxygen sensor, a temperature sensor, a moisture sensor, a pressure sensor, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the sensors, including known, related art, and/or later developed technologies.
[0023] In an embodiment of the present invention, the imaging unit 104 may be installed in a visual proximity of the plant. The imaging unit 104 may be adapted to capture an image of the plant. The image of the plant captured by the imaging unit 104 may allow a visual analysis of the plant. The visual analysis may enable identification of factors such as, but not limited to, a sign of stress, a discoloration, a disease infestation, a pest infestation, a nutrient deficiency, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the factors, including known, related art, and/or later developed technologies. The imaging unit 104 may be, but not limited to, an X-ray imager, an infrared imager, and so forth. In a preferred embodiment of the present invention, the imaging unit 104 may be a camera. Embodiments of the present invention are intended to include or otherwise cover any type of the imaging unit 104, including known, related art, and/or later developed technologies.
[0024] In an embodiment of the present invention, the memory 106 may be adapted to store standard plant parameters. Further, the memory 106 may be adapted to store the plant parameters measured by the environmental sensors 102. The memory 106 may further be adapted to store the image of the plant captured by the imaging unit 104. The memory 106 may be, but not limited to, a hard drive, a flash drive, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the memory 106, including known, related art, and/or later developed technologies.
[0025] In an embodiment of the present invention, the microcontroller 108 may be connected to the environmental sensors 102 and to the imaging unit 104. The microcontroller 108 may further be connected to the memory 106. The microcontroller 108 may be configured to receive the plant parameters from the environmental sensors 102. The microcontroller 108 may be configured to analyze the plant parameters corresponding to the standard plant parameters pre-stored in the memory 106.
[0026] Upon analysis, if the analyzed plant parameters deviate from any of the standard plant parameters, then the microcontroller 108 may be configured to actuate the imaging unit 104 to capture the image of the plant. The microcontroller 108 may be configured to integrate the plant parameters and the captured image to assess the plant health.
[0027] The microcontroller 108 may be configured to command the machine learning model 110 to analyze the integrated data to classify the plant health in classes. The machine learning model 110 may be, but not limited to, a deep learning model, a neural network model, and so forth. The classes may be, but not limited to, a best class, a good class, a poor class, a dead class, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the classes, including known, related art, and/or later developed technologies. The microcontroller 108 may further be configured to generate recommendations for plant care based on the classifications. The recommendations may be, but not limited to, administration of pesticides, addition of fertilizers, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the recommendations, including known, related art, and/or later developed technologies.
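The classification-to-recommendation mapping described above may be sketched as follows. This is an illustrative sketch only; the class labels follow the examples given in this paragraph, while the recommendation table and function names are assumptions for illustration, not a definitive implementation:

```python
# Illustrative sketch: map a plant-health class produced by the machine
# learning model (110) to care recommendations. The recommendation table
# below is an assumption for illustration only.

RECOMMENDATIONS = {
    "best": [],
    "good": ["maintain current watering and light schedule"],
    "poor": ["water the plant", "provide sunlight", "consider adding fertilizer"],
    "dead": ["remove the plant", "inspect neighbouring plants for disease"],
}

def recommend(plant_health_class: str) -> list:
    """Return care recommendations for a classified plant-health class."""
    try:
        return RECOMMENDATIONS[plant_health_class]
    except KeyError:
        raise ValueError("unknown plant-health class: %r" % plant_health_class)
```

In practice the table would be replaced by per-species care rules, but the lookup structure illustrates how a discrete class drives the generated recommendations.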
[0028] In an exemplary scenario, a rose plant may be fostered in an exemplary nursery. The environmental sensors may measure the plant parameters of the exemplary rose plant with the oxygen level as 50% and the humidity as 10%. Moreover, the standard plant parameters for the exemplary rose plant may be 100% oxygen level and 50% humidity. As the plant parameters may be less than the standard plant parameters, the microcontroller 108 may activate the imaging unit to capture the image of the exemplary rose plant. The captured image of the exemplary rose plant may be utilized for the visual analysis, and the exemplary rose plant may be identified with the nutrient deficiency. Further, the measured plant parameters and the captured image of the exemplary rose plant may be integrated to assess the health of the exemplary rose plant. The integrated data may further be analyzed by the machine learning model 110 to classify the health of the rose plant as the poor class. Further, the microcontroller 108 may generate an exemplary recommendation of watering and providing sunlight to the exemplary rose plant, for the plant care.
[0029] Further, the microcontroller 108 may be configured to dynamically adjust calibration of the environmental sensors 102 based on environmental trends to enhance detection accuracy based on the visual analysis of the captured images by the imaging unit 104. For example, if the humidity sensor initially detects 45% humidity as normal but multiple captured images show wilting in the rose plant at this level, the microcontroller 108 may recalibrate the humidity sensor to recognize 45% as a critical low threshold. Consequently, future readings of 45% humidity or lower will trigger a preventive action, such as increasing humidity through an automated misting system or providing early watering recommendations.
[0030] Additionally, if a series of captured images indicate that the rose plant’s leaves appear wilted under a measured light intensity of 70 lux, the microcontroller 108 may recalibrate the light sensor to consider 70 lux as insufficient lighting for the rose plant. Consequently, future measurements below this threshold may trigger an early recommendation for increasing light exposure, thereby improving the precision of plant health assessments.
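The recalibration behaviour of paragraphs [0029] and [0030] may be sketched as follows. This is a minimal sketch under stated assumptions: the class name, the evidence window of three consecutive readings, and the initial threshold are all illustrative choices, not values from this specification:

```python
# Illustrative sketch: if several consecutive captured images show wilting at a
# humidity reading the sensor currently treats as normal, raise that reading to
# a critical-low threshold so future readings at or below it trigger preventive
# action. The window size of 3 and default threshold are assumptions.

class HumiditySensorCalibration:
    def __init__(self, critical_low=30.0, evidence_window=3):
        self.critical_low = critical_low
        self.evidence_window = evidence_window
        self._wilting_readings = []

    def observe(self, humidity, image_shows_wilting):
        """Record a reading paired with the visual analysis of a captured image."""
        if image_shows_wilting and humidity > self.critical_low:
            self._wilting_readings.append(humidity)
            if len(self._wilting_readings) >= self.evidence_window:
                # Recalibrate: the lowest humidity at which wilting was
                # repeatedly observed becomes the new critical-low threshold.
                self.critical_low = min(self._wilting_readings)
                self._wilting_readings.clear()
        elif not image_shows_wilting:
            self._wilting_readings.clear()  # healthy image resets the evidence

    def needs_preventive_action(self, humidity):
        return humidity <= self.critical_low
```

The same pattern would apply to the light-sensor example of paragraph [0030], with lux readings in place of humidity percentages.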
[0031] In an embodiment of the present invention, the microcontroller 108 of the system 100 may be enabled to perform dual-stage analysis using the environmental sensors 102 and the imaging unit 104. The environmental sensor 102-based analysis may be conducted to provide real-time quantitative measurement of plant parameters, while the visual analysis allows detection of plant stress indicators that may not be immediately reflected in the sensor data. For instance, the environmental sensors 102 may detect a drop in humidity levels, but early-stage fungal infection or pest infestation may only be observable through visual cues such as discoloration or leaf deformities. Similarly, a nutrient deficiency may not always cause an immediate deviation in measurable parameters but may manifest as yellowing of leaves or stunted growth, which can be detected through image analysis by the imaging unit 104. By integrating both analysis stages, the system 100 ensures a comprehensive health assessment by reducing a risk of false positives or delayed responses.
[0032] Furthermore, the dual-stage analysis may also be beneficial in detecting overwatering conditions. For instance, the environmental sensors 102 may indicate adequate soil moisture levels, yet the plant may exhibit signs of root rot, such as wilting or darkened roots, which are not immediately reflected in the sensor data. The imaging unit 104 may capture these visual indicators, allowing the microcontroller 108 to differentiate between healthy hydration levels and excessive watering. This integration ensures that corrective actions, such as adjusting irrigation frequency, are based on both quantitative data and visual symptoms by enhancing precision in plant care recommendations.
[0033] The microcontroller 108 may be configured to transmit the generated recommendation to the computing device 114. The microcontroller 108 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the microcontroller 108, including known, related art, and/or later developed technologies. In an embodiment of the present invention, the microcontroller 108 may execute specialized algorithms to perform the dual-stage analysis using the environmental sensors 102 and the imaging unit 104. The environmental sensor 102-based analysis may be conducted using a Threshold-Based Deviation Detection Algorithm, wherein real-time plant parameters are continuously compared against predefined standard plant parameters stored in the memory 106. Any deviation beyond an adaptive threshold may trigger an alert and initiate further analysis.
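The Threshold-Based Deviation Detection Algorithm of paragraph [0033] may be sketched as follows. The parameter names and the 10% default tolerance band are assumptions for illustration; the specification leaves the adaptive threshold unspecified:

```python
# Illustrative sketch: compare real-time plant parameters against pre-stored
# standard plant parameters and flag any reading outside a tolerance band.
# A non-empty deviation list triggers actuation of the imaging unit (104).

def detect_deviations(readings, standards, tolerance=0.10):
    """Return names of parameters deviating beyond the tolerance band."""
    deviating = []
    for name, standard in standards.items():
        if name not in readings:
            continue
        band = abs(standard) * tolerance
        if abs(readings[name] - standard) > band:
            deviating.append(name)
    return deviating

def should_capture_image(readings, standards, tolerance=0.10):
    """The microcontroller actuates the imaging unit upon any deviation."""
    return len(detect_deviations(readings, standards, tolerance)) > 0
```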
[0034] The microcontroller 108 may further be configured to employ an Environmental Sensor Fusion Algorithm that integrates data from a plurality of the environmental sensors 102 to enhance detection accuracy. For example, a decision tree model may correlate humidity levels, soil moisture, and temperature to predict potential stress conditions before they manifest visually. Upon detecting abnormal patterns, the microcontroller 108 may activate the imaging unit 104 to verify an anomaly through visual inspection.
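The sensor-fusion decision tree of paragraph [0034] may be sketched as follows. The split values and stress labels are illustrative assumptions, standing in for values a trained decision tree model would learn:

```python
# Illustrative sketch: a small hand-written decision tree correlating humidity,
# soil moisture, and temperature to predict a stress condition before it is
# visible. All thresholds and labels are assumptions, not trained values.

def predict_stress(humidity, soil_moisture, temperature_c):
    if soil_moisture < 20.0:
        return "drought-stress" if humidity < 40.0 else "under-watering"
    if temperature_c > 35.0:
        return "heat-stress"
    if humidity > 85.0 and temperature_c > 25.0:
        return "fungal-risk"  # warm, very humid air favours fungal growth
    return "no-stress"

def verify_visually(prediction):
    """On an abnormal prediction, activate the imaging unit (104) to verify."""
    return prediction != "no-stress"
```

A production system would learn such splits from labelled data rather than hard-coding them, but the fused multi-sensor decision is the point being illustrated.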
[0035] The microcontroller 108 may employ a Convolutional Neural Network (CNN)-Based Image Classification Algorithm for the visual analysis of the captured images to identify plant stress indicators such as discoloration, deformities, or fungal growth. The CNN Algorithm may be pre-trained with datasets of healthy and unhealthy plant images, allowing it to classify the plant condition based on captured images. The microcontroller 108 may further be configured to implement Feature Extraction and Anomaly Detection Algorithms to identify subtle stress patterns, such as texture variations or minor color shifts, which may indicate early-stage deficiencies or diseases.
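The CNN stage of paragraph [0035] requires a trained network and a deep-learning framework; as a dependency-free stand-in, the feature-extraction idea can be illustrated with a crude colour-shift feature. The yellowish-pixel rule and the 25% anomaly threshold are assumptions for illustration only, not the claimed CNN:

```python
# Illustrative stand-in for the feature-extraction stage: compute the fraction
# of yellowish pixels in an RGB image (a crude proxy for leaf discoloration)
# and flag an anomaly above a threshold.

def yellow_fraction(image):
    """image: rows of (r, g, b) tuples with channel values in 0..255."""
    pixels = [px for row in image for px in row]
    def is_yellowish(r, g, b):
        return r > 150 and g > 150 and b < 100
    yellow = sum(1 for r, g, b in pixels if is_yellowish(r, g, b))
    return yellow / len(pixels) if pixels else 0.0

def discoloration_anomaly(image, threshold=0.25):
    return yellow_fraction(image) > threshold
```

A real deployment would feed the captured image to the pre-trained CNN; this sketch only shows how a scalar feature extracted from pixels can gate an anomaly decision.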
[0036] In an embodiment of the present invention, the external accessories 112 may be arranged in the vicinity of the plant. The external accessories 112 may be activated upon receipt of a digital signal from the microcontroller 108. The external accessories 112 may be adapted for fostering of the plants. The external accessories 112 may be, but not limited to, a grow light, an irrigation system, and so forth. Embodiments of the present invention are intended to include or otherwise cover any external accessories 112, including known, related art, and/or later developed technologies.
[0037] Further, the microcontroller 108 may be configured to dynamically regulate the operation of the external accessories 112 based on real-time plant parameters and historical trends. For example, the microcontroller 108 may modulate the intensity and duration of the grow light based on ambient light conditions to optimize photosynthesis efficiency while conserving energy. Additionally, the microcontroller 108 may control the irrigation system by adjusting water flow rates based on detected soil moisture levels, thereby preventing overwatering or underwatering. Moreover, the microcontroller 108 may prioritize and schedule the activation of multiple external accessories 112 based on a hierarchy of plant needs, ensuring essential conditions such as optimal humidity and temperature are maintained before secondary conditions such as nutrient delivery.
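The prioritised scheduling of paragraph [0037] may be sketched as follows. The accessory names and priority values are assumptions: the specification names only the grow light and irrigation system, and states that essential conditions (humidity, temperature) precede secondary ones (nutrient delivery):

```python
# Illustrative sketch: activate external accessories (112) in order of a
# hierarchy of plant needs. Lower number = more essential; the table is an
# assumption for illustration.

ACCESSORY_PRIORITY = {
    "misting_system": 0,     # humidity
    "heater": 0,             # temperature
    "irrigation_system": 1,  # soil moisture
    "grow_light": 2,         # light
    "nutrient_doser": 3,     # nutrient delivery
}

def schedule_accessories(needed):
    """Return the requested accessories ordered by priority, then by name."""
    return sorted(needed, key=lambda name: (ACCESSORY_PRIORITY.get(name, 99), name))
```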
[0038] In an embodiment of the present invention, the computing device 114 may be an electronic device used by a user. The computing device 114 may be adapted to receive the recommendation generated by the microcontroller 108. The computing device 114 may further be adapted to display real-time messages, graphical health trends, actionable insights, and so forth. The computing device 114 may be, but not limited to, a laptop, a mobile, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the computing device 114, including known, related art, and/or later developed technologies.
[0039] For example, upon detecting low soil moisture levels and signs of wilting in the plant through the imaging unit 104, the microcontroller 108 may generate a recommendation for immediate watering. This recommendation may be transmitted to the computing device 114, where the user receives a real-time notification such as "Water your plant now to prevent dehydration." Additionally, if prolonged low light conditions are detected, the computing device 114 may display a graphical trend indicating insufficient sunlight exposure over time, along with a suggestion to relocate the plant or activate a grow light. For instance, the microcontroller 108 may refine its recommendations over time by learning from user actions and corresponding plant responses. If a user consistently ignores or modifies watering recommendations, the microcontroller 108 may adjust its future alerts based on user behavior and environmental conditions.
[0040] Further, integration of an AI-driven conversational assistant within the computing device 114 may enhance user interaction. For example, instead of passive notifications, the computing device 114 may enable interactive queries such as "Would you like to activate the irrigation system now?" or "Would you like a weekly health report of your plant?" This not only enhances user engagement but also contributes to automation by enabling hands-free plant care management. Additionally, real-time anomaly detection using comparative historical data may further improve a reliability of the system 100. For instance, if the environmental sensors 102 detect a sudden drop in humidity beyond normal seasonal variations, the microcontroller 108 may generate an alert regarding potential external factors, such as HVAC impact or sudden weather shifts to recommend corrective actions accordingly.
[0041] FIG. 2 depicts a flowchart of a method 200 for fostering plants using the system 100, according to an embodiment of the present invention.
[0042] At step 202, the system 100 may receive the plant parameters from the environmental sensors 102.
[0043] At step 204, the system 100 may analyze the plant parameters corresponding to standard plant parameters pre-stored in the memory 106.
[0044] At step 206, if the analyzed plant parameters deviate from any of the standard plant parameters, then the method 200 may proceed to step 208. Else, the method 200 may revert to step 202.
[0045] At step 208, the system 100 may actuate the imaging unit 104 to capture the image of the plant.
[0046] At step 210, the system 100 may integrate the plant parameters and the captured image to assess the plant health.
[0047] At step 212, the system 100 may command the machine learning model 110 to analyze the integrated data to classify the plant health in the classes.
[0048] At step 214, the system 100 may generate recommendations for the plant care based on the classifications.
[0049] At step 216, the system 100 may transmit the generated recommendation to the computing device 114.
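The steps of the method 200 above can be sketched as a monitoring loop. The callables passed in stand for the hardware and model interfaces (environmental sensors 102, imaging unit 104, machine learning model 110, computing device 114); their names and signatures are assumptions for illustration:

```python
# Illustrative sketch of method 200 (FIG. 2) as a loop over steps 202-216.

def run_method_200(read_sensors, standards, deviates, capture_image,
                   classify, recommend, transmit, cycles=1):
    for _ in range(cycles):
        params = read_sensors()                          # step 202
        if not deviates(params, standards):              # steps 204/206
            continue                                     # revert to step 202
        image = capture_image()                          # step 208
        integrated = {"params": params, "image": image}  # step 210
        health_class = classify(integrated)              # step 212
        recommendations = recommend(health_class)        # step 214
        transmit(recommendations)                        # step 216
```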
[0050] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0051] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

CLAIMS
I/We Claim:
1. An integrated smart plant care system (100), the system (100) comprising:
environmental sensors (102) configured to measure plant parameters in real-time, wherein the plant parameters are selected from an oxygen level, a humidity, an atmospheric pressure, a light intensity, or a combination thereof;
an imaging unit (104) configured to capture an image of a plant for visual analysis, wherein the visual analysis enables identification of a sign of stress, a discoloration, a disease infestation, a pest infestation, a nutrient deficiency, or a combination thereof;
a microcontroller (108) communicatively connected to the environmental sensors (102) and to the imaging unit (104), characterized in that the microcontroller (108) is configured to:
receive the plant parameters from the environmental sensors (102);
analyze the plant parameters corresponding to standard plant parameters pre-stored in a memory (106);
actuate the imaging unit (104) to capture the image of the plant upon detecting a deviation in the analyzed plant parameters from any of the standard plant parameters;
integrate the plant parameters and the captured image to assess a plant health;
command a machine learning model (110) to analyze the integrated data to classify the plant health in classes; and
generate recommendations for plant care based on the classifications.
2. The system (100) as claimed in claim 1, wherein the microcontroller (108) is configured to dynamically adjust calibration of the environmental sensors (102) based on environmental trends to enhance detection accuracy based on the visual analysis of the captured images by the imaging unit (104).
3. The system (100) as claimed in claim 1, wherein the microcontroller (108) is configured to transmit the generated recommendation to a computing device (114).
4. The system (100) as claimed in claim 1, wherein the microcontroller (108) is configured to integrate and activate external accessories (112) selected from a grow light, an irrigation system, or a combination thereof.
5. The system (100) as claimed in claim 1, wherein the environmental sensors (102) comprise an oxygen sensor, a temperature sensor, a moisture sensor, a pressure sensor, or a combination thereof.
6. The system (100) as claimed in claim 3, wherein the computing device (114) is adapted to display real-time messages, graphical health trends, actionable insights, or a combination thereof.
7. The system (100) as claimed in claim 1, wherein the imaging unit (104) is a camera.
8. A method (200) for fostering plants using an integrated smart plant care system (100), the method (200) characterized by steps of:
receiving plant parameters from environmental sensors (102);
analyzing the plant parameters corresponding to standard plant parameters pre-stored in a memory (106);
actuating an imaging unit (104) to capture an image of a plant upon detecting a deviation in the analyzed plant parameters from any of the standard plant parameters;
integrating the plant parameters and the captured image to assess a plant health;
commanding a machine learning model (110) to analyze the integrated data to classify the plant health in classes; and
generating recommendations for plant care based on the classification.
9. The method (200) as claimed in claim 8, comprising a step of transmitting the generated recommendation to a computing device (114).
10. The method (200) as claimed in claim 8, wherein the captured images enable identification of a sign of stress, a discoloration, a disease infestation, a pest infestation, a nutrient deficiency, or a combination thereof.
Date: March 5, 2025
Place: Noida

Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant

Documents

Application Documents

# Name Date
1 202541019978-STATEMENT OF UNDERTAKING (FORM 3) [06-03-2025(online)].pdf 2025-03-06
2 202541019978-REQUEST FOR EARLY PUBLICATION(FORM-9) [06-03-2025(online)].pdf 2025-03-06
3 202541019978-POWER OF AUTHORITY [06-03-2025(online)].pdf 2025-03-06
4 202541019978-OTHERS [06-03-2025(online)].pdf 2025-03-06
5 202541019978-FORM-9 [06-03-2025(online)].pdf 2025-03-06
6 202541019978-FORM FOR SMALL ENTITY(FORM-28) [06-03-2025(online)].pdf 2025-03-06
7 202541019978-FORM 1 [06-03-2025(online)].pdf 2025-03-06
8 202541019978-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [06-03-2025(online)].pdf 2025-03-06
9 202541019978-EDUCATIONAL INSTITUTION(S) [06-03-2025(online)].pdf 2025-03-06
10 202541019978-DRAWINGS [06-03-2025(online)].pdf 2025-03-06
11 202541019978-DECLARATION OF INVENTORSHIP (FORM 5) [06-03-2025(online)].pdf 2025-03-06
12 202541019978-COMPLETE SPECIFICATION [06-03-2025(online)].pdf 2025-03-06
13 202541019978-Proof of Right [13-05-2025(online)].pdf 2025-05-13