
A System And A Device For Identifying Plant Diseases

Abstract: The present disclosure envisages a system (100) for identifying plant diseases. The disclosed system (100) comprises a smart eyewear (105) and a local host (110). The smart eyewear (105) comprises an image capturing unit (202), which captures a plurality of images of the surroundings and provides a live feed; a communication unit (204), which streams the live feed to the local host (110); and a display unit (206), which receives and displays a processed stream received from the local host (110). The local host (110) comprises a data capturing unit (208) and a processing unit (210). The data capturing unit (208) captures the stream and pre-processes it using at least one image processing technique to provide a pre-processed stream of images. The processing unit (210) processes the pre-processed stream using a machine learning algorithm in conjunction with an optimized library of plant diseases stored in a repository (212) to provide the processed stream.


Patent Information

Application #
202041041525
Filing Date
24 September 2020
Publication Number
49/2020
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
dewan@rkdewanmail.com
Parent Application
Patent Number
Legal Status
Grant Date
2025-04-07
Renewal Date

Applicants

1. SRM Institute of Science and Technology
Kattankulathur, Chennai-603203, Tamil Nadu, India

Inventors

1. VIJAYAKUMAR, Ponnusamy
Plot no.8, S1, 3rd street, Jaivanthapuram Madambakkam Chennai-600126, Tamilnadu, India
2. COUMARAN, Amrith
No.4/2, Kalaivani 2nd Street, Srinivasa Nagar, New Perungalathur, Chennai-600063, Tamil Nadu, India
3. SUBRAMANIAN SHUNMUGAM, Akhash
AS2, Ascent Akshayam, Ramanujam Street, New Perungalathur, Chennai 600063, Tamil Nadu, India
4. RAJARAM, Kritin
F23, K-block, Ruby Paradise, 33 Easwari Nagar Main Road, Selaiyur, East Tambaram, Chennai 600059, Tamil Nadu, India
5. SENTHILVELAVAN, Sanoj
21A2, Amal Nagar, Kishkinta Road, West Tambaram Chennai 600045, Tamil Nadu, India

Specification

Claims:
WE CLAIM:
1. A system (100) for identifying plant diseases, said system (100) comprising:
a) a smart eyewear (105) comprising:
a. an image capturing unit (202) configured to capture a plurality of images of the surroundings and provide a live feed of said plurality of images;
b. a communication unit (204) configured to stream said live feed to a local host (110); and
c. a display unit (206) in communication with said communication unit (204) to receive and display a processed stream received from the local host (110);
wherein the local host (110) comprises:
a. a repository (212);
b. a data capturing unit (208) configured to capture the stream of the live feed received from the communication unit (204), store the same in said repository (212), and pre-process the same using at least one image processing technique to provide a pre-processed stream; and
c. a processing unit (210) configured to process the pre-processed stream in real-time using a machine learning algorithm in conjunction with an optimized library of plant diseases pre-stored in the repository (212) to provide said processed stream.
2. The system (100) as claimed in claim 1, wherein said processing unit (210) is configured to employ at least one neural network algorithm to update the optimized library based on the processing of said live feed.
3. The system (100) as claimed in claim 2, wherein said processing unit (210) is configured to employ at least one prediction model to provide operational suggestions to a user.
4. An eyewear (105) for identifying plant diseases in real time, said eyewear (105) comprising:
a) an image capturing unit (202) positioned on or embedded in a body of the eyewear (105), the image capturing unit (202) configured to capture a live feed of the surroundings so as to capture images of plants;
b) a communication unit (204) configured to stream said live feed to a local host (110) communicatively coupled to the eyewear (105), the local host (110) being adapted to host an optimized library of plants and plant diseases, and to receive from the local host (110) a processed output stream in real-time; and
c) a display unit (206) in communication with the communication unit (204) to display the processed output stream.
Description:
TECHNICAL FIELD
The present disclosure generally relates to disease detection in plants, and more particularly to a system and a device for identifying diseases in plants.
BACKGROUND
In an agricultural economy like India, it is highly desirable that any plant infection/disease is detected early to save the crops. If an infectious plant disease is not treated in time, farmers suffer heavy losses, which directly has an adverse impact on the country’s economy.
Patent document WO2019244156A1 discloses a system for in situ imaging of plant tissue and methods of using the same for identifying plant pests and diseases. The system includes a camera having a macro lens for near-field imaging and a spacer configured for setting a focal distance between the macro lens and a portion of a plant.
The above system is advantageous in identifying plant diseases; however, its implementation, using robots, is not cost-effective for a developing country like India.
There is, therefore, felt a need for systems/devices for identifying plant diseases using simpler technological interventions and ways that would alleviate the drawbacks in conventional systems/devices.
OBJECTS
Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are as follows:
An object of the present disclosure is to provide a system and/or a device for identifying plant diseases that is cost effective and easy to implement and use.
Another object of the present disclosure is to provide a system and/or device that eliminates the requirement of assessing the plants for infestation manually or with the help of aerial or robot-assisted surveying.
Yet another object of the present disclosure is to provide a plant infestation detection system and/or device that is user friendly.
Other objects and advantages of the present disclosure will be more apparent from the following description, which is not intended to limit the scope of the present disclosure.
SUMMARY
The present disclosure envisages a system and a device for identifying plant diseases in a user-friendly manner. The system comprises a smart eyewear and a local host. The smart eyewear comprises an image capturing unit, a communication unit, and a display unit. The image capturing unit is configured to capture a plurality of images of the surroundings and provide a live feed of the plurality of images. The communication unit is configured to stream the live feed to the local host. The display unit is in communication with the communication unit to receive and display a processed stream received from the local host. The local host comprises a repository, a data capturing unit, and a processing unit. The data capturing unit is configured to capture the stream of the live feed received from the communication unit, store the same in the repository, and pre-process the same using at least one image processing technique to provide a pre-processed stream. Further, the processing unit is configured to process the pre-processed stream in real-time using a machine learning algorithm in conjunction with an optimized library of plant diseases pre-stored in the repository to provide the processed stream.
In an embodiment, the processing unit is configured to employ at least one neural network algorithm to update the optimized library based on the processing of the live feed.
In another embodiment, the processing unit is configured to employ at least one prediction model to provide operational suggestions to a user.
Further, the present disclosure envisages an eyewear for identifying plant diseases in real time. The eyewear comprises an image capturing unit, a communication unit, and a display unit. The image capturing unit is positioned on or embedded in a body of the eyewear and is configured to capture a live feed of the surroundings so as to capture images of plants. The communication unit is configured to stream the live feed to a local host communicatively coupled to the eyewear, the local host being adapted to host an optimized library of plants and plant diseases, and to receive from the local host a processed output stream in real-time. The display unit is in communication with the communication unit to display the processed output stream.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWING
A system and a device for identifying plant diseases in accordance with the present disclosure will now be described with the help of the accompanying drawing, in which:
FIGURE 1 illustrates a network diagram of a system for identifying plant diseases using a smart eyewear and a local host, in accordance with an embodiment;
FIGURE 2 illustrates a block diagram of the system of Fig. 1; and
FIGURE 3 illustrates a method flow for identifying plant diseases using the system of Fig. 1, in accordance with an embodiment.
LIST OF REFERENCE NUMERALS
100 – System
105 – Smart Eyewear
110 – Local Host
115 – Network
202 – Image Capturing Unit
204 – Communication Unit
206 – Display Unit
208 – Data Capturing Unit
210 – Processing Unit
212 – Repository
DETAILED DESCRIPTION
Embodiments of the present disclosure will now be described with reference to the accompanying drawing.
Embodiments are provided so as to thoroughly and fully convey the scope of the present disclosure to the person skilled in the art. Numerous details are set forth relating to specific components and methods to provide a complete understanding of embodiments of the present disclosure. It will be apparent to the person skilled in the art that the details provided in the embodiments should not be construed to limit the scope of the present disclosure. In some embodiments, well-known processes, well-known apparatus structures, and well-known techniques are not described in detail.
The terminology used in the present disclosure is only for the purpose of explaining a particular embodiment, and such terminology shall not be considered to limit the scope of the present disclosure. As used in the present disclosure, the forms "a," "an," and "the" may be intended to include the plural forms as well, unless the context clearly suggests otherwise. The terms "comprises," "comprising," "including," and "having" are open-ended transitional phrases and therefore specify the presence of stated features, integers, steps, operations, elements, modules, units, and/or components, but do not forbid the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The particular order of steps disclosed in the method and process of the present disclosure is not to be construed as necessarily requiring their performance as described or illustrated. It is also to be understood that additional or alternative steps may be employed.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
In the field of agriculture and plantation, a major problem faced by farmers is handling pests and plant diseases. Although remedies are available for treating various types of plant diseases, identifying a specific disease in a plant requires skill and knowledge. With limited knowledge of the kind of infestation affecting their crops, farmers usually suffer heavy losses each year. To overcome these issues, an automated system (hereinafter referred to as “system 100”) for effectively and efficiently identifying plant diseases is described with reference to Figure 1 and Figure 2.
The present disclosure relates to a system (100) for on-field detection and identification of plant diseases, pests, and other plant pathogens in real-time. The present system (100) includes a smart eyewear (105) that can be worn by a user surveying the crops and the vegetation on-field, and a local host device (110) coupled to the smart eyewear (105). The smart eyewear (105), hereinafter referred to interchangeably as the “eyewear” or simply the “device”, and the local host (110) are capable of communicating over a network (115).
In an embodiment, the eyewear (105) is configured to capture/record a live feed of the surroundings and stream the live feed to the local host (110) over the network (115) using capabilities built into the smart eyewear (105). The local host (110), in turn, is configured to analyze the live feed received from the smart eyewear (105) and, based on the analysis, help the user identify whether the plant or the crop is infested by any disease and, if so, precisely which disease.
In an embodiment, the local host (110) employs at least one machine learning algorithm on the received live feed and generates an analyzed live feed. The local host (110) then transmits the analyzed live feed to the smart eyewear (105) through which the user can identify the kind and the extent of the disease in a given plant and its impact.
In an operative embodiment, the eyewear (105) includes an image capturing unit (202), a communication unit (204), and a display unit (206).
In an embodiment, the image capturing unit (202) is positioned on a body of the smart eyewear (105). In another embodiment, the image capturing unit (202) is embedded in the body of the smart eyewear (105). The image capturing unit (202) is configured to capture a plurality of images of the surroundings and provide a live feed. The image capturing unit (202) may include a digital camera, an infrared or near-infrared camera, an image sensor, a CMOS or CCD image sensor, a thermal camera, or a CSI camera. In an embodiment, the image capturing unit (202) is capable of automatically focusing on a feature of an object. In an exemplary embodiment, the image capturing unit (202) includes an attached light source to enhance the visibility and clarity of captured images. The image capturing unit (202) is communicatively coupled to the communication unit (204).
The communication unit (204) is configured to receive the live feed from the image capturing unit (202) and is further configured to stream the live feed to the local host (110). In an embodiment, the communication unit (204) is configured to communicate with the local host (110) using, for example, a Bluetooth connection, a Zigbee connection, or a Wi-Fi connection.
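By way of a non-limiting illustration, the eyewear-side capture-and-stream loop of the image capturing unit (202) and communication unit (204) may be sketched as below. The host address, port, JPEG quality, and length-prefixed framing are illustrative assumptions, not particulars of the present disclosure.

```python
# A minimal sketch of the eyewear-side loop: capture frames (202) and
# stream them to the local host (204). Endpoint and framing are assumed.
import socket
import struct

import cv2

HOST, PORT = "192.168.1.10", 5000   # hypothetical local-host endpoint

def stream_live_feed() -> None:
    cap = cv2.VideoCapture(0)       # image capturing unit (202)
    sock = socket.create_connection((HOST, PORT))
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Encode each frame as JPEG so the stream stays light enough
            # for the Bluetooth/Zigbee/Wi-Fi links mentioned above.
            ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
            if not ok:
                continue
            data = buf.tobytes()
            # Length-prefixed framing so the host can split the byte stream.
            sock.sendall(struct.pack(">I", len(data)) + data)
    finally:
        cap.release()
        sock.close()

if __name__ == "__main__":
    stream_live_feed()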
The communication unit (204) is also configured to receive the processed stream sent back by the local host (110) and pass it to the display unit (206), which displays the same to the user in real-time.
In an embodiment, the network (115) may include the Internet, one or more telecommunications networks (e.g., Public Switched Telephone Networks (PSTNs)), a wired or wireless network, a Wireless Video Area Network (WVAN), a Local Area Network (LAN), a Wireless LAN (WLAN), a Personal Area Network (PAN), a Wireless PAN (WPAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), an intranet, or a cable network (e.g., an optical cable network).
In an embodiment, the eyewear (105) is structured as a head-worn device. The device (105) includes at least one power source for supplying power to its various electronic components. In an embodiment, the power source is a rechargeable battery.
The local host (110), on the other hand, includes a data capturing unit (208), a processing unit (210), and a repository (212). In an operative embodiment, the data capturing unit (208) captures the stream received from the communication unit (204) and pre-processes the captured stream to provide a pre-processed stream to the processing unit (210). In an embodiment, the data capturing unit (208) can be interfaced with the smart eyewear (105) by a wired or a wireless connection. In an embodiment, the pre-processing may include one or more image processing techniques known in the art. In an embodiment, the data capturing unit (208) segments the live feed into frames and corrects the images for clarity.
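By way of a non-limiting illustration, the pre-processing performed by the data capturing unit (208) may resemble the sketch below. The disclosure specifies only "at least one image processing technique"; the resizing, denoising, and CLAHE contrast-correction steps here are assumed for the example.

```python
# A sketch of per-frame pre-processing on the local host (208), using
# standard OpenCV techniques; the specific steps are assumptions.
import cv2
import numpy as np

def preprocess_frame(frame: np.ndarray) -> np.ndarray:
    frame = cv2.resize(frame, (640, 480))                 # uniform frame size
    frame = cv2.fastNlMeansDenoisingColored(frame, None, 10, 10, 7, 21)
    # Correct the image "for clarity" via CLAHE on the luminance channel.
    lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab = cv2.merge((clahe.apply(l), a, b))
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)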
The data capturing unit (208) feeds the pre-processed stream to the processing unit (210), wherein the processing unit (210) is configured to process the stream to provide a processed stream, for example, by processing each of the frames as received from the data capturing unit (208).
In an embodiment, the processing unit (210) processes the captured stream using a machine learning algorithm in conjunction with an optimized library of plant diseases to provide the processed stream. In an operative embodiment, the processing unit (210) is configured to capture at least one image of the surroundings, wherein the at least one image includes a plurality of test plants whose health condition is to be assessed. In an embodiment, the processing unit (210) selects the image that has a sufficient focus level and compares it with the data in the optimized library to provide a highly reliable and visually enhanced processed stream with matched results.
In an embodiment, the machine learning algorithm is selected from a group consisting of reinforcement learning, Linear Regression, Nearest Neighbor, Gaussian Naive Bayes, Decision Trees, Support Vector Machine (SVM), and Random Forest.
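By way of a non-limiting illustration, one of the listed options, an SVM, may be trained on feature vectors derived from the optimized library, as sketched below. The color-histogram feature and the scikit-learn classifier are assumptions made for the example only.

```python
# A sketch of an SVM disease classifier over the optimized library; the
# HSV color-histogram feature is an assumed, illustrative choice.
import cv2
import numpy as np
from sklearn.svm import SVC

def color_histogram(image: np.ndarray, bins: int = 8) -> np.ndarray:
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1, 2], None, [bins] * 3,
                        [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def train_disease_classifier(images, labels) -> SVC:
    # images/labels: library images and disease labels from the repository (212)
    X = np.array([color_histogram(img) for img in images])
    clf = SVC(kernel="rbf", probability=True)
    clf.fit(X, labels)
    return clf

def classify(clf: SVC, frame: np.ndarray) -> str:
    return clf.predict([color_histogram(frame)])[0]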
In an embodiment, the optimized library is stored in the repository (212) as a dataset. The dataset may include, for example, a list of plant species, profile information of each of the plant species, attributes of each plant in a given species, and diseases corresponding to each of the plant species. The profile information includes at least one image of a healthy leaf of the associated plant.
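For illustration only, one plausible shape for such a dataset in the repository (212) is sketched below as SQLite tables; the table and column names are assumptions, not a schema prescribed by the disclosure.

```python
# A hypothetical SQLite layout for the optimized library in the repository (212).
import sqlite3

def create_library(path: str = "plant_library.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS species (
            id INTEGER PRIMARY KEY,
            name TEXT NOT NULL,                -- plant species
            healthy_leaf_image BLOB            -- profile: healthy-leaf image
        );
        CREATE TABLE IF NOT EXISTS diseases (
            id INTEGER PRIMARY KEY,
            species_id INTEGER REFERENCES species(id),
            name TEXT NOT NULL,                -- disease for that species
            symptom_attributes TEXT            -- per-plant attributes
        );
    """)
    return conn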
In an embodiment, the processing unit (210) is configured to employ various prediction models and machine learning algorithms as well. In an embodiment, the processing unit (210) is configured to employ at least one neural network algorithm to update the optimized library based on the processing of the live feed. In another embodiment, the processing unit (210) is configured to employ at least one prediction model to provide operational suggestions to a user.
In an embodiment, the neural network algorithm is selected from a group consisting of a Fast R-CNN algorithm, a Single Shot Detector (SSD) algorithm, a deep neural network algorithm, and a convolutional neural network algorithm.
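By way of a non-limiting illustration, a compact convolutional neural network of the kind listed above might look as follows in PyTorch; the layer sizes, input resolution, and number of disease classes are assumptions for the example.

```python
# A small, illustrative CNN for leaf-disease classification (not the
# disclosure's literal network); expects 224x224 RGB inputs.
import torch
import torch.nn as nn

class LeafDiseaseCNN(nn.Module):
    def __init__(self, num_classes: int = 10):   # class count is assumed
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (N, 3, 224, 224)
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))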
In an embodiment, the at least one prediction model is selected from a group consisting of Ordinary Least Squares (OLS), Generalized Linear Models (GLM), Logistic Regression, and Multivariate Adaptive Regression Splines (MARS).
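For illustration, a logistic regression of the kind listed above could map detection outputs to an operational suggestion for the user; the features, labels, and toy training points below are hypothetical.

```python
# A hypothetical logistic-regression mapping from detection outputs to
# an operational suggestion; the training points are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features: [infection_probability, affected_leaf_fraction]
X = np.array([[0.1, 0.05], [0.8, 0.4], [0.9, 0.7], [0.2, 0.1]])
y = np.array(["monitor", "treat", "treat", "monitor"])

model = LogisticRegression().fit(X, y)
print(model.predict([[0.85, 0.5]]))   # expected to suggest 'treat'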
Any repository discussed herein may include a relational, hierarchical, graphical, or object-oriented structure and/or any other database configuration. Common database products that may be used to implement the databases include DB2 by IBM (White Plains, N.Y.), various database products available from Oracle Corporation (Redwood Shores, Calif.), Microsoft Access or Microsoft SQL Server by Microsoft Corporation (Redmond, Wash.), MySQL, or any other suitable database product. Moreover, the databases may be organized in any suitable manner, for example, as data tables or lookup tables. Each record may be a single file, a series of files, a linked series of data fields, or any other data structure. Association of certain data may be accomplished through any desired data association technique such as those known or practiced in the art.
In an embodiment, the test plant attributes are selected from a group consisting of, but not limited to: the length of the test plant, measured along the major axis, which is the distance between a base and an apex of the test plant; the width of the test plant, measured along the longest line perpendicular to the major axis connecting two points on the test plant; the Leaf Area Index (LAI) or the Normalized Difference Vegetation Index (NDVI); and the level of coloring/pigmentation. In an embodiment, the system (100) is capable of detecting plant disease by mapping different attributes of different parts of a test plant and analyzing them in totality, thereby enhancing accuracy.
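By way of a non-limiting illustration, two of the listed attributes may be measured as sketched below, assuming an OpenCV contour of the segmented test plant and a near-infrared band (e.g., from the NIR-capable camera options above) for NDVI; both inputs are assumptions for the example.

```python
# Illustrative measurement of plant dimensions and NDVI; the contour and
# NIR/red bands are assumed to come from upstream segmentation/capture.
import cv2
import numpy as np

def plant_dimensions(contour: np.ndarray) -> tuple:
    # Fit a rotated rectangle: its longer side approximates the major axis
    # (base-to-apex length), the shorter side the perpendicular width.
    (_, _), (w, h), _ = cv2.minAreaRect(contour)
    return max(w, h), min(w, h)

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-6)   # avoid division by zero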
In an embodiment, the processing unit (210) is configured to highlight the infected plants in red, whereas green represents a healthy plant. In this way, by displaying such a processed stream on the display unit (206), the present system (100) provides the user an easy way of distinguishing healthy plants from infected plants in real time.
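For illustration, the red/green highlighting may be rendered as sketched below, assuming a binary mask of plant pixels marked infected (1) or healthy (0) is already available from the classifier; the blending weights are arbitrary.

```python
# Illustrative red/green overlay; infected_mask is an assumed binary mask
# over plant pixels (1 = infected, 0 = healthy).
import cv2
import numpy as np

def highlight(frame: np.ndarray, infected_mask: np.ndarray) -> np.ndarray:
    overlay = frame.copy()
    overlay[infected_mask > 0] = (0, 0, 255)     # infected regions: red (BGR)
    overlay[infected_mask == 0] = (0, 255, 0)    # healthy regions: green
    # Blend so plant texture stays visible under the color coding.
    return cv2.addWeighted(frame, 0.6, overlay, 0.4, 0)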
In an embodiment, the processing unit (210) includes one or more processor(s). The processor may be a general–purpose processor, a microprocessor, a microcomputer, a microcontroller, a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a state machine, a logic circuitry, a Digital Signal Processor (DSP), and/or the like.
Figure 3 illustrates a method flow for the detection of plant diseases, according to an embodiment of the present disclosure.
At step 305, an image capturing unit (202) captures a plurality of images of the surroundings and provides a live feed to a communication unit (204).
At step 310, the communication unit (204), after receiving the live feed from the image capturing unit (202), transmits the received live stream to a local host (110).
At step 315, a data capturing unit (208) captures the live stream from the communication unit (204). Further, the data capturing unit (208) is configured to pre-process the captured live stream using at least one image processing technique and provide a pre-processed stream.
At step 320, a processing unit (210) is configured to process the pre-processed stream to provide the processed stream in real-time. The processing unit (210) employs a machine learning algorithm in conjunction with an optimized library of plant diseases stored in a repository (212) to generate the processed stream. In an embodiment, the processing unit (210) is configured to employ at least one neural network algorithm to update the optimized library based on the processing of the live feed.
At step 325, the local host (110) transmits the processed stream towards the smart eyewear (105).
At step 330, a display unit (206) receives the processed stream from the local host (110) via the communication unit (204). Further, the display unit (206) displays the processed stream.
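By way of a non-limiting illustration, the host-side portion of this flow (steps 315 to 325) may be glued together as sketched below; the transport callbacks and the reuse of the hypothetical helpers from the earlier sketches (preprocess_frame, classify) are assumptions, not the literal implementation of the disclosure.

```python
# Host-side sketch of steps 315-325: capture, pre-process, classify, return.
# recv_frame/send_frame are hypothetical transport callbacks (e.g., wrapping
# the length-prefixed socket protocol sketched earlier); preprocess_frame
# and classify are the helpers from the earlier sketches.
import cv2

def process_stream(recv_frame, send_frame, clf) -> None:
    while True:
        frame = recv_frame()                      # step 315: capture the stream
        if frame is None:                         # stream closed
            break
        clean = preprocess_frame(frame)           # step 315: pre-process
        label = classify(clf, clean)              # step 320: ML + library match
        annotated = cv2.putText(clean.copy(), label, (10, 30),
                                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 0, 255), 2)
        send_frame(annotated)                     # step 325: processed stream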
TECHNICAL ADVANCEMENTS
The present disclosure described herein above has several technical advantages including, but not limited to, the realization of a system and a device for identifying plant diseases, that:
• is cost effective and easy to implement and use.
• eliminates the requirement of manually assessing the plants for infestation or with the help of aerial or robot-assisted surveying.
• is portable and user friendly.
The embodiments herein and the various features and advantageous details thereof are explained with reference to the non-limiting embodiments in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
The use of the expression “at least” or “at least one” suggests the use of one or more elements or ingredients or quantities, as the use may be in the embodiment of the disclosure to achieve one or more of the desired objects or results.
The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
ECONOMIC SIGNIFICANCE
One of the objects of patent law is to provide protection to new technologies in all fields of technology. Such new technologies may contribute to the country's economic growth through the introduction of new, efficient, and quality methods of manufacture or products in India.
Protecting new technologies by patenting the product or process will contribute significantly to the development of innovation in the country. Further, once a patent is granted, the patentee can contribute by manufacturing the new product, or practicing the new process of manufacture, either himself, through technology collaboration, or through licensing.
The applicant submits that the present disclosure will contribute to the country's economy, which is one of the purposes for which the Patents Act, 1970 was enacted. The product in accordance with the present invention will be in great demand in the country and worldwide, since its novel technical features constitute a technical advancement in the field of agriculture. The technology in accordance with the present disclosure will make the product cheaper and save time in the overall process of manufacture. The saving in production time will improve productivity and reduce the cost of the product, which will directly contribute to the economy of the country.
The product will introduce a new concept to agricultural applications in which the patented process/product will be used. The present disclosure will transform the way diseases in plants are detected and identified, preventing economic, social, and ecological losses. The product is developed in the national interest and will contribute to the country's economy.
Details of the economic significance may be called for during examination. Only after the filing of this patent application can the applicant work publicly on the product/process/method of the present disclosure. The applicant will disclose all details of the economic-significance contribution after the protection of the invention.

Documents

Application Documents

# Name Date
1 202041041525-FORM 1 [24-09-2020(online)].pdf 2020-09-24
2 202041041525-COMPLETE SPECIFICATION [24-09-2020(online)].pdf 2020-09-24
3 202041041525-DRAWINGS [24-09-2020(online)].pdf 2020-09-24
4 202041041525-DECLARATION OF INVENTORSHIP (FORM 5) [24-09-2020(online)].pdf 2020-09-24
5 202041041525-STATEMENT OF UNDERTAKING (FORM 3) [24-09-2020(online)].pdf 2020-09-24
6 202041041525-PROOF OF RIGHT [24-09-2020(online)].pdf 2020-09-24
7 202041041525-POWER OF AUTHORITY [24-09-2020(online)].pdf 2020-09-24
8 202041041525-Proof of Right [25-09-2020(online)].pdf 2020-09-25
9 202041041525-FORM-9 [28-11-2020(online)].pdf 2020-11-28
10 202041041525-FORM 18 [25-11-2021(online)].pdf 2021-11-25
11 202041041525-EVIDENCE FOR REGISTRATION UNDER SSI [25-11-2021(online)].pdf 2021-11-25
12 202041041525-EDUCATIONAL INSTITUTION(S) [25-11-2021(online)].pdf 2021-11-25
13 202041041525-OTHERS [25-11-2021(online)].pdf 2021-11-25
14 202041041525-FER.pdf 2022-04-07
15 202041041525-FER_SER_REPLY [07-10-2022(online)].pdf 2022-10-07
16 202041041525-CLAIMS [07-10-2022(online)].pdf 2022-10-07
17 202041041525-COMPLETE SPECIFICATION [07-10-2022(online)].pdf 2022-10-07
18 202041041525-FORM-26 [07-10-2022(online)].pdf 2022-10-07
19 202041041525-OTHERS [07-10-2022(online)].pdf 2022-10-07
20 202041041525-US(14)-HearingNotice-(HearingDate-17-10-2024).pdf 2024-10-04
21 202041041525-Correspondence to notify the Controller [10-10-2024(online)].pdf 2024-10-10
22 202041041525-FORM-26 [10-10-2024(online)].pdf 2024-10-10
23 202041041525-AMMENDED DOCUMENTS [28-10-2024(online)].pdf 2024-10-28
24 202041041525-MARKED COPIES OF AMENDEMENTS [28-10-2024(online)].pdf 2024-10-28
25 202041041525-FORM 13 [28-10-2024(online)].pdf 2024-10-28
26 202041041525-Written submissions and relevant documents [28-10-2024(online)].pdf 2024-10-28
27 202041041525-IntimationOfGrant07-04-2025.pdf 2025-04-07
28 202041041525-PatentCertificate07-04-2025.pdf 2025-04-07

Search Strategy

1 SearchStrategyE_07-04-2022.pdf

ERegister / Renewals

3rd: 03 Jul 2025 (From 24/09/2022 To 24/09/2023)
4th: 03 Jul 2025 (From 24/09/2023 To 24/09/2024)
5th: 03 Jul 2025 (From 24/09/2024 To 24/09/2025)
6th: 03 Jul 2025 (From 24/09/2025 To 24/09/2026)