Abstract: An agricultural residues burning detection and reporting system 100 is disclosed. Images of an agricultural field may be acquired by sensors 102, such as a camera installed in the field or a camera mounted on an unmanned aerial vehicle (UAV). As the UAV moves over the field, the acquired images may be processed by applying machine learning techniques, and upon detection of burning stubble or burnt stubble, concerned authorities, such as the agricultural department of the area, may be notified by transmitting a notification through a communication unit 110 such as Wi-Fi. The acquired images may also be stored on a server 108, and concerned authorities may access the stored images at any time to get live information on the field where the UAV is operating.
TECHNICAL FIELD
[0001] The present invention relates to the field of residue control in agricultural fields. More particularly, the present invention relates to a system for detecting the burning of agricultural residues and automatically reporting it to a concerned authority.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Various agricultural or other operations may result in residue covering a portion of the area addressed by the operation. In an agricultural setting, for example, residue may include straw, wheat, paddy, corn stalks, or various other types of plant material, which may be either cut or un-cut, and either loose or attached to the ground to varying degrees. Agricultural residue may result, for example, after harvesting and cutting down a corn crop, which may leave residue of various sizes covering the ground to various degrees. In addition to wheat and paddy residue, sugarcane leaves are most commonly burnt. According to an official report, more than 500 million tonnes of agricultural residues are produced annually in India, and cereal crops (rice, wheat, maize and millets) account for 70 per cent of the total agricultural residue. Of this, 34 per cent comes from rice and 22 per cent from wheat crops, most of which is burnt on the agricultural field.
[0004] Burning of crop residue damages the micro-organisms present in the upper layer of the soil as well as its organic quality. Due to the loss of ‘friendly’ pests, the wrath of ‘enemy’ pests has increased and, as a result, crops are more prone to disease. The solubility capacity of the upper layers of soil has also been reduced. Instead of being burnt, agricultural residue may be used in different ways, such as cattle feed, compost manure, roofing in rural areas, biomass energy, mushroom cultivation, packing materials, fuel, paper, bio-ethanol, industrial production, etc.
[0005] Therefore, there is a need to overcome the above-mentioned problems by providing a solution that detects burning residue in an agricultural field and automatically notifies the concerned authorities so that the required action may be taken in time.
OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0007] An object of the present disclosure is to provide a system to detect burning of crop residue.
[0008] Another object of the present disclosure is to provide a system that automatically reports to a concerned authority upon detection of burning residue.
[0009] Another object of the present disclosure is to provide a system to accurately detect burning of crop residue or burnt residue in an agricultural field.
[0010] Another object of the present disclosure is to provide a system which is easy to operate.
SUMMARY
[0011] Various aspects of the present invention relate to the field of residue control in agricultural fields. More particularly, the present invention relates to a system for detecting the burning of agricultural crop residues and automatically reporting it to a concerned authority.
[0012] An aspect of the present disclosure pertains to a system for agricultural residue burning detection and reporting. The system may include one or more sensors configured to acquire one or more images of an agricultural field, a location identifier configured to detect location information of the agricultural field, and a control unit operatively coupled to the one or more sensors and the location identifier and configured to execute a set of instructions, stored in a memory, which, on execution, causes the system to receive the acquired one or more images from the one or more sensors and location information from the location identifier, process the received one or more images to detect agricultural residue in the agricultural field, and generate a notification signal and transmit it to one or more mobile computing devices by a communication unit.
[0013] In an aspect, the notification signal may pertain to location information of the agricultural field where the agricultural residue is detected.
[0014] In an aspect, the one or more sensors may include any or a combination of an unmanned aerial vehicle, a camera, a temperature sensor, a flame detector, and a ground scout.
[0015] In an aspect, the agricultural residue may be at least one of burning stubble and burnt stubble.
[0016] In an aspect, the one or more mobile computing devices may include any or a combination of desktop computer, tablet, personal digital assistant, laptop, and smart phone.
[0017] In an aspect, the acquired images may be transmitted to a server, and the server may be accessed by one or more entities to monitor the acquired images in real time.
[0018] In an aspect, the communication unit may include any or a combination of Wireless Fidelity (Wi-Fi), Worldwide Interoperability for Microwave Access (WiMAX), Bluetooth, Wireless LAN (WLAN), and Wireless USB (Wireless Universal Serial Bus).
[0019] Another aspect of the present disclosure pertains to an unmanned aerial vehicle that may include one or more sensors coupled to the unmanned aerial vehicle and configured to acquire one or more images of an agricultural field, a location identifier coupled to the unmanned aerial vehicle and configured to detect location information of the agricultural field, and a control unit operatively coupled to the one or more sensors and the location identifier and configured to execute a set of instructions, stored in a memory, which, on execution, causes the unmanned aerial vehicle to receive the acquired one or more images from the one or more sensors and location information from the location identifier, process the received one or more images to detect crop residue in the agricultural field, and generate a notification signal and transmit it to one or more mobile computing devices by a communication unit.
[0020] In an aspect, the control unit may be configured to receive instructions from the one or more mobile computing devices, wherein the instructions pertain to a route to be covered by the unmanned aerial vehicle.
[0021] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF DRAWINGS
[0022] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.
[0023] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0024] FIG. 1 illustrates an exemplary block diagram of the proposed system for agricultural residue detection and reporting, in accordance with an embodiment of the present disclosure.
[0025] FIG. 2 illustrates exemplary functional components of a system for agricultural residue detection and reporting, in accordance with an embodiment of the present disclosure.
[0026] FIG. 3 illustrates a flowchart disclosing the working of the proposed system, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0027] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present disclosure as defined by the appended claims.
[0028] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details. Embodiments explained herein relate to the field of residue control in agricultural fields. In particular, the present invention relates to a system for detecting the burning of agricultural residues and automatically reporting it to a concerned authority.
[0029] The term “agricultural residue” as used herein refers to burning stubble and burnt stubble.
[0030] Embodiments of a system are described to detect agricultural residue in an agricultural field. After harvesting, much agricultural residue is left once the crops are cut down; agricultural residue detection is conducted from an unmanned aerial vehicle (UAV) and ground scouts. The detected agricultural residue data or information may be transmitted to a cloud server or to a remote mobile computing device so that operators elsewhere can receive the information to make decisions.
[0031] The size and percentage of agricultural residue may vary from location to location even within a single field, depending on factors such as the local terrain and soil conditions of the field, local plant coverage, residue characteristics before the instant tillage (or other) operation, and so on. Agricultural residue on a field is characterized at least by a percent coverage (i.e., a percentage of a given area of ground that is covered by residue) and a characteristic residue size or hardness (e.g., an average, nominal, or other measurement of the length, width or area of particular pieces of residue).
[0032] FIG. 1 illustrates a system 100 for agricultural residues (interchangeably referred to as residues, hereinafter) detection and reporting in an agricultural field (interchangeably referred to as field, hereinafter). The system 100 can include one or more sensors 102 (collectively referred to as sensors 102 and individually referred to as a sensor 102) configured to acquire one or more images (interchangeably referred to as images, hereinafter) of the field, a location identifier 104 configured to detect location information of the field, and a control unit 106 operatively coupled to the sensors 102 and the location identifier 104.
[0033] In an embodiment, the sensors 102 can include any or a combination of an unmanned aerial vehicle (UAV), a camera, a grayscale camera, a high resolution camera, an infrared camera, a temperature sensor, a flame detector, and a ground scout.
[0034] In an embodiment, the UAV 102 can include an embedded camera to acquire images of the field, the UAV can be configured to capture images of a pre-defined area, for example one or two villages. Images captured by the UAV 102 can be transmitted to a control unit 106 and a server 108 through a communication unit 110. In another embodiment, the communication unit 110 can be configured to facilitate wireless Internet technology. Examples of such wireless Internet technology include GSM, Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like.
[0035] In addition, the communication unit 110 can be configured to facilitate short-range communication. For example, short-range communication can be supported using at least one of Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Wireless Universal Serial Bus), and the like.
[0036] In an embodiment, real-time streaming of information (i.e. captured images and videos) can be stored on the server 108, and concerned authorities can view the real-time streaming through a mobile computing device such as a phone. For example, the communication unit 110 can include a radio to receive data transmitted by modulating a radio frequency (RF) signal from a remote station (not shown). For example, the remote station (not shown) is part of a cellular telephone network, and the data can be transmitted according to the Long-Term Evolution (LTE) standard. The communication unit 110 can also transmit data to the remote station (not shown) to achieve bi-directional communications. However, other techniques for transmitting and receiving data may alternately be utilized. For example, the communication unit 110 can achieve bi-directional communications with the remote station (not shown) over Bluetooth or by utilizing a Wi-Fi standard. In addition, the images captured by the camera of the UAV can be downloaded via a wired connection, USB, etc., upon landing of the UAV.
[0037] In some embodiments, images of the field can be captured by a camera assembly handled by a ground scout or other ground-based imaging device. Moreover, the ground scout or ground-based imaging device may include any number and configuration of camera assemblies for capturing images of the field. For example, imaging devices can be installed in the field to capture images.
[0038] In an embodiment, the location identifier 104 can be configured to acquire location information of the field where the UAV is moving or capturing images. The location identifier 104 can be selected from a location sensor, a Global Positioning System (GPS) sensor, and a geolocation sensor, and may use GPS technology to determine real-time location information (latitude and longitude) of the images captured by the UAV 102. The detected location information can be communicated to the control unit 106 and to the server 108.
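As a sketch of how a GPS fix might accompany each captured frame before transmission to the control unit or server, the record below is purely illustrative; the specification fixes no data format, and every name here is an assumption:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GeotaggedFrame:
    """Pairs a captured frame with the GPS fix read at capture time.
    Field names are hypothetical; the specification fixes no format."""
    frame_id: str
    latitude: float   # decimal degrees, -90..90
    longitude: float  # decimal degrees, -180..180

def tag_frame(frame_id, latitude, longitude):
    # Reject obviously invalid fixes before they reach the server.
    if not (-90.0 <= latitude <= 90.0 and -180.0 <= longitude <= 180.0):
        raise ValueError("GPS fix out of range")
    return GeotaggedFrame(frame_id, latitude, longitude)

record = tag_frame("uav-frame-0001", 30.7333, 76.7794)
```

A frozen dataclass keeps the location immutably bound to the frame once tagged, which is one plausible way to preserve the evidentiary link between an image and where it was taken.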
[0039] In an embodiment, the control unit 106 can be operatively coupled with the sensors 102 and the location identifier 104. The control unit 106 can be configured to execute a set of instructions, stored in a memory, which, on execution, causes the system to receive the acquired one or more images from the sensors 102 and location information from the location identifier 104, and process the received one or more images to detect agricultural residues in the agricultural field. Further, the control unit 106 can generate a notification signal and transmit it to one or more mobile computing devices (interchangeably referred to as mobile computing devices, hereinafter) by a communication unit 110, and the notification signal can pertain to location information of the agricultural field where the residues are detected.
[0040] In an embodiment, the one or more mobile computing devices can include any or a combination of a desktop computer, tablet, personal digital assistant, portable media device, laptop, and smart phone, which can be associated with one or more entities such as the Ministry of Environment, the central government, a state government, and the agricultural department. The mobile computing device(s) can include any one of a web client or an application to facilitate communication and interaction between the entities and the system 100. In various embodiments, information communicated between the system 100 and the mobile computing device(s) can involve user-selected functions available through one or more user interfaces (UIs). The UIs may be specifically associated with the web client (e.g., a browser) or the application. Accordingly, during a communication session with the mobile computing device(s), the system 100 may provide the mobile computing device(s) with a set of machine-readable instructions that, when interpreted by the client device using the web client or the application, cause the client device to present the UI and transmit user input received through such UIs back to the system 100. As an example, the UIs provided to the mobile computing device(s) by the system 100 can allow entities to view real-time streaming of the agricultural field and information shared by the control unit.
[0041] In an embodiment, upon receiving the information of the stubble burning, the agricultural department can take action accordingly to reduce pollution. For example, a fine can be imposed on the person burning the stubble (i.e. crop residue).
[0042] Another aspect of the present disclosure relates to an unmanned aerial vehicle including one or more sensors 102 coupled to the unmanned aerial vehicle, and configured to acquire one or more images of an agricultural field, a location identifier 104 coupled to the unmanned aerial vehicle, and configured to detect location information of the agriculture field, and a control unit 106 operatively coupled to the sensors 102 and the location identifier 104.
[0043] In an embodiment, the control unit can be configured to receive instructions from the one or more mobile computing devices, and the instructions include information of the route to be covered by the unmanned aerial vehicle. The control unit 106 can be further configured to execute a set of instructions, stored in a memory, which, on execution, causes the UAV to receive the acquired one or more images from the sensors 102 and location information from the location identifier 104, and process the received one or more images to detect agricultural residues in the agricultural field. Further, the control unit 106 can generate a notification signal and transmit it to one or more mobile computing devices by a communication unit 110, and the notification signal can pertain to location information of the agricultural field where the residues are detected.
[0044] In an embodiment, a power source (not shown) can be provided to supply electricity to the sensors 102, the location identifier 104, and the control unit 106. The power source can provide DC power supply or AC power supply, and can include any or a combination of rechargeable battery, lithium (Li) ion cell, rechargeable cells, electrochemical cells, storage battery, Lithium Polymer, Lithium Ion, Nickel Cadmium, Nickel Hydride and secondary cell.
[0045] According to an embodiment of the present disclosure, as illustrated in FIG. 2, a control unit 106 can include one or more processor(s) 202. The one or more processor(s) 202 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 can be configured to fetch and execute computer-readable instructions stored in a memory 204 of the control unit 106. The memory 204 can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 204 can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0046] In an embodiment, the control unit 106 can also include an interface(s) 206. The interface(s) 206 can comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may facilitate communication of system 100. The interface(s) 206 may also provide a communication pathway for one or more components of the system 100. Examples of such components include, but are not limited to, processing engine(s) 208 and database 210.
[0047] In an embodiment, a processing engine(s) 208 can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 208. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 208 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processing engine(s) 208 may include a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 208. In such examples, the control unit 106 can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the control unit 106 and the processing resource. In other examples, the processing engine(s) 208 may be implemented by electronic circuitry. A database 210 can include data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 208.
[0048] In an embodiment, the processing engine(s) 208 can include an image processing unit 212, a matching unit 214, a signal generation unit 216, and other unit(s) 218. The other unit(s) 218 can implement functionalities that supplement applications or functions performed by the system 100 or the processing engine(s) 208.
[0049] It would be appreciated that the units being described are only exemplary units, and any other unit or sub-unit may be included as part of the system 100. These units too may be merged or divided into super-units or sub-units as may be configured.
[0050] In an embodiment, the image processing unit 212 can be configured to analyse the received images by applying machine learning techniques to extract burning stubble or burnt stubble images from the received images. In an exemplary embodiment, the control unit 106 can include machine learning techniques, such as computer vision, to analyse images received from the sensors 102. In an exemplary embodiment, the control unit 106 can use any of a variety of models, such as decision trees, linear regression models, logistic regression models, neural networks, classifiers, support vector machines, inductive logic programming, ensembles of models, genetic algorithms, Bayesian networks, etc., which can be trained using a variety of approaches, such as deep learning, association rules, inductive logic, clustering, maximum entropy classification, learning classification, etc. In some examples, the control unit 106 may use supervised learning. In some examples, the control unit 106 may use unsupervised learning to analyse the images to detect burning or burnt stubble.
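As a minimal sketch of such image analysis, the colour-threshold detector below stands in for a trained model: it flags a frame when the fraction of fire-coloured pixels exceeds a limit. The thresholds, function names, and synthetic frames are all assumptions for illustration, not part of the specification:

```python
import numpy as np

def detect_burning(image, fire_fraction_threshold=0.05):
    """Flag an RGB image (H x W x 3, uint8) as showing burning stubble
    when the fraction of 'fire-coloured' pixels (strong red, moderate
    green, weak blue) exceeds a threshold. Illustrative stand-in for a
    trained machine learning model; thresholds are assumed values."""
    r = image[..., 0].astype(int)
    g = image[..., 1].astype(int)
    b = image[..., 2].astype(int)
    fire_mask = (r > 180) & (g > 60) & (b < 100) & (r > g) & (g > b)
    fraction = float(fire_mask.mean())
    return fraction > fire_fraction_threshold, fraction

# Synthetic frames: a plain green field, and the same field with a
# flame-coloured patch covering roughly a fifth of the frame.
field = np.full((64, 64, 3), (60, 120, 40), dtype=np.uint8)
burning = field.copy()
burning[:20, :40] = (230, 120, 30)
```

A deployed system would replace `detect_burning` with inference from one of the trained models enumerated above; the surrounding flow (acquire, classify, report) stays the same.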
[0051] In an embodiment, the burning stubble or burnt stubble images extracted from the received images can be matched by the matching unit 214; the matching unit may compare the extracted images with a set of stored burning stubble or burnt stubble images, and upon detection of burning stubble or burnt stubble beyond a limit (i.e. a large amount of residues), the signal generation unit 216 may generate a notification signal. The notification signal can be transmitted to one or more mobile computing devices, such as a smart phone, through a communication unit 110 such as Wi-Fi. For example, an authorised member of the agricultural department can receive the stubble burning information on his phone and may act accordingly, such as by imposing a fine on the person burning residues in the field.
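The limit check and notification signal described above can be sketched as follows; the payload field names, the JSON encoding, and the severity limit are hypothetical choices for illustration only:

```python
import json
import time

SEVERITY_LIMIT = 0.10  # assumed fraction of the frame covered by residue fire

def build_notification(frame_id, burn_fraction, latitude, longitude):
    """Generate a notification payload only when the detected burning
    exceeds the configured limit; otherwise return None. All field
    names are illustrative, not taken from the specification."""
    if burn_fraction <= SEVERITY_LIMIT:
        return None
    return json.dumps({
        "event": "stubble_burning_detected",
        "frame_id": frame_id,
        "burn_fraction": round(burn_fraction, 3),
        "location": {"lat": latitude, "lon": longitude},
        "timestamp": int(time.time()),
    })

msg = build_notification("uav-042", 0.27, 30.7333, 76.7794)
```

Returning `None` below the limit keeps minor or ambiguous detections from flooding the authority's device, while a serialized payload is what a communication unit such as Wi-Fi would actually transmit.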
[0052] In an embodiment, the captured images can be directly transmitted to a server 108 to enable live streaming of the agricultural field. The UAV can capture images and store them on the server 108, and the server 108 can be accessed by one or more entities, such as the agricultural department, the state government and the like, through the mobile computing device. The UAV can be controlled by the agricultural department by setting its route to move and capture images.
[0053] With reference to FIG. 3, a flowchart illustrates a method of working of the proposed system. At 302, images of an agricultural field can be acquired by the sensors 102 or the UAV. At 304, the acquired images can be transmitted to a server 108 and simultaneously transmitted to a control unit 106 (at 306) through a communication unit 110 such as Wi-Fi. At 308, the images can be processed to detect burning stubble or burnt stubble in the field, and at step 310, upon detection of burning or burnt stubble, the information can be transmitted to one or more mobile computing devices to notify a concerned authority, for example, the agricultural department.
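The sequence of steps in the flowchart can be sketched as a single detection cycle. Each stage is injected as a callable so the sketch stays independent of any particular hardware; all names here are illustrative assumptions:

```python
def run_detection_cycle(acquire_image, upload_to_server, detect_residue,
                        get_location, notify_authority):
    """One pass of the flow described with reference to FIG. 3.
    Stage names are hypothetical; each is supplied by the caller."""
    image = acquire_image()                  # step 302: acquire a field image
    upload_to_server(image)                  # steps 304/306: send to server and control unit
    if detect_residue(image):                # step 308: detect burning/burnt stubble
        notify_authority(get_location())     # step 310: notify the concerned authority
        return True
    return False

# Minimal dry run with stub stages standing in for real hardware.
sent = []
result = run_detection_cycle(
    acquire_image=lambda: "frame-1",
    upload_to_server=lambda img: None,
    detect_residue=lambda img: True,
    get_location=lambda: (30.7333, 76.7794),
    notify_authority=sent.append,
)
```

Passing the stages as callables mirrors the modular units of FIG. 2 (image processing unit, matching unit, signal generation unit) without committing to any one implementation.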
[0054] The above described features, configurations, effects, and the like are included in at least one of the embodiments of the present invention, and should not be limited to only one embodiment. In addition, the features, configurations, effects, and the like as illustrated in each embodiment may be implemented with regard to other embodiments as they are combined with one another or modified by those skilled in the art. Thus, content related to these combinations and modifications should be construed as including in the scope and spirit of the invention as disclosed in the accompanying claims.
[0055] Further, although the embodiments have been mainly described until now, they are just exemplary and do not limit the present invention. Thus, those skilled in the art to which the present invention pertains will know that various modifications and applications which have not been exemplified may be performed within a range which does not deviate from the essential characteristics of the embodiments. For instance, the constituent elements described in detail in the exemplary embodiments can be modified to be performed. Further, the differences related to such modifications and applications shall be construed to be included in the scope of the present invention specified in the attached claims.
[0056] The present invention encompasses various modifications to each of the examples and embodiments discussed herein. According to the invention, one or more features described above in one embodiment or example can be equally applied to another embodiment or example described above. The features of one or more embodiments or examples described above can be combined into each of the embodiments or examples described above. Any full or partial combination of one or more embodiment or examples of the invention is also part of the invention.
ADVANTAGES OF THE INVENTION
[0057] The proposed invention provides a system to detect burning of agricultural residue in agricultural field.
[0058] The proposed invention provides a system to report concerned authority upon detection of burning residue automatically.
[0059] The proposed invention provides a system to accurately detect burning of crop residue or burnt residue in an agricultural field.
[0060] The proposed invention provides a system which is easy to operate.
We Claim:
1. A system 100 for agricultural residue burning detection and reporting, the system comprising:
one or more sensors 102 configured to acquire one or more images of an agricultural field;
a location identifier 104 configured to detect location information of the agriculture field;
a control unit 106 operatively coupled to the one or more sensors 102 and the location identifier 104, and configured to execute a set of instructions, stored in a memory, which, on execution, causes the system to:
receive the acquired one or more images from the one or more sensors, and location information from the location identifier;
process the received one or more images to detect agricultural residue in the agricultural field; and
generate a notification signal and transmit it to one or more mobile computing devices by a communication unit 110, wherein the notification signal pertains to location information of the agricultural field where the agricultural residue is detected.
2. The system as claimed in claim 1, wherein the one or more sensors comprise any or a combination of an unmanned aerial vehicle, a camera, a temperature sensor, a flame detector, and a ground scout.
3. The system as claimed in claim 1, wherein the agricultural residue is at least one of burning stubble and burnt stubble.
4. The system as claimed in claim 1, wherein the one or more mobile computing devices comprise any or a combination of desktop computer, tablet, personal digital assistant, laptop, and smart phone.
5. The system as claimed in claim 1, wherein the acquired images are transmitted to a server 108, wherein the server is accessed by one or more entities to monitor the acquired images in real time.
6. The system as claimed in claim 1, wherein the communication unit comprises any or a combination of Wireless Fidelity (Wi-Fi), Worldwide Interoperability for Microwave Access (WiMAX), Bluetooth, Wireless LAN (WLAN), and Wireless USB (Wireless Universal Serial Bus).
7. An unmanned aerial vehicle comprising:
one or more sensors coupled to the unmanned aerial vehicle, and configured to acquire one or more images of an agricultural field;
a location identifier coupled to the unmanned aerial vehicle, and configured to detect location information of the agriculture field; and
a control unit operatively coupled to the one or more sensors and the location identifier, and configured to execute a set of instructions, stored in a memory, which, on execution, causes the unmanned aerial vehicle to:
receive the acquired one or more images from the one or more sensors, and location information from the location identifier;
process the received one or more images to detect crop residue in the agricultural field;
generate a notification signal and transmit it to one or more mobile computing devices by a communication unit, wherein the notification signal pertains to location information of the agricultural field where the residue is detected.
8. The unmanned aerial vehicle as claimed in claim 7, wherein the one or more sensors comprise any or a combination of a camera, a temperature sensor, and a flame detector.
9. The unmanned aerial vehicle as claimed in claim 7, wherein the residue is at least one of burning stubble and burnt stubble.
10. The unmanned aerial vehicle as claimed in claim 7, wherein the control unit is configured to receive instructions from the one or more mobile computing devices, wherein the instructions pertain to a route to be covered by the unmanned aerial vehicle.