Abstract: The present disclosure provides a system (100) for detection, tracking and elimination of locust swarm. The system includes one or more image capturing units (102) configured to detect locust swarm. The system includes one or more spray tanks (114) configured to store pesticides for killing detected locusts. The system includes one or more actuators (106) configured to provide motion to one or more mobile aerial vehicles coupled to the system, the one or more image capturing units and the one or more spray tanks. The system includes one or more keys (112) facilitating selection of one or more operations from a set of predefined functionalities of one or more processing units (108). The one or more processing units are configured to receive information from one or more sensors (104) and transmit information to one or more display units (110), the information pertaining to a set of attributes and the one or more operations.
FIELD OF THE DISCLOSURE
[0001] The present disclosure relates to the field of pest control. In particular, the present disclosure provides a system for detection, tracking and elimination of locust swarm.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present disclosure. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed disclosure, or that any publication specifically or implicitly referenced is prior art.
[0003] Locust swarms are a major threat to crops and agricultural land. Drone-based risk assessment and abatement of swarming pests have been disclosed in existing literature. However, the concerned literature does not disclose unsupervised automatic detection and locomotion to the infested region followed by spraying of pesticide without relying on pest emissions. Automatic detection of a target having specific attributes and a pesticide dispensing mechanism has also been discussed in other literature. However, detection of moving swarms of pests and of the direction of motion of a moving swarm has not been disclosed in the existing literature. Further literature discusses pest identification and trapping methods in crop fields, but the pests addressed may not occur in large populations or swarms. In the existing literature, detection of morphological details of the pest is the focus; the task is carried out under the supervision of a human agent, and the methods do not describe an automatic mode of operation.
[0004] Hence, there is a need in the art to develop a system that can automatically detect large populations of locusts, irrespective of their mobile or stationary behavior, using vision-based techniques. The proposed system can be operated in supervised and unsupervised modes. The proposed system can automatically locate and locomote to pest infested areas and dispense pesticide until the visually detected proportion of live pests falls below a predetermined threshold.
OBJECTS OF THE PRESENT DISCLOSURE
[0005] Some of the objects of the present disclosure, which at least one
embodiment herein satisfies are as listed herein below.
[0006] It is an object of the present disclosure to provide a system for
detection and tracking of locust swarm, the swarm being in any or a combination
of moving, stationary, living or dead states.
[0007] It is an object of the present disclosure to provide a system for
detection and tracking of locust swarm that enables one or more image capturing
units coupled to one or more mobile aerial vehicles to capture a set of image
signals from one or more predetermined regions.
[0008] It is an object of the present disclosure to provide a system for
elimination of locust swarm that enables one or more spray tanks coupled to the
one or more mobile aerial vehicles to store biopesticides adapted for killing the detected
locusts.
[0009] It is an object of the present disclosure to provide a system for
detection, tracking and elimination of locust swarm that enables one or more
actuators to provide motion to the one or more mobile aerial vehicles.
[0010] It is an object of the present disclosure to provide a system for
detection, tracking and elimination of locust swarm that enables one or more
actuators to facilitate movement of the one or more image capturing units.
[0011] It is an object of the present disclosure to provide a system for
detection, tracking and elimination of locust swarm that enables one or more
actuators to control flow of biopesticides stored in the one or more spray tanks.
[0012] It is an object of the present disclosure to provide a system for
detection, tracking and elimination of locust swarm that enables one or more
sensors coupled to the one or more mobile aerial vehicles to detect a first set of
attributes related to a set of predefined functionalities of the system.
[0013] It is an object of the present disclosure to provide a system for
detection, tracking and elimination of locust swarm that enables the one or more
image capturing units to determine a second set of attributes pertaining to the
locust swarm.
[0014] It is an object of the present disclosure to provide a system for detection, tracking and elimination of locust swarm that enables one or more keys to receive user inputs pertaining to selection of one or more operations pertaining to the set of predefined functionalities of the system.
[0015] It is an object of the present disclosure to provide a system for detection, tracking and elimination of locust swarm that enables one or more display units to display any or a combination of information pertaining to the set of predefined functionalities, status related to the one or more sensors, the one or more actuators, the one or more spray tanks and the one or more keys.
[0016] It is an object of the present disclosure to provide a system for detection, tracking and elimination of locust swarm that enables one or more processing units to receive one or more data packets from the one or more keys, the one or more sensors and the one or more image capturing units.
[0017] It is an object of the present disclosure to provide a system for detection, tracking and elimination of locust swarm that enables the one or more processing units to perform the set of operations and transmit one or more data packets to the one or more display units and activate the one or more actuators.
[0018] It is an object of the present disclosure to provide a system for detection, tracking and elimination of locust swarm that enables the one or more actuators coupled to the mobile aerial vehicle to facilitate translational and rotational motion of the mobile aerial vehicle along one or more axes.
SUMMARY
[0019] The present disclosure relates to the field of pest control. In particular, the present disclosure provides a system for detection, tracking and elimination of locust swarm.
[0020] An aspect of the present disclosure is to provide a system that may be configured to detect, track and eliminate locust swarm, the swarm being in any or a combination of moving, stationary, living or dead states.
[0021] In an aspect, the system may include one or more image capturing
units, one or more sensors, one or more actuators and one or more spray tanks
coupled to a mobile aerial vehicle.
[0022] In an aspect, the system may include one or more keys, one or more
display units and one or more processing units communicatively coupled with
each other and to the one or more image capturing units, the one or more sensors,
and the one or more actuators.
[0023] In an aspect, the system may enable the one or more image capturing
units to capture a set of image signals from one or more predetermined regions,
the set of image signals pertaining to detection and tracking of locust swarms.
[0024] In an aspect, the system may enable the one or more spray tanks to
store biopesticides adapted for killing the detected locusts.
[0025] In an aspect, the system may enable the one or more actuators to
provide motion to the one or more mobile aerial vehicles.
[0026] In an aspect, the system may enable the one or more actuators to
facilitate movement of the one or more image capturing units.
[0027] In an aspect, the system may enable the one or more actuators to
control flow of biopesticides stored in the one or more spray tanks.
[0028] In an aspect, the system may enable the one or more sensors to detect a
first set of attributes related to a set of predefined functionalities of the system.
[0029] In an aspect, the first set of attributes may pertain to the one or more
mobile aerial vehicles, the one or more spray tanks and a predefined
neighborhood of the system.
[0030] In an aspect, the system may enable the one or more image capturing
units to determine a second set of attributes pertaining to the locust swarm.
[0031] In an aspect, the system may enable the one or more keys to receive
user inputs pertaining to selection of one or more operations related to the set of
predefined functionalities of the system.
[0032] In an aspect, the one or more operations may correspond to activation
of the one or more actuators, the one or more sensors and the one or more image
capturing units.
[0033] In an aspect, the system may enable the one or more display units to
display any or a combination of information pertaining to the set of predefined
functionalities, status related to the one or more sensors, the one or more
actuators, the one or more spray tanks and the one or more keys.
[0034] In an aspect, the system may enable the one or more processing units
to receive one or more data packets from the one or more keys, the one or more
sensors and the one or more image capturing units.
[0035] In an aspect, the system may enable the one or more processing units
to perform the set of operations and transmit one or more data packets to the one
or more display units and activate the one or more actuators.
[0036] In an aspect, the system may enable the one or more actuators coupled
to the mobile aerial vehicle to facilitate translational and rotational motion of the
mobile aerial vehicle along one or more axes.
[0037] In an aspect, the system may include one or more power supply units
that may be configured to provide electric power.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[0038] The accompanying drawings are included to provide a further
understanding of the present disclosure, and are incorporated in and constitute a
part of this specification. The drawings illustrate exemplary embodiments of the
present disclosure and, together with the description, serve to explain the
principles of the present disclosure.
[0039] The diagrams described herein are for illustration only, which thus are
not limitations of the present disclosure, and wherein:
[0040] FIG. 1 illustrates exemplary block diagram (100) of the proposed
system for detection, tracking and elimination of locust swarm, to elaborate upon
its working in accordance with an embodiment of the present disclosure.
[0041] FIG. 2 illustrates exemplary functional components (200) of a
processing unit (108) of the proposed system for detection, tracking and
elimination of locust swarm, in accordance with an embodiment of the present
disclosure.
[0042] FIGs. 3A and 3B illustrate exemplary views (300) of the proposed system for detection, tracking and elimination of locust swarm, in accordance with an embodiment of the present disclosure.
[0043] FIG. 4 illustrates an exemplary computer system (400) to implement detection and tracking functionalities of the proposed system (100), in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[0044] In the following description, numerous specific details are set forth in
order to provide a thorough understanding of embodiments of the present
invention. It will be apparent to one skilled in the art that embodiments of the
present invention may be practiced without some of these specific details.
[0045] If the specification states a component or feature "may", "can",
"could", or "might" be included or have a characteristic, that particular component
or feature is not required to be included or have the characteristic.
[0046] As used in the description herein and throughout the claims that
follow, the meaning of "a," "an," and "the" includes plural reference unless the
context clearly dictates otherwise. Also, as used in the description herein, the
meaning of "in" includes "in" and "on" unless the context clearly dictates
otherwise.
[0047] While embodiments of the present invention have been illustrated and
described in the accompanying drawings, the embodiments are offered only in as
much detail as to clearly communicate the disclosure and are not intended to limit
the numerous equivalents, changes, variations, substitutions and modifications
falling within the spirit and scope of the present disclosure as defined by the
appended claims.
[0048] The present disclosure relates to the field of pest control. In particular,
the present disclosure provides a system for detection, tracking and elimination of
locust swarm.
[0049] FIG. 1 illustrates exemplary block diagram (100) of the proposed system for detection, tracking and elimination of locust swarm, to elaborate upon its working in accordance with an embodiment of the present disclosure.
[0050] In an embodiment, the system for detection, tracking and elimination of locust swarm (100) (interchangeably referred to as the system (100) hereinafter) includes one or more image capturing units (102) that may be configured to capture a set of image signals of the locust swarm. In an embodiment, the one or more image capturing units (102) may include a webcam, a CMOS camera, a thermal camera, an infrared camera, a red-green-blue-depth (RGBD) camera, an embedded camera and the like. The set of image signals may be in a format such as, but not limited to, Joint Photographic Experts Group (JPEG), Portable Network Graphics (PNG), Bitmap (BMP) and Tagged Image File Format (TIFF). The set of image signals may contain information pertaining to any or a combination of severity of locust attack, extent of damage to crops, locations of attack, direction of movement of the swarm, status of mobility of the swarm, concentration of locusts at one or more locations, and status of active infection.
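By way of a rough illustration of how concentration of locusts might be estimated from such image signals, the following sketch thresholds a grayscale frame; the fixed intensity threshold and the dark-pixel heuristic are assumptions for illustration only, and a practical system would use a trained detector rather than this simple rule:

```python
import numpy as np

def locust_concentration(frame: np.ndarray, threshold: int = 90) -> float:
    """Return the fraction of pixels darker than `threshold`.

    `frame` is an H x W grayscale image (uint8); dark clusters serve as a
    crude proxy for locust density in the predetermined region.
    """
    dark = frame < threshold          # boolean mask of "locust-like" pixels
    return float(dark.mean())         # fraction of dark pixels in the frame

# Example: a synthetic 4x4 frame with one dark 2x2 cluster
frame = np.full((4, 4), 200, dtype=np.uint8)
frame[:2, :2] = 10
print(locust_concentration(frame))    # 4 of 16 pixels are dark -> 0.25
```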
[0051] In an embodiment, the one or more image capturing units (102) may be coupled to one or more mobile aerial vehicles. By way of example, the one or more mobile aerial vehicles may include multi-rotor drones, fixed wing drones, single rotor helicopters, fixed wing hybrid vertical takeoff and landing vehicles, non-combat large drones, GPS drones, photography drones and the like.
[0052] In an embodiment, the system (100) may include one or more spray tanks (114) that may be coupled to the one or more mobile aerial vehicles. The one or more spray tanks (114) may be configured to store pesticides that may be dispensed on detection of a locust swarm by the one or more image capturing units (102). The pesticides may be in the form of any or a combination of liquid, emulsion, solution, chemical and bio-product. The one or more spray tanks (114) may be configured to store pesticides of predetermined quantities/volumes based on the payload handling capacity of the one or more mobile aerial vehicles.
[0053] In an embodiment, the system (100) may include one or more actuators (106) that may be coupled to the one or more mobile aerial vehicles. The one or
more actuators (106) may also be coupled to the one or more image capturing units (102) and the one or more spray tanks (114). In an embodiment, the one or more actuators (106) may be configured to facilitate any or a combination of translational and rotational motion of the one or more mobile aerial vehicles, pan-tilt movement of the one or more image capturing units (102), and flow control of the pesticide stored in the one or more spray tanks (114).
[0054] In an embodiment, the one or more actuators (106) coupled to the mobile aerial vehicle may pertain to translational motion of the one or more mobile aerial vehicles along the x, y and z axes and rotation about the vertical and horizontal axes. The one or more actuators (106) coupled to the one or more spray tanks (114) may pertain to control of any or a combination of pressure, volume, spread and repetition rate of dispensing pesticide. The one or more actuators (106) coupled to the one or more image capturing units (102) may pertain to any or a combination of pan and tilt movements of the base accommodating the one or more image capturing units (102). By way of example, the one or more actuators may include elements such as, but not limited to, electric motors, direct current motors, permanent magnet motors, stepper motors, servo motors, linear actuators, pressure regulators, spring-based pesticide dispensers, pneumatic actuators and hydraulic actuators.
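The pan-tilt and flow-control parameters described above can be pictured as simple command structures. The sketch below is illustrative only; the field names, units and the symmetric 90-degree mechanical limit are all assumptions rather than details taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class PanTiltCommand:
    pan_deg: float    # rotation of the camera base about the vertical axis
    tilt_deg: float   # rotation about the horizontal axis

@dataclass
class SprayCommand:
    pressure_kpa: float   # dispensing pressure
    volume_ml: float      # volume dispensed per burst
    repeat_hz: float      # repetition rate of dispensing

def clamp_pan_tilt(cmd: PanTiltCommand, limit: float = 90.0) -> PanTiltCommand:
    """Clamp pan/tilt angles to an assumed symmetric mechanical limit."""
    def clip(v: float) -> float:
        return max(-limit, min(limit, v))
    return PanTiltCommand(clip(cmd.pan_deg), clip(cmd.tilt_deg))

print(clamp_pan_tilt(PanTiltCommand(120.0, -30.0)))  # pan clipped to 90.0
```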
[0055] In an embodiment, the system may include one or more sensors (104) that may be coupled to the one or more mobile aerial vehicles. The one or more sensors (104) may be enabled to detect a first set of attributes pertaining to the one or more mobile aerial vehicles, the one or more spray tanks (114) and the environment of the system (100). In an exemplary embodiment, the first set of attributes may be related to a set of predefined functionalities of the system (100). The environment may correspond to a predetermined range of distance surrounding the system (100). By way of example, the one or more sensors (104) may be configured to detect obstacles in a predetermined neighborhood of the one or more mobile aerial vehicles. The first set of attributes may include non-limiting examples such as speed, global position, acceleration, heading, rates of roll, pitch,
yaw, distance from obstacles, level of pesticide in the one or more spray tanks and pressure setting of the pesticide dispenser.
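A minimal sketch of how the first set of attributes might be represented in software follows; the field names and units are assumptions for illustration, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class VehicleTelemetry:
    """One illustrative record of the 'first set of attributes'."""
    speed_mps: float              # ground speed
    position: tuple               # (latitude, longitude) from the GPS unit
    heading_deg: float            # current heading
    obstacle_distance_m: float    # nearest obstacle in the neighborhood
    pesticide_level_pct: float    # level of pesticide in the spray tank
    dispenser_pressure_kpa: float # current pressure setting of the dispenser

def obstacle_alert(t: VehicleTelemetry, min_clearance_m: float = 5.0) -> bool:
    """True when an obstacle lies inside the assumed minimum clearance."""
    return t.obstacle_distance_m < min_clearance_m

t = VehicleTelemetry(4.0, (26.9, 75.8), 180.0, 3.2, 62.0, 180.0)
print(obstacle_alert(t))  # True: 3.2 m is inside the 5.0 m clearance
```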
[0056] In an embodiment, the one or more sensors (104) may include range sensors, level sensors, position sensors, speed sensors, acceleration sensors and the like. By way of example, the one or more sensors may include a global positioning system (GPS) unit, odometers, lidars, infrared sensors, laser sensors, ultrasonic sensors, float sensors, magnetic sensors, pressure sensors, accelerometers, magnetometers, gyroscopes, tilt sensors and current sensors.
[0057] In an embodiment, the system (100) may include one or more keys (112) that may be configured to receive user inputs pertaining to selection of one or more operations from the set of predefined functionalities of the system (100). The one or more operations may correspond to activation of the one or more actuators (106), the one or more sensors (104) and the one or more image capturing units (102). In an embodiment, the system (100) may be configured to perform the one or more operations automatically. In another embodiment, the system (100) may be enabled to perform the one or more operations under a user's supervision and control. In an embodiment, the one or more keys (112) may include input interfaces such as, but not limited to, tact switches, push switches, slide switches, rotary switches, touch panels, touchpads, scroll switches, switches with touch sensitive housings and joysticks.
[0058] In an embodiment, the system (100) may include one or more display units (110) that may be configured to display any or a combination of information pertaining to the set of predefined functionalities associated with the first set of attributes, and status related to the one or more sensors (104), the one or more actuators (106), the one or more spray tanks (114) and the one or more keys (112). By way of example, the one or more display units may include liquid crystal displays (LCD), light emitting diode (LED) displays, flashing displays, scrolling displays, alphanumeric displays, computer monitors, smartphone displays, tablet PC displays and the like.
[0059] In an embodiment, the system (100) may include one or more processing units (108) that may be communicatively coupled to the one or more
sensors (104), the one or more image capturing units (102), the one or more actuators (106), the one or more keys (112) and the one or more display units (110). The one or more processing units (108) may be enabled to receive one or more input sets of data packets from the one or more sensors (104), the one or more image capturing units (102) and the one or more keys (112). The one or more processing units (108) may be configured to generate one or more output sets of data packets pertaining to the one or more operations corresponding to the set of predefined functionalities of the system (100) and transmit the one or more output sets of data packets to the one or more display units (110) and the one or more actuators (106).
[0060] In an embodiment, the system (100) may include a power supply unit that may be coupled to the one or more processing units (108), the one or more sensors (104), the one or more image capturing units (102), the one or more actuators (106), the one or more keys (112) and the one or more display units (110). The power supply unit may be configured to provide electric power to the system (100), the power supply unit including any or a combination of batteries, inverters and power lines. The electric power provided by the power supply unit may be of the form of any or a combination of direct current, alternating current, solar current and wind current. By way of example, the power supply unit may include batteries of the like but not limited to Lithium polymer, Lithium Ion, Vanadium Redox and Nickel Cadmium.
[0061] FIG. 2 illustrates exemplary functional components (200) of a processing unit (108) of the proposed system for detection, tracking and elimination of locust swarm, in accordance with an embodiment of the present disclosure.
[0062] In an embodiment, the one or more processing units (108) may include one or more processor(s) (202). The one or more processor(s) (202) may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) (202) may be configured to fetch and
execute computer-readable instructions stored in a memory (204) of the one or more processing units (108). The memory (204) may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory (204) may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0063] In an embodiment, the one or more processing units (108) may also comprise an interface(s) (206). The interface(s) (206) may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) (206) may facilitate communication of the one or more processing units (108) among themselves and with various devices coupled to the system (100), such as the one or more sensors (104), the one or more image capturing units (102), the one or more actuators (106), the one or more display units (110) and the one or more keys (112). The interface(s) (206) may also provide a communication pathway for one or more components of the processing units (108). Examples of such components include, but are not limited to, the memory (204) and the database (210).
[0064] In an embodiment, the processing engine(s) (208) of the one or more processing units (108) may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) (208). In examples described herein, such combinations of hardware and programming may be implemented in several different ways.
For example, the programming for the processing engine(s) (208) may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) (208) may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) (208). In such examples, the processing unit of the computing device (108) may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or
the machine-readable storage medium may be separate but accessible to the one or more processing units (108) and the processing resource. In other examples, the processing engine(s) (208) may be implemented by electronic circuitry.
[0065] In an embodiment, the interface (206) may be coupled to any or a combination of a wireless local area network (WLAN), a wide area network (WAN), wireless fidelity (Wi-Fi), worldwide interoperability for microwave access (WiMAX), a cellular communication network, the Internet, and the like. The communication network may be a wireless network, a wired network or a combination thereof that may be implemented as one of the different types of networks, such as an Intranet, a Local Area Network (LAN), a Wide Area Network (WAN), the Internet, and the like. Further, the communication network may either be a dedicated network or a shared network. The shared network may represent an association of the different types of networks that may use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP) and the like.
[0066] In an embodiment, the processing engine (208) may include a location determination unit (210) that may be configured to determine the location of the system (100) and the one or more mobile aerial vehicles coupled to the system (100). The location may be retrieved from the GPS unit of the one or more sensors (104). The location information may be received as an eighth set of data packets that may be in the form of a computer readable binary stream. The location information may be used to steer the one or more mobile aerial vehicles to one or more predefined regions or regions facing severe locust swarm attack, based on user inputs.
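The steering described above could, for instance, derive a target heading using the standard initial great-circle bearing formula between the vehicle's GPS fix and an infested region; the coordinates below are illustrative only:

```python
import math

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Initial bearing in degrees (0 = north, clockwise) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

# Target due east of the vehicle along the equator -> bearing of 90 degrees
print(round(bearing_deg(0.0, 0.0, 0.0, 1.0), 1))  # 90.0
```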
[0067] In an embodiment, the processing engine (208) may include an image acquisition unit (210) that may be enabled to receive a fourth set of data packets pertaining to the set of image signals related to the locust swarm. The fourth set of data packets may be obtained from the one or more image capturing units (102).
[0068] In an embodiment, the processing engine (208) may include an extraction unit (210) that may be used to extract a fifth set of data packets from
the fourth set of data packets, the fourth set of data packets corresponding to the set of image signals and received from the one or more image capturing units (102). The extraction unit (210) may also facilitate extraction of a ninth set of data packets from the eighth set of data packets, the ninth set of data packets pertaining to the first set of attributes related to the set of predefined functionalities of the system (100) and received from the one or more sensors (104). The extraction unit may also facilitate extraction of a second set of data packets from the first set of data packets, the first set of data packets pertaining to a selection of the one or more operations pertaining to the set of predefined functionalities of the system (100) and received from the one or more keys (112).
[0069] In an embodiment, the processing engine (208) may include a comparison unit (210) that may be used to compare the second, fifth and ninth sets of data packets correspondingly with the third, sixth and tenth sets of data packets. The third set of data packets may pertain to the set of predefined functionalities of the system (100). The sixth set of data packets may pertain to threshold values of a second set of attributes related to the locust swarm. The tenth set of data packets may pertain to threshold values of the first set of attributes. In an embodiment, the third, sixth and tenth sets of data packets may be stored in a database (224) operatively coupled to the one or more processors (202). In an embodiment, the second, third, fifth, sixth, ninth and tenth sets of data packets may be in the form of computer readable binary streams.
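The comparison of extracted attribute values against stored thresholds can be pictured as follows; the attribute names and threshold values are illustrative assumptions, not part of the disclosure:

```python
def compare_with_thresholds(attributes: dict, thresholds: dict) -> dict:
    """Return {name: value} for every attribute exceeding its stored threshold."""
    return {k: v for k, v in attributes.items()
            if k in thresholds and v > thresholds[k]}

# Illustrative extracted values vs. thresholds held in the database
attrs = {"swarm_concentration": 0.42, "obstacle_proximity": 0.10}
limits = {"swarm_concentration": 0.30, "obstacle_proximity": 0.25}
print(compare_with_thresholds(attrs, limits))  # {'swarm_concentration': 0.42}
```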
[0070] In an embodiment, the second set of attributes of the locust swarm may include any or a combination of severity of locust attack, extent of damage to crops, locations of attack, direction of movement of the swarm, status of mobility of the swarm, concentration of locusts at one or more locations, status of active infection and the like.
[0071] In an embodiment, the first set of attributes detected by the one or more sensors (104) may include any or a combination of the location of the mobile aerial vehicle, obstacles in the path of motion of the mobile aerial vehicle, the amount of pesticide remaining in the spray tank, the current pressure setting of the pesticide sprayer and the like.
[0072] In an embodiment, the processing engine (208) may include an image analysis unit (210) that may be configured to generate a seventh set of data packets upon comparing the fifth and the sixth sets of data packets. The seventh set of data packets may pertain to the one or more operations related to the set of functionalities of the system (100), the set of functionalities corresponding to the second set of attributes. By way of example, the seventh set of data packets may pertain to determination of the proportion of dead and alive locusts, the direction of movement of the swarm, the concentration of the locust swarm in a predefined region, the position of a stationary locust swarm and the like.
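One such analysis output, the proportion of live locusts and the decision whether to keep dispensing pesticide, could be sketched as follows. The counts and the 0.1 threshold are purely illustrative; in practice the counts would come from a visual detector:

```python
def live_proportion(alive: int, dead: int) -> float:
    """Fraction of detected locusts that are alive; 0.0 if none detected."""
    total = alive + dead
    return alive / total if total else 0.0

def continue_spraying(alive: int, dead: int, threshold: float = 0.1) -> bool:
    """Dispense until the visually detected live proportion falls below threshold."""
    return live_proportion(alive, dead) >= threshold

print(continue_spraying(40, 60))  # 0.4 live -> True, keep dispensing
print(continue_spraying(5, 95))   # 0.05 live -> False, stop
```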
[0073] In an embodiment, the processing engine (208) may include an instruction unit (210). In an embodiment, based on the third, seventh and the eleventh sets of data packets, the one or more actuators (106), the one or more sensors (104) and the one or more display units (110) may be activated by the instruction unit (210). The one or more operations related to activation of the one or more actuators (106) may include translational and rotational locomotion of the one or more mobile aerial vehicles from the current position to the infested region, avoidance of obstacles present within a predetermined surrounding of the one or more mobile aerial vehicles, control of flow of pesticides from the pesticide dispensers coupled to the one or more spray tanks (114), pan and tilt movement of the one or more image capturing units (102) and the like. The processing engine (208) may be enabled to generate a twelfth set of data packets from the third, seventh and the eleventh sets of data packets. The twelfth set of data packets may be in computer readable binary format.
[0074] In an embodiment, the instruction unit (210) may receive a third set of data packets pertaining to the first set of data packets received from the one or more keys (112). The first set of data packets may be electrical signals in computer readable format. The third set of data packets may pertain to selection of the one or more operations from the set of functionalities of the system (100), the selection being made by the user. In another embodiment, the instruction unit (210) may be enabled to automatically generate the third set of data packets without user inputs. In an embodiment, the one or more processors (202) may
generate an eleventh set of data packets upon comparison of the ninth and the tenth set of data packets. The eleventh set of data packets may be based on the current status of the one or more sensors (104). The eleventh set of data packets may be associated with locomotion, image capturing and pesticide dispensing. In an embodiment, the instruction unit (210) may be configured to transmit any or a combination or a fraction of the third, seventh, eleventh and the twelfth set of data packets to the one or more display units.
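The instruction unit's mapping from selected operations to component activation can be pictured as a small dispatch table; the operation names below are assumptions for illustration, not terms defined in the disclosure:

```python
def build_activation(ops: set) -> dict:
    """Map requested operations to activation flags for system components."""
    return {
        # actuators drive locomotion, spraying, camera pan-tilt, avoidance
        "actuators": bool(ops & {"locomote", "spray", "pan_tilt", "avoid_obstacle"}),
        # sensors are needed for navigation and obstacle avoidance
        "sensors": bool(ops & {"locomote", "avoid_obstacle"}),
        # any requested operation updates the display units
        "display": bool(ops),
    }

print(build_activation({"spray"}))
# {'actuators': True, 'sensors': False, 'display': True}
```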
[0075] In an embodiment, the processing engine (208) may include other units that may be configured to implement functionalities that supplement actions performed by the one or more processors (202) of the processing unit (108). In an exemplary embodiment, such actions may include noise removal from the images captured by the one or more image capturing units (102), auto-calibration of the one or more sensors (104), detection of malfunctioning components and corresponding generation of alarm signals, and the like.
[0076] FIGs. 3A-3B illustrate exemplary views (300) of the proposed system for detection, tracking and elimination of locust swarm, in accordance with an embodiment of the present disclosure.
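The noise-removal functionality mentioned in paragraph [0075] is commonly realised with a median filter, which suppresses salt-and-pepper noise in captured images. The sketch below is purely illustrative; the function name and sample values are assumptions, not part of the disclosure.

```python
import statistics

def median_denoise(img, k=1):
    """Apply a (2k+1) x (2k+1) median filter to a 2D pixel grid.

    Median filtering suppresses isolated outlier pixels (salt-and-pepper
    noise) while largely preserving edges; a hypothetical illustration of
    the 'noise removal' supplementary action mentioned above.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            window = [img[j][i]
                      for j in range(max(0, y - k), min(h, y + k + 1))
                      for i in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = statistics.median(window)
    return out

noisy = [[10, 10, 10],
         [10, 255, 10],   # single corrupted ("hot") pixel
         [10, 10, 10]]
clean = median_denoise(noisy)
# clean[1][1] == 10: the outlier is replaced by the neighbourhood median
```

In practice such filtering would run on the full image frames before swarm-attribute extraction; an optimised library routine would replace this pure-Python loop.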
[0077] In an embodiment, the system (100) for detection, tracking and elimination of locust swarms may include one or more image capturing units (102) configured to detect a locust swarm. The system may include one or more spray tanks (114) configured to store pesticides for killing detected locusts. The system (100) may include one or more actuators (106A-106G) configured to provide motion to one or more mobile aerial vehicles (drones) coupled to the system (100), the one or more image capturing units (102) and the one or more spray tanks (114). The system may include one or more processing units (not shown) that may be configured to perform one or more operations pertaining to a set of predefined functionalities. The system may include one or more keys (not shown) facilitating selection of the one or more operations. The system may include one or more sensors (104A-104C) and one or more display units (not shown). The one or more processing units may be configured to receive a first set of information from the one or more sensors and transmit a second set of information to the one or more display units. The first and the second sets of information may pertain to a predefined set of attributes and the one or more operations performed by the one or more processing units.
[0078] FIG. 4 illustrates an exemplary computer system (400) to
implement detection and tracking functionalities of the proposed system (100), in accordance with embodiments of the present disclosure.
[0079] In an illustrative embodiment of FIG. 4, a computer system may include an external storage device (410), a bus (420), a main memory (430), a read only memory (440), a mass storage device (450), a communication port (460), and a processor (470). A person skilled in the art may appreciate that the computer system may include more than one processor and communication port. Examples of the processor (470) may include, but are not limited to, an Intel® Itanium® or Itanium 2 processor(s), an AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™ system-on-a-chip processors or other future processors. The processor (470) may include various modules associated with embodiments of the present invention. The communication port (460) may be any of an RS-232 port for use with a modem-based dial-up connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. The communication port (460) may be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computer system connects.
[0080] In an embodiment, the memory (430) may be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. The read only memory (440) may be any static storage device(s), e.g., but not limited to, Programmable Read Only Memory (PROM) chips for storing static information, e.g., start-up or BIOS instructions for the processor (470). The mass storage (450) may be any current or future mass storage solution, which may be used to store information and/or instructions. Exemplary mass storage solutions may include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or FireWire interfaces), e.g., those available from Seagate (e.g., the Seagate Barracuda 7102 family) or Hitachi (e.g., the Hitachi Deskstar 7K1000), one or more optical discs, or Redundant Array of Independent Disks (RAID) storage, e.g., an array of disks (e.g., SATA arrays), available from various vendors including Dot Hill Systems Corp., LaCie, Nexsan Technologies, Inc. and Enhance Technology, Inc.
[0081] In an embodiment, the bus (420) may enable the processor(s) (470) to communicatively couple with the memory, storage and other blocks. The bus (420) may be, e.g., a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), USB or the like, for connecting expansion cards, drives and other subsystems, as well as other buses, such as a front-side bus (FSB), which may connect the processor (470) to the software system.
[0082] Optionally, operator and administrative interfaces, e.g., a display, a keyboard, and a cursor control device, may also be coupled to the bus (420) to support direct operator interaction with the computer system. Other operator and administrative interfaces may be provided through network connections connected through the communication port (460). The external storage device (410) may be any kind of external hard drive, floppy drive, IOMEGA® Zip Drive, Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), or Digital Video Disk - Read Only Memory (DVD-ROM). The components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.
[0083] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document, the terms "coupled to" and "coupled with" are also used euphemistically to mean "communicatively coupled with" over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices.
[0084] The terms, descriptions and figures used herein are set forth by way of illustration only. Many variations are possible within the spirit and scope of the subject matter, which is intended to be defined by the following claims and their equivalents in which all terms are meant in their broadest reasonable sense unless otherwise indicated.
[0085] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE INVENTION
[0086] The present disclosure provides for a system for detection and tracking of locust swarm, the swarm being in any or a combination of moving, stationary, living or dead states.
[0087] The present disclosure provides for a system for detection and tracking
of locust swarm that enables one or more image capturing units coupled to one or
more aerial mobile vehicles to capture a set of image signals from one or more
predetermined regions.
[0088] The present disclosure provides for a system for elimination of locust
swarm that enables one or more spray tanks coupled to the one or more aerial
vehicles to store biopesticides adapted for killing the detected locusts.
[0089] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables one or more actuators to provide
motion to the one or more mobile aerial vehicles.
[0090] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables one or more actuators to facilitate
movement of the one or more image capturing units.
[0091] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables one or more actuators to control flow
of biopesticides stored in the one or more spray tanks.
[0092] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables one or more sensors coupled to the
one or more mobile aerial vehicles to detect a first set of attributes related to a set
of predefined functionalities of the system.
[0093] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables the one or more image capturing
units to determine a second set of attributes pertaining to the locust swarm.
[0094] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables one or more keys to receive user
inputs pertaining to selection of one or more operations pertaining to the set of
predefined functionalities of the system.
[0095] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables one or more display units to display
any or a combination of information pertaining to the set of predefined
functionalities, status related to the one or more sensors, the one or more
actuators, the one or more spray tanks and the one or more keys.
[0096] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables one or more processing units to
receive one or more data packets from the one or more keys, the one or more
sensors and the one or more image capturing units.
[0097] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables the one or more processing units to
perform the set of operations and transmit one or more data packets to the one or
more display units and activate the one or more actuators.
[0098] The present disclosure provides for a system for detection, tracking
and elimination of locust swarm that enables the one or more actuators coupled to
the mobile aerial vehicle to facilitate translational and rotational motion of the
mobile aerial vehicle along one or more axes.
We Claim:
1. A system (100) for detection, tracking and elimination of locust swarms, the system comprising:
one or more spray tanks (114), coupled to one or more mobile aerial vehicles, wherein the one or more spray tanks (114) is configured to store pesticide, wherein the pesticide is in the form of any or a combination of liquid, emulsion, solution, chemical and bioproduct;
one or more image capturing units (102), coupled to the one or more mobile aerial vehicles and enabled to capture a set of image signals pertaining to detection and tracking of locust swarms, wherein the set of image signals are facilitated to be collected from one or more predetermined regions based on user inputs;
one or more sensors (104), coupled to said one or more mobile aerial vehicles and enabled to determine a first set of attributes pertaining to the one or more mobile aerial vehicles, the one or more spray tanks (114) and environment of the system (100), wherein the first set of attributes are related to a set of predefined functionalities of the system (100) and wherein the environment corresponds to a predetermined range of distance surrounding the system (100);
one or more actuators (106), coupled to said one or more mobile aerial vehicles, the one or more image capturing units and the one or more spray tanks and configured to facilitate any or a combination of motion of the one or more mobile aerial vehicles, movement of the one or more image capturing units (102), and flow control of the pesticide stored in the one or more spray tanks (114);
one or more keys (112), adapted to receive user inputs pertaining to selection of one or more operations from the set of predefined functionalities of the system (100), wherein the one or more operations correspond to activation of the one or more actuators (106), the one or more sensors (104) and the one or more image capturing units (102);
one or more display units (110), configured to display any or a combination of information pertaining to the set of predefined functionalities associated with the first set of attributes, status related to the one or more sensors (104), the one or more actuators (106), the one or more spray tanks (114) and the one or more keys
(112);
one or more processing units (108), communicatively coupled to the one or more sensors (104), the one or more image capturing units (102), the one or more actuators (106), the one or more keys (112) and the one or more display units (110), wherein the processing unit comprises one or more processors associated with a memory, the memory storing instructions executable by the one or more processors and configured to:
receive a first set of data packets from the one or more keys (112), wherein the first set of data packets pertain to selection of one or more operations related to the set of predefined functionalities of the system (100);
compare the first set of data packets with a second set of data packets and correspondingly generate a third set of data packets, wherein the second set of data packets pertain to the set of predefined functionalities of the system (100) and wherein the second set of data packets are stored in a database, operatively coupled to the one or more processors;
receive a fourth set of data packets from the one or more image capturing units (102) and correspondingly extract a fifth set of data packets, wherein the fourth set of data packets correspond to the set of image signals;
compare the extracted fifth set of data packets with a sixth set of data packets and correspondingly generate a seventh set of data packets, wherein the sixth set of data packets pertain to threshold values of a second set of attributes related to the locust swarm and wherein the sixth set of data packets are stored in the database;
receive an eighth set of data packets from the one or more sensors (104) and correspondingly extract a ninth set of data packets, wherein the ninth set of data packets pertain to the first set of attributes;
compare the ninth set of data packets with a tenth set of data packets and correspondingly generate an eleventh set of data packets, wherein the tenth set of data packets pertain to threshold values of the first set of attributes and wherein the tenth set of data packets are stored in the database;
generate a twelfth set of data packets using a Machine Learning based processing engine, upon receiving the third, seventh and the eleventh set of data packets, wherein the twelfth set of data packets pertain to the set of operations performed by the system (100);
transmit any or a fraction or a combination of the third, seventh, eleventh and the twelfth set of data packets to the one or more display units (110).
2. The system (100) as claimed in claim 1, wherein the fifth set of data
packets pertain to the second set of attributes of the locust swarm,
including any or a combination of severity of locust attack, extent of
damages to crops, locations of attack, direction of movement of the swarm, status of mobility of the swarm, concentration of locusts at one or more locations, and status of active infection.
3. The system (100) as claimed in claim 1, wherein the first set of attributes detected by the one or more sensors (104) include any or a combination of location of the mobile aerial vehicle, obstacles in path of motion of the mobile aerial vehicle, amount of pesticide remaining in the spray tank, and current pressure setting of the pesticide dispenser.
4. The system (100) as claimed in claim 1, wherein activation of the one or more actuators (106) coupled to the mobile aerial vehicle pertains to translational motion of the mobile aerial vehicle along the x, y and z axes and rotation about the vertical and horizontal axes.
5. The system (100) as claimed in claim 1, wherein activation of the one or more actuators (106) coupled to the one or more spray tanks (114) pertain to control of any or a combination of pressure, volume, spread and repetition rate of dispensing pesticide.
6. The system (100) as claimed in claim 1, wherein activation of the one or more actuators (106) coupled to the one or more image capturing units (102) pertain to any or a combination of pan and tilt movements.
7. The system (100) as claimed in claim 1, wherein the set of operations associated with the twelfth set of data packets pertain to any or a combination of activation and control of the one or more actuators (106), the one or more sensors (104), the one or more image capturing units (102) and the one or more display units (110) and transmission of information among the one or more actuators, the one or more sensors, the one or more image capturing units and the one or more display units.
8. The system (100) as claimed in claim 1, wherein the one or more processing units (108), the one or more sensors (104), the one or more image capturing units (102), the one or more actuators (106), the one or more keys (112) and the one or more display units (110) are operatively coupled to one or more power supply units, wherein the one or more power supply units are configured to provide electric power to the system (100), wherein the one or more power supply units include any or a combination of batteries, inverters and power lines, and wherein the electric power is in the form of any or a combination of direct current, alternating current, solar power and wind power.
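For illustration only, the end-to-end data-packet flow recited in claim 1 (user keys, captured images and sensor readings each compared against stored reference sets, then combined into the twelfth set) can be sketched as follows. Every name, key and threshold below is a hypothetical assumption; the claims do not prescribe any particular implementation, and the Machine Learning based generation of the twelfth set is simplified here to a plain logical combination.

```python
# Hypothetical sketch of the claim-1 pipeline:
#   keys    -> 1st vs 2nd  -> 3rd   (selected operations)
#   images  -> 5th vs 6th  -> 7th   (swarm detection flags)
#   sensors -> 9th vs 10th -> 11th  (sensor status flags)
#   (3rd, 7th, 11th)       -> 12th  (operations actually performed)

def compare(extracted: dict, stored: dict) -> dict:
    """Generic compare step: flag each attribute meeting its stored threshold."""
    return {k: extracted.get(k, 0) >= v for k, v in stored.items()}

def pipeline(keys: dict, images: dict, sensors: dict, db: dict):
    third = compare(keys, db["functionalities"])
    seventh = compare(images, db["swarm_thresholds"])
    eleventh = compare(sensors, db["sensor_thresholds"])
    # twelfth set: perform an operation only when it is selected, a swarm
    # is detected and all sensor statuses are nominal
    twelfth = {op: sel and all(seventh.values()) and all(eleventh.values())
               for op, sel in third.items()}
    return third, seventh, eleventh, twelfth

db = {"functionalities": {"spray": 1},
      "swarm_thresholds": {"locust_density": 50},
      "sensor_thresholds": {"pesticide_level_l": 0.5}}
_, _, _, twelfth = pipeline({"spray": 1}, {"locust_density": 80},
                            {"pesticide_level_l": 0.9}, db)
# twelfth == {'spray': True}
```

Any of the three comparison stages failing (no swarm detected, low pesticide, no user selection in manual mode) would suppress the corresponding operation in this simplified model.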