Abstract: The present disclosure provides a pest trapper drone apparatus. The disclosed apparatus can include a camera for capturing at least one image of an area in the vicinity of the apparatus, and a control unit operatively coupled with the camera. The control unit is configured to extract one or more attributes from the captured at least one image, wherein the one or more attributes pertain to information associated with a pest; select a first set of attributes from the extracted one or more attributes to determine the pest in the captured image, and a second set of attributes from the extracted one or more attributes to determine the position of the pest on a surface; and generate one or more control signals for navigating the apparatus from an initial position to a position in the vicinity of the detected pest to facilitate a trapping operation by the apparatus.
The present invention relates to pest trapping apparatuses. More particularly,
the present invention relates to drones that are capable of trapping pests.
BACKGROUND
[002] Background description includes information that may be useful in
understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the present invention, or that any publication specifically or implicitly referenced is prior art.
[003] Pests are living organisms that cause damage to human beings, human habitats, plants, animals, and livestock. Pests trouble farmers by parasitizing livestock or feeding on crops, as with the boll weevil on cotton and the codling moth on apples, and pests carry harmful germs into human structures. For example, rats spread plague, ticks are carriers of Lyme disease, mosquitoes transmit malaria, and lizards carry germs and bacteria such as salmonella that can cause serious illness in people. In a few cases, severe infection may cause death, as it can spread from the intestines to the bloodstream and then to other body parts. Eatables and beverages become contaminated, even poisonous, when pests move over them or accidentally fall into liquids such as water or milk; when consumed, these may cause loss of human life. Pests at home can be hazardous and create panic, especially among women and children, and may lead to severe injuries when people collide with objects while running away in anxiety and fear. Further, a pest can bite and cause serious skin infection when one tries to capture it manually.
[004] Traditionally, traps for individual pests were used to capture a specified pest and directly reduce its population, typically using food, chemical attractants, or visual lures such as bright light to attract the pest without causing harm and with minimal human intervention. Pest traps are sometimes used in pest management programs instead of pesticides, but are often deployed according to seasonal and distributional patterns of pest occurrence. Trap types include light traps, adhesive traps, flying insect traps, terrestrial arthropod traps, aquatic arthropod traps, and kill traps. For example, flies are attracted by protein, tephritid flies are effectively attracted by synthetic attractants, and mosquitoes are attracted by floral fragrances, carbon dioxide, bright colours, fruity fragrances, lactic acid, warmth, and moisture.
[005] One prior-art technique attempted to solve the pest trapping problem with a smell-sensor-based pest trapper. A sensor senses the smell of the pest and transmits a signal to a control unit, which compares the reading with predefined environmental readings and signals a trapper to capture the pest. A major disadvantage of that invention was that the disclosed trapper could not climb high walls, where pests such as mosquitoes and lizards are generally found.
[006] Another existing technique discloses a trap with an electric power supply. The trap, which comprises an electric power supply and an electronic control unit and is configured to initiate test activation, activates automatically according to a pre-set timer and can be programmed according to instructions received from a user. This trap regularly needs human intervention, as the user must provide the instructions to be performed by the trap, making the disclosed trap time-consuming to operate. Furthermore, there is a need to improve the operational availability and service life of components used in the disclosed traps and dispensing devices.
[007] There is, therefore, a need in the art for a pest trapper drone apparatus that overcomes the above-mentioned and other limitations of existing solutions and utilizes techniques that are robust, accurate, fast, efficient, cost-effective, and simple.
OBJECTS OF THE PRESENT DISCLOSURE
[008] It is a general object of the present invention to provide a pest trap drone
apparatus that is portable and convenient to handle.
[009] It is another object of the present invention to provide a pest trap drone
apparatus that does not require any human intervention.
[0010] It is yet another object of the present invention to provide a pest trap drone
apparatus with automatic detection and capturing of pests.
[0011] It is yet another object of the present invention to provide a pest trap drone
apparatus with an efficient and cost-effective solution.
SUMMARY
[0012] The present invention is related to drones. More particularly, the present
invention is related to drones that are capable of trapping pests.
[0013] According to an aspect, a pest trapper drone apparatus includes a camera for capturing at least one image of an area in the vicinity of the apparatus, and a control unit operatively coupled with the camera. The control unit is configured to: extract one or more attributes from the captured at least one image, wherein the one or more attributes pertain to information associated with a pest; select a first set of attributes from the extracted one or more attributes to determine the pest in the captured image, and a second set of attributes from the extracted one or more attributes to determine the position of the pest on a surface; and generate one or more control signals for navigating the apparatus from an initial position to a position in the vicinity of the detected pest to facilitate a trapping operation by the apparatus.
[0014] In an aspect, a laser device is operatively configured with the apparatus such that, based on the second set of attributes, the laser device releases a high-intensity green laser that, in reduced light conditions, incapacitates the pest.
[0015] In an aspect, a moveable arm is configured at a predefined position of the apparatus, wherein, based on the second set of attributes, the moveable arm elongates and thereby traps the pest.
[0016] In an aspect, the top end of the moveable arm is fitted with a trapper for capturing the pest without harming it.
[0017] In an aspect, an electromagnetic platform is coupled to a pre-defined position on the ceiling, such that the apparatus rests at its initial position on the electromagnetic platform.
[0018] In an aspect, the electromagnetic platform comprises a proximity sensor operatively coupled with the control unit to detect the position of the apparatus and generate a signal to start the current supply, such that the apparatus comes back to the initial position.
[0019] In an aspect, a bin positioned proximally to the electromagnetic platform enables the moveable arm to dispose of the pest into the bin.
[0020] In an aspect, the bin comprises a sensor operatively coupled with the control unit to detect the moveable arm and generate a signal that opens the bin, such that the trapper disposes of the pest inside the bin.
[0021] In an aspect, the pest can be any or a combination of a lizard, mosquito, ant, and rat.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The accompanying drawings are included to provide a further understanding
of the present disclosure, and are incorporated in and constitute a part of this specification.
The drawings illustrate exemplary embodiments of the present disclosure and, together with
the description, serve to explain the principles of the present disclosure. The diagrams are for
illustration only, which thus is not a limitation of the present disclosure.
[0023] FIG. 1A shows an image illustrating a perspective view of a pest trap drone
apparatus, in accordance with an exemplary embodiment of the present invention.
[0024] FIG. 1B shows an image providing a perspective view of the electromagnetic platform and the bin, in accordance with an exemplary embodiment of the present invention.
DETAILED DESCRIPTION
[0025] The following is a detailed description of embodiments of the disclosure
depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0026] In the following description, numerous specific details are set forth in order to
provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practised without some of these specific details.
[0027] Embodiments of the present invention include various steps, which will be
described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, and firmware and/or by human operators.
[0028] If the specification states a component or feature "may", "can", "could", or
"might" be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0029] As used in the description herein and throughout the claims that follow, the
meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.
[0030] Exemplary embodiments will now be described more fully hereinafter with
reference to the accompanying drawings, in which exemplary embodiments are shown. These exemplary embodiments are provided only for illustrative purposes and so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. The invention disclosed may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Various modifications will be readily apparent to persons skilled in the art. The general
principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Moreover, all statements herein reciting embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Also, the terminology and phraseology used herein are for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For the purpose of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention.
[0031] Thus, for example, it will be appreciated by those of ordinary skill in the art
that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named element.
[0032] Systems depicted in some of the figures may be provided in various
configurations. In some embodiments, the systems may be configured as a distributed system where one or more components of the system are distributed across one or more networks in a cloud computing system.
[0033] Each of the appended claims defines a separate invention, which for
infringement purposes is recognized as including equivalents to the various elements or limitations specified in the claims. Depending on the context, all references below to the "invention" may in some cases refer to certain specific embodiments only. In other cases, it
will be recognized that references to the "invention" will refer to subject matter recited in one
or more, but not necessarily all, of the claims.
[0034] All methods described herein may be performed in any suitable order unless
otherwise indicated herein or otherwise clearly contradicted by context. The use of any and
all examples, or exemplary language (e.g., "such as") provided with respect to certain
embodiments herein is intended merely to better illuminate the invention and does not pose a
limitation on the scope of the invention otherwise claimed. No language in the specification
should be construed as indicating any non-claimed element essential to the practice of the
invention.
[0035] Various terms as used herein are shown below. To the extent a term used in a
claim is not defined below, it should be given the broadest definition persons in the pertinent
art have given that term as reflected in printed publications and issued patents at the time of
filing.
[0036] The present invention is related to drones. More particularly, the present
invention is related to drones that are capable of trapping pests.
[0037] According to an aspect, a pest trapper drone apparatus includes a camera for capturing at least one image of an area in the vicinity of the apparatus, and a control unit operatively coupled with the camera. The control unit is configured to: extract one or more attributes from the captured at least one image, wherein the one or more attributes pertain to information associated with a pest; select a first set of attributes from the extracted one or more attributes to determine the pest in the captured image, and a second set of attributes from the extracted one or more attributes to determine the position of the pest on a surface; and generate one or more control signals for navigating the apparatus from an initial position to a position in the vicinity of the detected pest to facilitate a trapping operation by the apparatus.
[0038] In an aspect, a laser device is operatively configured with the apparatus such that, based on the second set of attributes, the laser device releases a high-intensity green laser that, in reduced light conditions, incapacitates the pest.
[0039] In an aspect, a moveable arm is configured at a predefined position of the apparatus, wherein, based on the second set of attributes, the moveable arm elongates and thereby traps the pest.
[0040] In an aspect, the top end of the moveable arm is fitted with a trapper for capturing the pest without harming it.
[0041] In an aspect, an electromagnetic platform is coupled to a pre-defined position on the ceiling, such that the apparatus rests at its initial position on the electromagnetic platform.
[0042] In an aspect, the electromagnetic platform comprises a proximity sensor operatively coupled with the control unit to detect the position of the apparatus, and the control unit generates a signal to start the current supply such that the apparatus comes back to the initial position.
[0043] In an aspect, a bin positioned proximally to the electromagnetic platform enables the moveable arm to dispose of the pest into the bin.
[0044] In an aspect, the bin comprises a sensor operatively coupled with the control unit to detect the moveable arm and generate a signal that opens the bin, such that the trapper disposes of the pest inside the bin.
[0045] In an aspect, the pest can be any or a combination of a lizard, mosquito, ant, and rat.
[0046] FIG. 1A shows an image illustrating a perspective view of a pest trap drone
apparatus, in accordance with an exemplary embodiment of the present invention.
[0047] In an embodiment, the pest trapper drone apparatus 100 includes a drone 102. The drone 102 is an unmanned aerial vehicle (UAV) whose body is supported by an iron rod stanchion 104, and the drone 102 is configured with an electromagnetic platform 124 at an initial position. The drone 102 includes a camera 106, a moveable arm 110, a laser device 108, and a control unit 116. The camera 106 is operatively coupled with the control unit 116, which is configured to receive images. The control unit 116 is configured to extract one or more attributes of the received images based on information associated with the pest, such that a first set of extracted attributes determines the pest in the captured image with the help of detected edges. Moreover, a second set of attributes determines the position of the pest based on the captured image, and the laser device 108, based on the determined position, projects pulsed light on the detected pest such that it incapacitates the pest. The drone 102, based on the second set of attributes, navigates aerially from the initial position to a proximal distance from the detected incapacitated pest. A trapper 114 is fitted at the top end of the moveable arm 110, and the moveable arm 110, based on the position of the pest detected by the control unit 116, increases its arm length, thereby trapping the incapacitated pest. The drone 102 navigates back with the incapacitated pest to its initial position at the electromagnetic platform 124 and disposes of the incapacitated pest inside a bin 128.
[0048] In an embodiment, the apparatus includes the camera 106. Walls and ceilings 122 of an enclosed space are monitored in real time by a rotating 360-degree closed-circuit television (CCTV) camera 106 mounted on the top surface of the drone 102. The real-time visual image captured by the camera 106 is transmitted to the control unit 116, such as a Raspberry Pi 3, such that the control unit 116 determines the image of the pest within the visual image based on its predefined edges and identifies the position of the pest using image processing and machine learning techniques. In an exemplary embodiment, the camera 106 continuously provides images/videos of the enclosed area, such that if any pest roams on a wall or ceiling 122 of the enclosed space, the control unit 116 identifies the pest and signals the drone 102 for further operations.
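For illustration only (not part of the claimed subject matter), the edge-based detection step performed by the control unit 116 can be sketched as follows. This minimal sketch assumes a grayscale frame represented as a 2D list of pixel intensities; the gradient heuristic, the threshold value, and all function names are hypothetical stand-ins for the image processing and machine learning pipeline the disclosure attributes to the control unit.

```python
def detect_pest(image, threshold=50):
    """Locate a pest-like object in a grayscale frame via a simple
    gradient (edge) heuristic and return its bounding box, or None
    if no strong edges are found. Threshold is illustrative."""
    h, w = len(image), len(image[0])
    edges = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical intensity differences (crude gradient)
            gx = image[y][x + 1] - image[y][x - 1]
            gy = image[y + 1][x] - image[y - 1][x]
            if abs(gx) + abs(gy) > threshold:
                edges.append((x, y))
    if not edges:
        return None  # frame is uniform: no pest-like object detected
    xs = [p[0] for p in edges]
    ys = [p[1] for p in edges]
    # Bounding box approximates the pest's position on the wall/ceiling
    return (min(xs), min(ys), max(xs), max(ys))

# A mostly uniform wall (intensity 200) with a dark pest-shaped blob
frame = [[200] * 10 for _ in range(10)]
for y in range(4, 7):
    for x in range(4, 7):
        frame[y][x] = 30
print(detect_pest(frame))
```

In practice such a heuristic would be replaced by a trained detector, but the flow — frame in, pest position out — matches the role the specification assigns to the control unit.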
[0049] In an embodiment, the apparatus 100 includes the laser device 108. A lightweight laser device 108 fitted on the top of the drone 102 projects a laser beam on the pest to incapacitate it for some time, and the laser device 108 is controlled by the control unit 116, such as through a laser controller module. The laser device 108, based on the second set of attributes, identifies the position of the pest and releases a high-intensity green laser that, in reduced light conditions, incapacitates any living pest without any contact. In an exemplary embodiment, the laser device 108 includes an electro-laser using ultraviolet laser beams of around 193 nm, capable of immobilizing any living pest target at a distance without contact.
[0050] In an embodiment, the apparatus 100 includes the moveable arm 110. The moveable arm 110 includes a trapper 114 and a telescopic hand 112 operatively coupled with the control unit 116, and receives a signal after incapacitation of the pest by the laser device 108. The moveable arm 110 is configured at the center of the drone 102 and traps the pest based on signals received from the control unit 116. Moreover, the telescopic hand 112 can bend itself based on the position of the pest, enabling it to trap the pest from very narrow or complicated areas of walls or ceilings. The moveable arm 110, based on the second set of attributes, identifies the pest and thereby elongates the arm of the telescopic hand 112; the top end of the telescopic hand 112 is operatively coupled with the trapper 114. The trapper 114 captures the incapacitated pest without causing any harm, to dispose of it into the bin 128, thus avoiding any human intervention.
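For illustration only, the elongation decision for the telescopic hand 112 can be sketched as a distance computation. The disclosure does not specify the arm's geometry or reach, so the maximum-reach value, coordinate convention, and function name below are purely hypothetical assumptions.

```python
import math

def arm_extension(drone_pos, pest_pos, max_reach_cm=60):
    """Compute how far the telescopic hand must elongate to reach the
    pest from the drone's hover position. Returns None when the pest
    is beyond reach, signalling that the drone must navigate closer.
    All values (units, 60 cm reach) are illustrative assumptions."""
    dist = math.dist(drone_pos, pest_pos)  # Euclidean distance, 3D
    if dist > max_reach_cm:
        return None
    return round(dist, 1)

# Pest 10 cm across and 20 cm below the drone's hover point
print(arm_extension((0, 0, 0), (10, 0, 20)))   # 22.4
```

A real controller would also account for the bend of the hand described above; this sketch covers only the straight-line extension.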
[0051] FIG. 1B shows an image providing a perspective view of the electromagnetic platform 124, in accordance with an exemplary embodiment of the present invention.
[0052] In an embodiment, the apparatus 100 includes the electromagnetic platform 124. The electromagnetic platform 124 includes a proximity sensor 118 and an electronic switch 120 configured on the ceiling 122 of an enclosed space to park the drone 102. The drone 102 is operatively coupled with the electromagnetic platform 124 at an initial position, enabling the drone 102 to come back proximal to the electromagnetic platform 124 after its aerial navigation operations are complete. The electromagnetic platform 124 generates an electromagnetic field that pulls the drone 102 and holds it tightly when an electric current is supplied through the electronic switch 120. When the current supply to the platform is cut, the platform releases the drone 102 from its magnetic pull, enabling aerial navigation of the drone 102 towards the pest. When the drone 102 reaches just below the platform (after capturing the pest), the proximity sensor 118, configured with the control unit 116, switches ON the electronic switch 120 to park the drone 102 at its initial position.
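For illustration only, the release/park cycle of the electromagnetic platform 124 can be sketched as a small controller. The sensor and switch interfaces below are hypothetical placeholders; on real hardware these would be GPIO reads and writes driven by the control unit 116.

```python
class DockController:
    """Sketch of the electromagnetic-platform logic: energizing the
    magnet (current ON) parks the drone; cutting current releases it.
    All names and return values are illustrative."""

    def __init__(self):
        self.magnet_on = True  # drone starts parked at the platform

    def on_pest_detected(self):
        # Cut the current supply so the magnetic pull releases the
        # drone for aerial navigation towards the pest.
        self.magnet_on = False
        return "released"

    def on_proximity(self, drone_below_platform):
        # The proximity sensor fires when the drone returns beneath
        # the platform; re-energize the magnet to park it.
        if drone_below_platform and not self.magnet_on:
            self.magnet_on = True
            return "parked"
        return "idle"

dock = DockController()
print(dock.on_pest_detected())  # released
print(dock.on_proximity(True))  # parked
```

The key design point, mirrored from the specification, is that parking is purely passive from the drone's side: the platform's magnet does the holding, so the drone expends no power while docked.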
[0053] In an embodiment, the apparatus 100 includes the bin 128. Sensors are operatively configured with the bin 128, which includes a lid 126 at its top edge; whenever the moveable arm 110 comes proximal to the bin 128, the sensors operatively coupled with the control unit 116 are activated, allowing the bin 128 to open the lid 126. The telescopic hand 112 with the trapper 114 disposes of the pest inside the bin 128, after which the bin lid 126 closes automatically with the help of the sensors. The bin 128 is temporarily fixed to the ceiling 122 such that it can easily be detached for transferring the pest to an appropriate place, or the pest may also be handed over to the forest department.
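For illustration only, the lid logic described above reduces to a proximity test: the lid is open exactly while the moveable arm is within sensor range. The distance threshold below is an invented value, not taken from the disclosure.

```python
def bin_lid_state(arm_distance_cm, open_threshold_cm=10):
    """Sketch of the bin's lid logic: a sensor on the bin detects the
    moveable arm, and the control unit opens the lid only while the
    arm is within range. Threshold value is illustrative."""
    return "open" if arm_distance_cm <= open_threshold_cm else "closed"

print(bin_lid_state(5))   # open  -- arm is close, lid opens
print(bin_lid_state(40))  # closed -- arm away, lid stays shut
```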
[0054] In an exemplary embodiment, the enclosed area includes a lizard on a sidewall. The camera 106 fixed to the drone 102 continuously transfers real-time images or videos to the Raspberry Pi 3 control unit 116, which continuously monitors the live visual image with the help of its image processing and machine learning. In an embodiment, the lizard is detected and its position on the sidewall is determined, enabling the control unit 116 to send signals to the laser device 108 and the drone 102 for aerial navigation proximal to the lizard. The laser device 108 focuses its high-intensity green light on the lizard, and said green light dazes the lizard for some time. The drone 102, parked at the electromagnetic platform 124, takes an aerial route to the vicinity of the lizard based on signals received from the control unit 116. The telescopic hand 112 elongates from its top end, and the trapper 114 positions itself over the incapacitated lizard to capture it without causing any damage or harm. The drone 102, with the trapped lizard, navigates aerially back proximal to the electromagnetic platform 124, where the proximity sensor 118 switches ON the current supply, enabling the drone 102 to be parked at its initial position. The moveable arm 110 comes proximal to the bin 128, the lid 126 of the bin 128 opens, and the trapper 114 disposes of the incapacitated lizard safely into the bin 128 without any human intervention.
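For illustration only, the lizard example above can be summarised as a fixed mission sequence. The state names are editorial labels for the steps the specification describes, not terminology from the disclosure.

```python
# Mission sequence from the lizard example, as an ordered list of
# states. Names are illustrative labels, not part of the disclosure.
MISSION_STEPS = [
    "monitor",       # CCTV camera streams frames to the control unit
    "detect",        # control unit identifies the pest and its position
    "incapacitate",  # laser device dazes the pest
    "navigate",      # drone released from platform, flies to the pest
    "trap",          # telescopic hand elongates, trapper captures pest
    "return",        # drone flies back under the platform and parks
    "dispose",       # bin lid opens, trapper drops the pest inside
]

def run_mission(pest_found):
    """Return the sequence of states for one patrol cycle: the full
    trapping sequence if a pest is found, else monitoring only."""
    if not pest_found:
        return ["monitor"]
    return MISSION_STEPS

print(run_mission(True)[-1])  # dispose
```

Note that every step after "detect" is triggered by control-unit signals, which is what lets the specification claim operation without human intervention.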
ADVANTAGES OF THE PRESENT DISCLOSURE
[0055] The present disclosure provides a pest trap drone apparatus that is portable and
convenient to handle.
[0056] The present disclosure provides a pest trap drone apparatus that does not require
any human intervention.
[0057] The present disclosure provides a pest trap drone apparatus with automatic
detection and capturing of pests.
[0058] The present disclosure provides a pest trap drone apparatus with an efficient and
cost-effective solution.
We Claim:
1. A pest trapper drone apparatus (100), said apparatus (100) comprising:
a camera (106) for capturing at least one image of an area in the vicinity of the apparatus (100);
a control unit (116) operatively coupled with the camera (106), the control unit (116) configured to:
extract one or more attributes from the captured at least one image, wherein the one or more attributes pertain to information associated with a pest;
select a first set of attributes from the extracted one or more attributes to determine the pest in the captured image, and a second set of attributes from the extracted one or more attributes to determine the position of the pest on a surface; and
generate one or more control signals for navigating the apparatus (100) from an initial position to a position in the vicinity of the detected pest to facilitate a trapping operation by the apparatus (100).
2. The apparatus (100) as claimed in claim 1, wherein a laser device (108) is operatively configured with the apparatus (100) such that, based on the second set of attributes, the laser device (108) releases a high-intensity green laser that, in reduced light conditions, incapacitates the pest.
3. The apparatus (100) as claimed in claim 1, wherein a moveable arm (110) is configured at a predefined position of the apparatus (100), and wherein, based on the second set of attributes, the moveable arm (110) elongates and thereby traps the pest.
4. The apparatus as claimed in claim 3, wherein the top end of the moveable arm (110) is fitted with a trapper (114) for capturing the pest without harming it.
5. The apparatus as claimed in claim 1, wherein an electromagnetic platform (124) is coupled to a pre-defined position on the ceiling (122), such that the apparatus (100) rests at the initial position on the electromagnetic platform (124).
6. The apparatus (100) as claimed in claim 5, wherein the electromagnetic platform (124) comprises a proximity sensor (118) operatively coupled with the control unit (116) to detect the position of the apparatus (100) and generate a signal to start the current supply, such that the apparatus (100) comes back to the initial position.
7. The apparatus (100) as claimed in claim 5, wherein a bin (128) positioned proximally to the electromagnetic platform (124) enables the moveable arm (110) to dispose of the pest into the bin (128).
8. The apparatus (100) as claimed in claim 7, wherein the bin (128) comprises a sensor operatively coupled with the control unit (116) to detect the moveable arm (110) and generate a signal that opens the bin (128), such that the trapper (114) disposes of the pest inside the bin (128).
9. The apparatus (100) as claimed in claim 1, wherein the pest can be any or a combination of a lizard, mosquito, ant, and rat.