Abstract: The present disclosure provides a system for picking objects from an area of interest, said system includes: a processing unit configured to: receive one or more first images of an area of interest; detect one or more objects to be picked up from the area of interest based on processing of the received one or more first images; and determine location of the one or more objects in the area of interest using a global positioning system (GPS); an unmanned aerial vehicle (UAV) operatively coupled to the processing unit, said UAV comprising: a navigation unit configured to determine a path for the UAV from a current location of the UAV to the determined location of the one or more objects, wherein the current location of the UAV is obtained from a GPS module operatively coupled to the UAV; and a visual unit configured to capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV reaches said area of interest, wherein said visual unit detects the presence of the one or more objects at the area of interest based on the processing of the plurality of captured second images, and wherein the UAV deploys a pick-up means to pick up the one or more objects.
TECHNICAL FIELD
[001] The present disclosure relates to a system for picking objects. More particularly, the present disclosure relates to an unmanned aerial vehicle and system for picking and collecting objects.
BACKGROUND
[002] The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[003] In developing countries, a complex infrastructure does not exist for routine waste collection or collection/separation of recyclable items. As such, it is common for waste to pile up in locations where high densities of people reside. Furthermore, the recyclables are typically not separated from the trash, which can both add to the volume of trash, and add burdens to public health.
[004] In various developing countries such as India, Kenya, etc., waste collection generally happens only after several complaints have been filed by residents or after an incident has occurred. Once reported, collection of the waste is often slow, sometimes due to socio-economic issues, due to behaviour issues, or due to infrastructural challenges. Several millions of dollars have been spent by donors and NGOs (non-governmental organizations) in partnership with local governments to fix the problem. However, the situation remains unresolved, in part due to the rapid population growth in certain locations of these countries.
[005] As a result, waste collection services may be limited to individual collectors manually picking up waste from households and businesses, which is at times not possible due to poor road infrastructure and weather conditions, e.g., the yearly floods in Nairobi.
[006] Furthermore, traditional waste collection systems used in developed economies, such as garbage trucks and formal sewer networks, are difficult to build for low-income and similar areas in developing cities, largely due to unplanned, rapid urbanization.
[007] Trash in oceans and on land surfaces is a major concern nowadays. There are 5.25 trillion pieces of plastic debris in the ocean and on land. Of that mass, 269,000 tons float on the surface, while some four billion plastic microfibers per square kilometre litter the deep sea. [Source: www.nationalgeographic.com]. The garbage floating in rivers such as the Ganga and the Yamuna is also a major problem that India is facing. Trash in water bodies, especially plastic floating on the surface, affects aquatic animals, and the extinction rate of aquatic species is increasing as a result. Due to the densely populated nature of some areas, it is difficult for garbage collection trucks to access the collection locations, causing a huge sanitation problem. In particular, during the rainy season it is almost impossible to access such locations.
[008] There is, therefore, a need in the art to provide an unmanned aerial vehicle and system for picking and collecting the objects that overcome the above-mentioned and other limitations of the existing solutions and utilize techniques, which are robust, accurate, fast, efficient, cost effective and simple to implement.
OBJECTS OF THE PRESENT DISCLOSURE
[009] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0010] It is an object of the present disclosure to provide an unmanned aerial vehicle and system for picking and collecting objects.
[0011] It is another object of the present disclosure to provide an unmanned aerial vehicle and system for picking and collecting objects that does not require human intervention for functioning of the system.
[0012] It is another object of the present disclosure to provide an unmanned aerial vehicle and system for picking and collecting objects that can enable collection of objects lying at unreachable locations.
[0013] It is another object of the present disclosure to provide an unmanned aerial vehicle and system for picking and collecting objects that can be used for picking and collecting garbage from land as well as water surface.
[0014] It is another object of the present disclosure to provide an unmanned aerial vehicle and system for picking and collecting objects that is time efficient and robust.
[0015] It is another object of the present disclosure to provide an unmanned aerial vehicle and system for picking and collecting objects that is cost efficient and easy to implement.
SUMMARY
[0016] The present disclosure relates to system for picking objects. More particularly, the present disclosure relates to an unmanned aerial vehicle and system for picking and collecting objects.
[0017] According to an aspect, a system for picking objects from an area of interest is disclosed. The system can include: a processing unit configured to: receive one or more first images of an area of interest; detect one or more objects to be picked up from the area of interest based on processing of the received one or more first images; and determine location of the one or more objects in the area of interest using a global positioning system (GPS); an unmanned aerial vehicle (UAV) operatively coupled to the processing unit, the UAV comprising: a navigation unit configured to determine a path for the UAV from a current location of the UAV to the determined location of the one or more objects, wherein the current location of the UAV is obtained from a GPS module operatively coupled to the UAV; and a visual unit configured to capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV reaches the area of interest, wherein the visual unit detects the presence of the one or more objects at the area of interest based on the processing of the plurality of captured second images, and wherein the UAV deploys a pick-up means to pick up the one or more objects.
[0018] In an embodiment, the pickup means comprises one or more jaws for picking up the one or more objects, wherein each of the one or more jaws has a first set of sensors, the first set of sensors operatively coupled with a control unit, the control unit configured to determine a first weight of the one or more objects.
[0019] In an embodiment, the control unit is configured to compare the determined first weight with a first pre-defined weight, and based on the comparison, if the determined first weight is less than the first pre-defined weight, the UAV deploys the one or more jaws to pick up the one or more objects and put the one or more objects in a first container coupled with the UAV.
[0020] In an embodiment, based on the comparison, if the determined first weight is more than the first pre-defined weight, the control unit generates an alert signal indicating an overweight condition.
[0021] In an embodiment, a second set of sensors is coupled to the first container for determining a second weight of the first container, and the control unit, operatively coupled with the second set of sensors, is configured to compare the determined second weight with a second pre-defined weight.
[0022] In an embodiment, if the second weight is more than the second pre-defined weight, the UAV navigates to a second container and empties the contents of the first container into the second container.
[0023] In an embodiment, the second container is associated with a first GPS unit to enable the UAV to monitor the real-time location of the second container.
[0024] In another aspect, an unmanned aerial vehicle (UAV) is disclosed. The UAV can be used for picking up one or more objects from an area of interest, the UAV comprising: a processing unit configured on the UAV and operatively coupled to it, the processing unit configured to: receive one or more first images of an area of interest; detect one or more objects to be picked up from the area of interest based on processing of the received one or more first images; and determine location of the one or more objects in the area of interest using a global positioning system (GPS); a GPS module configured on the UAV and operatively coupled with it, to detect a current location of the UAV, wherein the UAV is operable to pick up the one or more objects based on a signal received from a control unit configured on the UAV, the control unit configured to: determine a path for the UAV from a current location of the UAV to the determined location of the one or more objects, wherein the current location of the UAV is obtained from the GPS module operatively coupled to the UAV; and capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV reaches the area of interest, wherein the control unit detects the presence of the one or more objects at the area of interest based on the processing of the plurality of captured second images, and wherein the UAV deploys a pick-up means to pick up the one or more objects.
[0025] In an embodiment, the pickup means comprises one or more jaws, wherein each of the one or more jaws has a first set of sensors, the first set of sensors operatively coupled with a control unit, the control unit configured to determine a first weight of the one or more objects.
[0026] In an embodiment, the control unit is configured to compare the determined first weight with a first pre-defined weight, and based on the comparison: if the determined first weight is less than the first pre-defined weight, the UAV deploys the one or more jaws to pick up the one or more objects and put the one or more objects in a first container coupled with the UAV; and if the determined first weight is more than the first pre-defined weight, the control unit generates an alert signal indicating an overweight condition.
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0028] FIG. 1 illustrates an exemplary block diagram representation of a system for picking and collecting objects in accordance with an embodiment of the present disclosure.
[0029] FIG. 2A illustrates an exemplary representation of UAV picking objects in accordance with an embodiment of the present disclosure.
[0030] FIG. 2B illustrates an exemplary representation of collecting of objects in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0031] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0032] Embodiments of the present invention include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, firmware and/or by human operators.
[0033] Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, and semiconductor memories, such as ROMs, programmable read-only memories (PROMs), random access memories (RAMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other types of media/machine-readable media suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).
[0034] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
[0035] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0036] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0037] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this invention will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).
[0038] While embodiments of the present invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the invention, as described in the claims.
[0039] The present disclosure relates to a system for picking objects. More particularly, the present disclosure relates to an unmanned aerial vehicle and system for picking and collecting objects.
[0040] According to an aspect, a system for picking objects from an area of interest is disclosed. The system can include: a processing unit configured to: receive one or more first images of an area of interest; detect one or more objects to be picked up from the area of interest based on processing of the received one or more first images; and determine location of the one or more objects in the area of interest using a global positioning system (GPS); an unmanned aerial vehicle (UAV) operatively coupled to the processing unit, the UAV comprising: a navigation unit configured to determine a path for the UAV from a current location of the UAV to the determined location of the one or more objects, wherein the current location of the UAV is obtained from a GPS module operatively coupled to the UAV; and a visual unit configured to capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV reaches the area of interest, wherein the visual unit detects the presence of the one or more objects at the area of interest based on the processing of the plurality of captured second images, and wherein the UAV deploys a pick-up means to pick up the one or more objects.
[0041] In an embodiment, the pickup means comprises one or more jaws for picking up the one or more objects, wherein each of the one or more jaws has a first set of sensors, the first set of sensors operatively coupled with a control unit, the control unit configured to determine a first weight of the one or more objects.
[0042] In an embodiment, the control unit is configured to compare the determined first weight with a first pre-defined weight, and based on the comparison, if the determined first weight is less than the first pre-defined weight, the UAV deploys the one or more jaws to pick up the one or more objects and put the one or more objects in a first container coupled with the UAV.
[0043] In an embodiment, based on the comparison, if the determined first weight is more than the first pre-defined weight, the control unit generates an alert signal indicating an overweight condition.
[0044] In an embodiment, a second set of sensors is coupled to the first container for determining a second weight of the first container, and the control unit, operatively coupled with the second set of sensors, is configured to compare the determined second weight with a second pre-defined weight.
[0045] In an embodiment, if the second weight is more than the second pre-defined weight, the UAV navigates to a second container and empties the contents of the first container into the second container.
[0046] In an embodiment, the second container is associated with a first GPS unit to enable the UAV to monitor the real-time location of the second container.
[0047] In another aspect, an unmanned aerial vehicle (UAV) is disclosed. The UAV can be used for picking up one or more objects from an area of interest, the UAV comprising: a processing unit configured on the UAV and operatively coupled to it, the processing unit configured to: receive one or more first images of an area of interest; detect one or more objects to be picked up from the area of interest based on processing of the received one or more first images; and determine location of the one or more objects in the area of interest using a global positioning system (GPS); a GPS module configured on the UAV and operatively coupled with it, to detect a current location of the UAV, wherein the UAV is operable to pick up the one or more objects based on a signal received from a control unit configured on the UAV, the control unit configured to: determine a path for the UAV from a current location of the UAV to the determined location of the one or more objects, wherein the current location of the UAV is obtained from the GPS module operatively coupled to the UAV; and capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV reaches the area of interest, wherein the control unit detects the presence of the one or more objects at the area of interest based on the processing of the plurality of captured second images, and wherein the UAV deploys a pick-up means to pick up the one or more objects.
[0048] In an embodiment, the pickup means comprises one or more jaws, wherein each of the one or more jaws has a first set of sensors, the first set of sensors operatively coupled with a control unit, the control unit configured to determine a first weight of the one or more objects.
[0049] In an embodiment, the control unit is configured to compare the determined first weight with a first pre-defined weight, and based on the comparison: if the determined first weight is less than the first pre-defined weight, the UAV deploys the one or more jaws to pick up the one or more objects and put the one or more objects in a first container coupled with the UAV; and if the determined first weight is more than the first pre-defined weight, the control unit generates an alert signal indicating an overweight condition.
[0050] FIG. 1 illustrates an exemplary block diagram representation of a system for picking and collecting objects in accordance with an embodiment of the present disclosure.
[0051] In an embodiment, a system for picking and collecting objects can include an unmanned aerial vehicle (UAV) 102. The UAV 102 can include a plurality of jaws 108-1 and 108-2 (collectively referred to as jaws 108 and individually referred to as jaw 108 hereinafter) for holding one or more objects. The one or more objects can include garbage and waste materials lying on a sea, ocean, or lake surface or on the ground.
[0052] In an embodiment, each jaw 108 can be coupled with a robotic arm 106-1 and 106-2 (collectively referred to as robotic arms 106 and individually referred to as robotic arm 106 hereinafter). Further, each robotic arm 106 can have a first set of sensors 104-1 and 104-2 (collectively referred to as first set of sensors 104 hereinafter). The first set of sensors 104 can include a spring balance and the like for sensing a first weight of the one or more objects being picked up by the UAV 102.
[0053] In an embodiment, the UAV 102 can be coupled with a first container 112. The first container 112 can be used for storing the one or more objects being picked up using the jaws 108. The first container 112 can be coupled with a second set of sensors 110. The second set of sensors 110 can be used to sense a second weight of the one or more objects being stored in the first container 112. In an embodiment, the UAV 102 can include a control unit (not shown). The control unit can be operatively coupled with the first set of sensors 104 and the second set of sensors 110. The control unit can be configured to measure the weight of the one or more objects based on the weight values sensed using the first set of sensors 104 and the second set of sensors 110.
[0054] Further, the measured first weight and the second weight can be compared with a first pre-defined value and a second pre-defined value, respectively. If the first weight is more than the first pre-defined weight, the control unit can generate an alert signal to signify that the weight of the one or more objects exceeds the weight that the UAV 102 is adapted to lift; otherwise, if the first weight is less than the first pre-defined weight, the UAV 102 can engage the jaws 108 and the robotic arms 106 to put the picked-up one or more objects into the first container 112.
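The control unit's comparison logic described above can be sketched in pseudocode-like Python. The threshold values and function names below are illustrative assumptions for explanation only; they are not taken from the disclosure.

```python
# Illustrative sketch of the control unit's weight checks. The threshold
# values and names are assumptions, not values from the disclosure.

FIRST_PREDEFINED_WEIGHT_KG = 2.0   # assumed maximum weight of a single pick-up
SECOND_PREDEFINED_WEIGHT_KG = 8.0  # assumed capacity of the first (onboard) container

def check_pickup(first_weight_kg):
    """Decide whether the UAV may lift the sensed object.

    Returns "PICK_UP" when the object is within the lift capacity,
    otherwise "ALERT_OVERWEIGHT" to signal the overweight condition.
    """
    if first_weight_kg < FIRST_PREDEFINED_WEIGHT_KG:
        return "PICK_UP"
    return "ALERT_OVERWEIGHT"

def check_container(second_weight_kg):
    """Decide whether the onboard container must be emptied.

    Returns "EMPTY_TO_SECOND_CONTAINER" when the stored weight exceeds
    the container's pre-defined capacity, otherwise "CONTINUE_COLLECTING".
    """
    if second_weight_kg > SECOND_PREDEFINED_WEIGHT_KG:
        return "EMPTY_TO_SECOND_CONTAINER"
    return "CONTINUE_COLLECTING"
```

For example, an object sensed at 1 kg would be picked up, while one sensed at 3 kg would trigger the overweight alert.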
[0055] In an embodiment, the system can include a processing unit 116. The processing unit 116 can be configured to receive one or more first images from satellite sensors. The received one or more images can be images of an area of interest. The area of interest can include land and water surfaces. In an embodiment, the processing unit 116 can be further configured to process the received one or more images to detect the one or more objects. A Global Positioning System (GPS) module can then be used to determine the position of the one or more objects. Based on the detected GPS position of the one or more objects, the UAV 102 can be adapted to travel towards the detected position of the one or more objects.
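One possible concretization of the geolocation step, assuming the satellite image is georeferenced and north-up with known corner coordinates, is to map a detected object's pixel position linearly to latitude and longitude. The function below is a simplified sketch; the corner coordinates and image size are hypothetical.

```python
# Sketch: convert a detected object's pixel coordinates in a georeferenced,
# north-up satellite image into an approximate GPS position. The corner
# coordinates used in practice would come from the image's metadata.

def pixel_to_gps(px, py, img_w, img_h, top_left, bottom_right):
    """Linearly interpolate (px, py) between the image's corner coordinates.

    top_left and bottom_right are (latitude, longitude) tuples; latitude
    decreases from top to bottom in a north-up image.
    """
    lat_top, lon_left = top_left
    lat_bottom, lon_right = bottom_right
    lat = lat_top + (lat_bottom - lat_top) * (py / img_h)
    lon = lon_left + (lon_right - lon_left) * (px / img_w)
    return lat, lon
```

For instance, an object detected at the centre of a 1000 x 1000 image spanning (1.30, 36.80) to (1.28, 36.82) would be geolocated at approximately (1.29, 36.81).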
[0056] In an embodiment, the processing unit 116 can be configured on the UAV 102. In an embodiment, the UAV 102 can include a GPS and telemetry module 114 (interchangeably referred to as navigation unit 114 hereinafter) configured to determine a path for the UAV 102 from a current location of the UAV 102 to the determined location of the one or more objects, wherein the current location of the UAV 102 can be obtained from a GPS module operatively coupled to the UAV 102.
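Under simple assumptions (straight-line flight, no obstacle avoidance), the navigation unit's path determination reduces to computing the great-circle distance and initial bearing between the UAV's current GPS fix and the object's location. The sketch below uses the standard haversine and forward-azimuth formulas; it is an illustration, not the disclosed implementation.

```python
import math

# Sketch: great-circle distance and initial bearing from the UAV's current
# GPS fix to the target location. Assumes straight-line flight; a real
# navigation unit would also handle obstacles, wind, and waypoints.

EARTH_RADIUS_M = 6371000.0

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Return (distance in metres, initial bearing in degrees from north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine distance
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    # Initial bearing (forward azimuth), normalised to [0, 360)
    y = math.sin(dlam) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlam))
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing
```

As a sanity check, one degree of longitude along the equator is roughly 111 km due east (bearing 90 degrees).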
[0057] In an embodiment, the UAV 102 can include a visual unit 118 configured to capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV 102 reaches said area of interest, wherein the visual unit 118 detects the presence of the one or more objects at the area of interest based on the processing of the plurality of captured second images, and wherein the UAV 102 can deploy the robotic arms 106 and the jaws 108 to pick up the one or more objects. The one or more sensors can include a visual sensor, a camera and the like.
[0058] FIG. 2A illustrates an exemplary representation of UAV picking objects in accordance with an embodiment of the present disclosure.
[0059] In an embodiment, an unmanned aerial vehicle (UAV) 202 can include a processing unit (not shown) configured on the UAV and operatively coupled to it, the processing unit configured to: receive one or more first images of an area of interest; detect one or more objects 204 to be picked up from the area of interest based on processing of the received one or more first images; and determine location of the one or more objects in the area of interest using a global positioning system (GPS).
[0060] In an embodiment, the UAV 202 can include a GPS module (not shown) configured on the UAV 202 and operatively coupled with it, to detect a current location of the UAV 202, wherein the UAV 202 can be operable to pick up the one or more objects 204 based on a signal received from a control unit configured on the UAV 202. The control unit can be configured to: determine a path for the UAV 202 from a current location of the UAV 202 to the determined location of the one or more objects 204, wherein the current location of the UAV 202 can be obtained from the GPS module operatively coupled to the UAV 202; and capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV 202 reaches the area of interest, wherein the control unit detects the presence of the one or more objects at the area of interest based on the processing of the plurality of captured second images, and wherein the UAV 202 deploys the robotic arms 106 and the jaws 108 to pick up the one or more objects 204.
[0061] FIG. 2B illustrates an exemplary representation of collecting of objects in accordance with an embodiment of the present disclosure.
[0062] In an embodiment, if the second weight is more than the second pre-defined weight, the UAV 202 can be adapted to navigate to a second container 206 and empty the contents of the first container into the second container 206. The second container 206 can also include a garbage disposal unit that can be adapted to accept the one or more objects 204, such as garbage and waste materials, for processing. In an embodiment, the second container 206 can also be implemented on a vehicle and the like for transportation.
[0063] In an embodiment, the second container 206 can be associated with a first GPS unit (not shown). The GPS unit can be configured to detect, in real time, the position of the second container 206, which can be used to provide a real-time location to the UAV 202.
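Because the second container may be mounted on a moving vehicle, the UAV would re-read its real-time GPS fix during the approach and decide when it is close enough to begin emptying. A minimal sketch of such an arrival test follows; the threshold and function names are assumptions, not part of the disclosure.

```python
import math

# Sketch: arrival test against the second container's real-time GPS fix.
# The arrival threshold is an illustrative assumption.

EARTH_RADIUS_M = 6371000.0
ARRIVAL_THRESHOLD_M = 5.0  # assumed distance at which emptying can begin

def has_arrived(uav_pos, container_pos, threshold_m=ARRIVAL_THRESHOLD_M):
    """Return True when the UAV is within threshold_m of the container.

    Uses an equirectangular approximation, which is adequate over the
    short final-approach distances involved; positions are (lat, lon).
    """
    lat1, lon1 = uav_pos
    lat2, lon2 = container_pos
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M
    return math.hypot(dx, dy) <= threshold_m
```

In use, the UAV would poll the container's GPS feed, update its heading towards the latest fix, and trigger the emptying routine once this test passes.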
[0064] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named implementation.
[0065] While embodiments of the present invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the invention, as described in the claims.
[0066] In the foregoing description, numerous details are set forth. It will be apparent, however, to one of ordinary skill in the art having the benefit of this disclosure, that the present invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, to avoid obscuring the present invention.
[0067] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document, the terms "coupled to" and "coupled with" are also used euphemistically to mean "communicatively coupled with" over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices.
[0068] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C …. and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
[0069] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0070] The present disclosure provides an unmanned aerial vehicle and system for picking and collecting objects.
[0071] The present disclosure provides an unmanned aerial vehicle and system for picking and collecting objects that does not require human intervention to function.
[0072] The present disclosure provides an unmanned aerial vehicle and system for picking and collecting objects that can enable collection of objects lying at otherwise unreachable locations.
[0073] The present disclosure provides an unmanned aerial vehicle and system for picking and collecting objects that can be used for picking and collecting garbage from land as well as water surfaces.
[0074] The present disclosure provides an unmanned aerial vehicle and system for picking and collecting objects that is time efficient and robust.
[0075] The present disclosure provides an unmanned aerial vehicle and system for picking and collecting objects that is cost efficient and easy to implement.
We Claim:
1. A system for picking objects from an area of interest, said system comprising:
a processing unit configured to:
receive one or more first images of an area of interest;
detect one or more objects to be picked up from the area of interest based on processing of the received one or more first images; and
determine location of the one or more objects in the area of interest using a global positioning system (GPS);
an unmanned aerial vehicle (UAV) operatively coupled to the processing unit, said UAV comprising:
a navigation unit configured to determine a path for the UAV from a current location of the UAV to the determined location of the one or more objects, wherein the current location of the UAV is obtained from a GPS module operatively coupled to the UAV; and
a visual unit configured to capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV reaches said area of interest, wherein said visual unit detects the presence of the one or more objects at the area of interest based on the processing of the plurality of captured second images, and
wherein the UAV deploys a pick-up means to pick up the one or more objects.
2. The system as claimed in claim 1, wherein the pick-up means comprises one or more jaws for picking up the one or more objects, and wherein each of the one or more jaws has a first set of sensors, the first set of sensors operatively coupled with a control unit, the control unit configured to determine a first weight of the one or more objects.
3. The system as claimed in claim 2, wherein the control unit is configured to compare the determined first weight with a first pre-defined weight, and wherein, based on the comparison, if the determined first weight is less than the first pre-defined weight, the UAV deploys the one or more jaws to pick up the one or more objects and put the one or more objects in a first container coupled with the UAV.
4. The system as claimed in claim 2, wherein, based on the comparison, if the determined first weight is more than the first pre-defined weight, the control unit generates an alert signal indicating an overweight condition.
5. The system as claimed in claim 2, wherein the system comprises a second set of sensors coupled to the first container for determining a second weight of the first container, and wherein the control unit, operatively coupled with the second set of sensors, is configured to compare the determined second weight with a second pre-defined weight.
6. The system as claimed in claim 5, wherein, if the second weight is more than the second pre-defined weight, the UAV navigates to a second container and empties contents of the first container into the second container.
7. The system as claimed in claim 5, wherein the second container is associated with a first GPS unit to enable the UAV to monitor real-time location of the second container.
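The weight-based decision logic of claims 2 through 6 can be sketched as a small control routine. This is an illustrative reading of the claims only: all names and threshold values are hypothetical, and the claims do not prescribe any particular implementation.

```python
# Illustrative sketch of the weight thresholds in claims 2-6.
# "first pre-defined weight" gates whether an object may be picked up;
# "second pre-defined weight" gates when the first container must be
# emptied into the second container. All identifiers are hypothetical.
from dataclasses import dataclass


@dataclass
class Limits:
    max_object_weight: float     # first pre-defined weight (claim 3)
    max_container_weight: float  # second pre-defined weight (claim 5)


def decide_pickup(object_weight: float, limits: Limits) -> str:
    """Compare the jaw-sensed object weight with the first threshold."""
    if object_weight < limits.max_object_weight:
        # Claim 3: deploy the jaws and place the object in the first container.
        return "pick"
    # Claim 4: signal an overweight condition instead of picking.
    return "alert_overweight"


def decide_container(container_weight: float, limits: Limits) -> str:
    """Compare the first container's sensed weight with the second threshold."""
    if container_weight > limits.max_container_weight:
        # Claim 6: fly to the second container and empty the first into it.
        return "empty_into_second_container"
    return "continue"
```

For example, with `Limits(2.0, 10.0)` a 1.5 kg object yields `"pick"` while a 3.0 kg object yields `"alert_overweight"`; the units are arbitrary.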
8. An unmanned aerial vehicle (UAV) for picking up one or more objects from an area of interest, said UAV comprising:
a processing unit configured on the UAV and operatively coupled to it, said processing unit configured to:
receive one or more first images of an area of interest;
detect one or more objects to be picked up from the area of interest based on processing of the received one or more first images; and
determine location of the one or more objects in the area of interest using a global positioning system (GPS);
a GPS module configured on the UAV and operatively coupled with it, to detect the current location of the UAV, wherein the UAV is operable to pick up the one or more objects based on a signal received from a control unit configured on the UAV, said control unit configured to:
determine a path for the UAV from a current location of the UAV to the determined location of the one or more objects, wherein the current location of the UAV is obtained from a GPS module operatively coupled to the UAV; and
capture a plurality of second images at the area of interest, from one or more sensors operatively coupled to it, when the UAV reaches said area of interest, wherein the presence of the one or more objects at the area of interest is detected based on the processing of the plurality of captured second images, and
wherein the UAV deploys a pick-up means to pick up the one or more objects.
9. The UAV as claimed in claim 8, wherein the pick-up means comprises one or more jaws, wherein each of the one or more jaws has a first set of sensors, the first set of sensors operatively coupled with the control unit, the control unit configured to determine a first weight of the one or more objects.
10. The UAV as claimed in claim 9, wherein the control unit is configured to compare the determined first weight with a first pre-defined weight, and based on the comparison:
if the determined first weight is less than the first pre-defined weight, the UAV deploys the one or more jaws to pick up the one or more objects and puts the one or more objects in a first container coupled with the UAV; and
if the determined first weight is more than the first pre-defined weight, the control unit generates an alert signal indicating an overweight condition.
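The overall flow recited in claim 8 — detect objects in the first images, plan a path from the GPS fixes, re-verify presence from the second images on arrival, then deploy the pick-up means — can be sketched end to end. Every function name and data shape below is a hypothetical stand-in; the claim does not specify detection or path-planning algorithms.

```python
# Hypothetical end-to-end flow of the UAV of claim 8. Images are modeled
# as dicts with a "contains_object" flag purely for illustration; a real
# system would run an actual detector on pixel data.

def detect_objects(first_images):
    """Claim 8: detect objects to be picked up from the received first images."""
    return [img for img in first_images if img.get("contains_object")]


def plan_path(current_gps, target_gps):
    """Claim 8: path from the UAV's current GPS fix to the object's location.
    A two-waypoint straight line is used here only as a placeholder."""
    return [current_gps, target_gps]


def mission(first_images, current_gps, target_gps, second_images):
    targets = detect_objects(first_images)
    if not targets:
        return "no_targets"
    path = plan_path(current_gps, target_gps)  # followed by the navigation unit
    # On arrival, presence is re-confirmed from the captured second images
    # before the pick-up means is deployed.
    confirmed = any(img.get("contains_object") for img in second_images)
    return "deploy_pickup" if confirmed else "abort"
```

The two-stage check (first images to dispatch, second images to confirm) mirrors the claim's split between the processing unit and the on-board visual verification.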
| # | Name | Date |
|---|---|---|
| 1 | 201911010042-IntimationOfGrant24-06-2024.pdf | 2024-06-24 |
| 2 | 201911010042-PatentCertificate24-06-2024.pdf | 2024-06-24 |
| 3 | 201911010042-Written submissions and relevant documents [22-12-2023(online)].pdf | 2023-12-22 |
| 4 | 201911010042-Annexure [22-12-2023(online)].pdf | 2023-12-22 |
| 5 | 201911010042-FORM-26 [06-12-2023(online)].pdf | 2023-12-06 |
| 6 | 201911010042-Correspondence to notify the Controller [04-12-2023(online)].pdf | 2023-12-04 |
| 7 | 201911010042-US(14)-HearingNotice-(HearingDate-07-12-2023).pdf | 2023-10-25 |
| 8 | 201911010042-ABSTRACT [30-07-2022(online)].pdf | 2022-07-30 |
| 9 | 201911010042-CLAIMS [30-07-2022(online)].pdf | 2022-07-30 |
| 10 | 201911010042-COMPLETE SPECIFICATION [30-07-2022(online)].pdf | 2022-07-30 |
| 11 | 201911010042-CORRESPONDENCE [30-07-2022(online)].pdf | 2022-07-30 |
| 12 | 201911010042-DRAWING [30-07-2022(online)].pdf | 2022-07-30 |
| 13 | 201911010042-FER_SER_REPLY [30-07-2022(online)].pdf | 2022-07-30 |
| 14 | 201911010042-FORM-26 [30-07-2022(online)].pdf | 2022-07-30 |
| 15 | 201911010042-FER.pdf | 2022-01-31 |
| 16 | 201911010042-FORM 18 [09-12-2020(online)].pdf | 2020-12-09 |
| 17 | 201911010042-Correspondence-180719.pdf | 2019-07-26 |
| 18 | 201911010042-OTHERS-180719.pdf | 2019-07-26 |
| 19 | 201911010042-Proof of Right (MANDATORY) [16-07-2019(online)].pdf | 2019-07-16 |
| 20 | 201911010042-Correspondence-200519.pdf | 2019-05-27 |
| 21 | 201911010042-Power of Attorney-200519.pdf | 2019-05-27 |
| 22 | 201911010042-FORM-26 [17-05-2019(online)].pdf | 2019-05-17 |
| 23 | abstract.jpg | 2019-04-22 |
| 24 | 201911010042-COMPLETE SPECIFICATION [14-03-2019(online)].pdf | 2019-03-14 |
| 25 | 201911010042-DRAWINGS [14-03-2019(online)].pdf | 2019-03-14 |
| 26 | 201911010042-DECLARATION OF INVENTORSHIP (FORM 5) [14-03-2019(online)].pdf | 2019-03-14 |
| 27 | 201911010042-FORM 1 [14-03-2019(online)].pdf | 2019-03-14 |
| 28 | 201911010042-STATEMENT OF UNDERTAKING (FORM 3) [14-03-2019(online)].pdf | 2019-03-14 |
| 29 | 201911010042-FORM FOR STARTUP [14-03-2019(online)].pdf | 2019-03-14 |
| 30 | 201911010042-FORM FOR SMALL ENTITY(FORM-28) [14-03-2019(online)].pdf | 2019-03-14 |
| 31 | 201911010042-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [14-03-2019(online)].pdf | 2019-03-14 |
| 32 | 201911010042-EVIDENCE FOR REGISTRATION UNDER SSI [14-03-2019(online)].pdf | 2019-03-14 |
| 33 | SEARCHE_31-01-2022.pdf | |