
A System For Detection And Collection Of Pollutant Objects And Method Thereof

Abstract: The present disclosure relates to the field of detection and collection of pollutant objects. The envisaged system (100) and method eliminate the need for human intervention in the collection of pollutant objects, and are also effective and efficient. The system (100) comprises an unmanned aerial vehicle (UAV) (102), a server (120), and a collecting unit (130). The UAV (102) is configured to periodically capture images of its vicinity while maneuvering and to determine the presence and location of pollutant objects. The server (120) is configured to cooperate with the UAV (102) to generate a navigable path for the collecting unit (130) based on the locations of the pollutant objects. Further, the collecting unit (130) is configured to navigate along the navigable path to collect the pollutant objects from the locations determined by the UAV (102).


Patent Information

Application #
201821010428
Filing Date
21 March 2018
Publication Number
29/2020
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Email
ipo@knspartners.com
Parent Application

Applicants

ZENSAR TECHNOLOGIES LIMITED
ZENSAR KNOWLEDGE PARK, PLOT # 4, MIDC, KHARADI, OFF NAGAR ROAD, PUNE-411014, MAHARASHTRA, INDIA

Inventors

1. KUMAR Anand Yashwanth
505, V Building "Jade Residences" Wagholi, Pune-412207, Maharashtra, India
2. NAMBIAR Ullas Balan
1086 Prestige Kensington Gardens, Bangalore-560013, Karnataka, India

Specification

FIELD
The present disclosure relates to a system and method for detection and collection of pollutant objects in an area/region.
BACKGROUND
Accumulation of garbage is a major problem, as garbage may include non-biodegradable substances/objects, which are a major source of pollution. Pollutant objects such as synthetic plastics affect the ecosystem because they are resistant to environmental degradation. Further, improper disposal of these pollutant objects has a negative impact on the natural environment.
Conventionally, these pollutant objects are collected manually on a daily basis. However, laborers are often unable to detect pollutant objects such as non-biodegradable waste materials, different forms of synthetic plastics, and the like. These pollutant objects therefore go unnoticed and persist in the environment, which is harmful to plants, wildlife, and the human population. Detection and collection of pollutant objects by laborers is thus ineffective and inefficient. Further, manual collection does not ensure that the pollutant objects are actually collected for disposal. In certain situations, the pollutant objects are highly toxic to human health, and collecting them using laborers is not advisable.
There is, therefore, felt a need for an automated system for detection and collection of pollutant objects that alleviates the aforementioned drawbacks.
OBJECTS
Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are as follows.
An object of the present disclosure is to provide a system and method for detection and collection of pollutant objects.
Another object of the present disclosure is to provide a system and method for detection and collection of pollutant objects which does not require human intervention.
Still another object of the present disclosure is to provide a system and method for detection and collection of pollutant objects which is effective and efficient.
Still another object of the present disclosure is to provide a system and method for detection and collection of pollutant objects which has a quick turnaround time.
Other objects and advantages of the present disclosure will be more apparent from the following description, which is not intended to limit the scope of the present disclosure.
SUMMARY
The present disclosure envisages a system for detection and collection of pollutant objects. The system comprises an unmanned aerial vehicle, a server, and a collecting unit. In an embodiment, the collecting unit is robotically driven.
The unmanned aerial vehicle (UAV) includes an image capturing device, a location identification unit, a segregating means and a transceiver.
The image capturing device is configured to periodically capture images of the vicinity while maneuvering. The location identification unit is configured to cooperate with the image capturing device, and is further configured to identify the location co-ordinates of the unmanned aerial vehicle upon receiving each of the captured images. Further, the location identification unit is configured to tag the identified location co-ordinates of the unmanned aerial vehicle with the captured image.
In an embodiment, the location identification unit comprises a location identifier and a tagging unit. The location identifier is configured to identify the location co-ordinates of the unmanned aerial vehicle (UAV) upon receiving each of the captured images. The tagging unit is configured to cooperate with the location identifier to receive the location co-ordinates, and is further configured to tag the location co-ordinates with the captured image, wherein the tagging unit is implemented using one or more processor(s).
The segregating means is configured to cooperate with the location identification unit. The segregating means is further configured to segregate tagged images by selecting images having the pollutant objects and discarding images not having pollutant objects. The transceiver is configured to cooperate with the segregating means to receive and transmit the selected tagged images.
In an embodiment, the segregating means includes a memory, an image processing unit, and a separator. The memory is configured to store a pre-determined set of processing rules. The image processing unit is configured to cooperate with the memory, and is further configured to perform real time image processing on the tagged image to identify the presence of pollutant objects in the tagged image based on the pre-determined processing rules. The separator is configured to cooperate with the image processing unit to select the tagged images having pollutant objects and discard the tagged images not having pollutant objects. The image processing unit and the separator are implemented using one or more processor(s).
Further, the server is configured to communicate with the UAV to receive the selected tagged images, and is further configured to extract the pollutant object details and corresponding location co-ordinates from each of the selected tagged images. Subsequently, the server is configured to generate a navigable path based on the extracted location co-ordinates.
In an embodiment, the server includes a first communication module, an extractor, a pollutant object identifier, a database, and a navigation unit.
The first communication module is configured to receive the selected tagged images from the unmanned aerial vehicle. The extractor is configured to cooperate with the first communication module to receive the tagged images, and is further configured to extract the location co-ordinates from each of the tagged images. The pollutant object identifier is configured to cooperate with the first communication module to receive the tagged images, and is further configured to determine the pollutant object details from each of the tagged images, wherein said pollutant object identifier is implemented using one or more processor(s).
The database is configured to receive the location co-ordinates and pollutant object details from the extractor and the pollutant object identifier for each of the tagged images. Further, the database is configured to store a list of location co-ordinates and pollutant object details corresponding to each of the location co-ordinates in a lookup table.
The navigation unit is configured to cooperate with the database and the first communication module. The navigation unit is configured to fetch the list of location coordinates from the database, and is further configured to generate the navigable path based on the list of location coordinates. The first communication module is configured to transmit the navigable path, the location co-ordinates, and the pollutant object details corresponding to each of the location co-ordinates to the collecting unit.
The collecting unit is configured to communicate with the server to receive the navigable path, the pollutant object details, and the location co-ordinates corresponding to the pollutant object details. The collecting unit is further configured to navigate along the navigable path to collect the pollutant objects based on the location co-ordinates and the pollutant object details.
In an embodiment, the collecting unit comprises a second communication module, a control unit, and an actuation unit.
The second communication module is configured to receive the navigable path, the location co-ordinates, and the pollutant object details corresponding to the location co-ordinates. The control unit is configured to cooperate with the second communication module to generate an actuation signal based on the navigable path and location co-ordinates of the pollutant object details. The actuation unit is configured to cooperate with the control unit to receive the actuation signal, and is further configured to provide mechanical drive to navigate the collecting unit towards a nearest location co-ordinate along the navigable path, thereby facilitating the collecting unit to collect the pollutant objects.
In another embodiment, the collecting unit includes a positioning unit, a computation unit, and a trash collector.
The positioning unit is configured to detect the present location of the collecting unit. The computation unit is configured to cooperate with the positioning unit to receive the present location co-ordinates of the collecting unit, and is further configured to generate an activation signal when the present location co-ordinates of the collecting unit match with at least one location co-ordinate corresponding to the pollutant object details. The trash collector is configured to receive the activation signal from the computation unit, and is further configured to collect the pollutant objects from that location co-ordinate. Further, the trash collector is configured to generate a collection signal subsequent to the collection of the pollutant objects from the location co-ordinates.
In an embodiment, the server is configured to generate a verification command subsequent to collection of the pollutant objects from each of the location co-ordinates, and is further configured to transmit the verification command for instructing said unmanned aerial vehicle (UAV) to provide tagged images of said location co-ordinates, thereby enabling said server to verify the collection of pollutant objects from each of the location co-ordinates.
In one embodiment, the trash collector includes a vacuum cleaner for collection of pollutant objects.
The present disclosure also envisages a method for detection and collection of pollutant objects.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
A system and method for detection and collection of pollutant objects of the present disclosure will now be described with the help of the accompanying drawings, in which:
FIGURE 1 illustrates a block diagram of a system for detection and collection of pollutant objects;
FIGURE 2a and Figure 2b illustrate a flow chart of a method for detection and collection of pollutant objects.
LIST OF REFERENCE NUMERALS
100 – System
102 – Unmanned aerial vehicle (UAV)
104 – Image capturing device
106 – Location identification unit
108 – Segregating means
110 – Transceiver
112 – Location identifier
114 – Tagging unit
116 – Memory
118 – Image processing unit
119 – Separator
120 – Server
122 – First communication module
124 – Extractor
126 – Pollutant object identifier
128 – Database
129 – Navigation unit
130 – Collecting unit
132 – Second communication module
134 – Control unit
136 – Actuation unit
138 – Positioning unit
140 – Computation unit
142 – Trash collector
DETAILED DESCRIPTION
Embodiments of the present disclosure will now be described with reference to the accompanying drawings.
Embodiments are provided so as to thoroughly and fully convey the scope of the present disclosure to the person skilled in the art. Numerous details are set forth, relating to specific components, and methods, to provide a complete understanding of embodiments of the present disclosure. It will be apparent to the person skilled in the art that the details provided in the embodiments should not be construed to limit the scope of the present disclosure. In some embodiments, well-known processes, well-known apparatus structures, and well-known techniques are not described in detail.
The terminology used in the present disclosure is only for the purpose of explaining a particular embodiment and such terminology shall not be considered to limit the scope of the present disclosure. As used in the present disclosure, the forms "a," "an," and "the" may be intended to include the plural forms as well, unless the context clearly suggests otherwise. The terms "comprises," "comprising," "including," and "having" are open ended transitional phrases and therefore specify the presence of stated features, steps, operations, units and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, components, and/or groups thereof. The particular order of steps disclosed in the method and process of the present disclosure is not to be construed as necessarily requiring their performance as described or illustrated. It is also to be understood that additional or alternative steps may be employed.
The terms first, second, third, etc., should not be construed to limit the scope of the present disclosure as the aforementioned terms may be only used to distinguish one element, component, region, layer or section from another component, region, layer or section. Terms such as first, second, third, etc., when used herein do not imply a specific sequence or order unless clearly suggested by the present disclosure.
The computer implemented system and method for detection and collection of pollutant objects of the present disclosure are described with reference to Figure 1 through Figure 2b.
Referring to Figure 1, the system (100) comprises an unmanned aerial vehicle (UAV) (102), a server (120), and a collecting unit (130). In an embodiment, the collecting unit (130) is robotically driven. In another embodiment, the collecting unit (130) is an automated guided vehicle (AGV) which follows markers or wires on the floor, or uses vision or lasers for navigation.
In an embodiment, the unmanned aerial vehicle (UAV) (102) is a drone.
The unmanned aerial vehicle (UAV) (102) includes an image capturing device (104), a location identification unit (106), a segregating means (108) and a transceiver (110).
The image capturing device (104) is configured to periodically capture images of the vicinity while maneuvering. The location identification unit (106) is configured to cooperate with the image capturing device (104), and is further configured to identify the location co-ordinates of the unmanned aerial vehicle (UAV) (102) upon receiving each of the captured images. Further, the location identification unit (106) is configured to tag the identified location co-ordinates of the unmanned aerial vehicle (UAV) (102) with the captured image.
In an embodiment, the location identification unit (106) comprises a location identifier (112) and a tagging unit (114). The location identifier (112) is configured to identify the location co-ordinates of the unmanned aerial vehicle (UAV) (102) upon receiving each of the captured images. The tagging unit (114) is configured to cooperate with the location identifier (112) to receive the location co-ordinates, and is further configured to tag the location co-ordinates with the captured image, wherein the tagging unit (114) is implemented using one or more processor(s).
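By way of illustration only, the tagging operation of the location identification unit (106) may be sketched in Python as follows. The sketch assumes the location identifier (112) reports a (latitude, longitude) pair; the names TaggedImage and tag_image are illustrative and are not defined in the disclosure.

    from dataclasses import dataclass

    @dataclass
    class TaggedImage:
        """A captured frame paired with the UAV's co-ordinates at capture time."""
        image: bytes        # raw frame from the image capturing device (104)
        latitude: float     # fix reported by the location identifier (112)
        longitude: float

    def tag_image(frame: bytes, gps_fix: tuple) -> TaggedImage:
        """Tagging unit (114): attach the UAV's current co-ordinates to a frame."""
        lat, lon = gps_fix
        return TaggedImage(image=frame, latitude=lat, longitude=lon)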
The segregating means (108) is configured to cooperate with the location identification unit (106). The segregating means (108) is further configured to segregate tagged images by selecting images having the pollutant objects and discarding images not having pollutant objects.
In an embodiment, the segregating means (108) includes a memory (116), an image processing unit (118), and a separator (119). The memory (116) is configured to store a pre-determined set of processing rules. In an embodiment, the memory (116) is configured to store types of pollutant objects and features related to the pollutant objects. The image processing unit (118) is configured to cooperate with the memory (116), and is further configured to perform real time image processing on the tagged image to identify the presence of pollutant objects in the tagged image based on the pre-determined processing rules. The separator (119) is configured to cooperate with the image processing unit (118) to select the tagged images having pollutant objects and discard the tagged images not having pollutant objects. In an embodiment, the image processing unit (118) is configured to extract pollutant object details from the captured image.
The image processing unit (118) and the separator (119) are implemented using one or more processor(s).
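A minimal sketch of the segregation step is given below, assuming the image processing unit (118) is exposed as a callback that applies the stored processing rules; the function name segregate is illustrative.

    def segregate(tagged_images, has_pollutant):
        """Separator (119): keep only tagged images in which a pollutant
        object is found. `has_pollutant` stands in for the image processing
        unit (118) applying the pre-determined processing rules."""
        selected = [img for img in tagged_images if has_pollutant(img.image)]
        # Discarded images are never transmitted, reducing the load on the
        # transceiver (110) and on the server (120).
        return selected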
The transceiver (110) is configured to cooperate with the segregating means (108) to receive and transmit the selected tagged images.
Further, the server (120) is configured to communicate with the unmanned aerial vehicle (UAV) (102) to receive the selected tagged images, and is further configured to extract the pollutant object details and corresponding location co-ordinates from each of the selected tagged images. Subsequently, the server (120) is configured to generate a navigable path based on the extracted location co-ordinates.
In an embodiment, the server (120) includes a first communication module (122), an extractor (124), a pollutant object identifier (126), a database (128), and a navigation unit (129).
The first communication module (122) is configured to receive the selected tagged images from the unmanned aerial vehicle (UAV) (102). The extractor (124) is configured to cooperate with the first communication module (122) to receive the tagged images, and is further configured to extract the location co-ordinates from each of the tagged images. The pollutant object identifier (126) is configured to cooperate with the first communication module (122) to receive the tagged images, and is further configured to determine the pollutant object details from each of the tagged images, wherein the pollutant object identifier (126) is implemented using one or more processor(s). In an embodiment, the pollutant object identifier (126) is configured to extract features related to the objects within the tagged images by employing feature extraction techniques. In an embodiment, the pollutant object identifier (126) is configured to employ Artificial Intelligence (AI) based object detection techniques and machine learning techniques to determine pollutant object details.
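The disclosure does not name a particular detection model or class list. The sketch below assumes a generic detector callable that yields (class name, confidence, bounding box) tuples; the pollutant classes and the confidence threshold are illustrative.

    # Illustrative pollutant classes; the disclosure does not enumerate them.
    POLLUTANT_CLASSES = {"plastic_bottle", "plastic_bag", "food_wrapper"}

    def identify_pollutants(tagged_image, detector, min_confidence=0.5):
        """Pollutant object identifier (126): run an object-detection model
        over a tagged image and keep detections of pollutant classes."""
        details = []
        for cls, confidence, bbox in detector(tagged_image.image):
            if cls in POLLUTANT_CLASSES and confidence >= min_confidence:
                details.append({"type": cls,
                                "confidence": confidence,
                                "bounding_box": bbox})
        return details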
The database (128) is configured to receive the location co-ordinates and pollutant object details from the extractor (124) and the pollutant object identifier (126) for each of the tagged images. Further, the database (128) is configured to store a list of location co-ordinates and pollutant object details corresponding to each of the location co-ordinates in a lookup table.
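One possible realization of the lookup table is sketched below with SQLite, purely as an example; the disclosure does not prescribe a storage technology or schema.

    import sqlite3

    conn = sqlite3.connect("pollutants.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS lookup (
               latitude  REAL NOT NULL,
               longitude REAL NOT NULL,
               details   TEXT NOT NULL  -- pollutant object details
           )"""
    )

    def store_record(latitude, longitude, details):
        """Database (128): record a location with its pollutant object details."""
        conn.execute("INSERT INTO lookup VALUES (?, ?, ?)",
                     (latitude, longitude, details))
        conn.commit()

    def fetch_locations():
        """Return the location co-ordinates for the navigation unit (129)."""
        return conn.execute("SELECT latitude, longitude FROM lookup").fetchall()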
The navigation unit (129) is configured to cooperate with the database (128) and the first communication module (122). The navigation unit (129) is configured to fetch the list of location coordinates from the database (128), and is further configured to generate the navigable path based on the list of location coordinates. In an embodiment, the navigation unit (129) employs pre-determined navigation techniques to generate an optimized and efficient navigation path. The first communication module (122) is configured to transmit the navigable path, the location co-ordinates, and the pollutant object details corresponding to each of the location co-ordinates to the collecting unit (130).
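The disclosure leaves the navigation technique open. A nearest-neighbour ordering, shown below as one simple possibility, turns the stored co-ordinates into a navigable path; planar distance is assumed, which is adequate over small areas.

    import math

    def plan_path(start, locations):
        """Navigation unit (129): order pollutant locations into a path by
        repeatedly visiting the nearest remaining co-ordinate."""
        path, current, remaining = [], start, list(locations)
        while remaining:
            nearest = min(remaining, key=lambda p: math.dist(current, p))
            remaining.remove(nearest)
            path.append(nearest)
            current = nearest
        return path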
The collecting unit (130) is configured to communicate with the server (120) to receive the navigable path, the pollutant object details, and the location co-ordinates corresponding to the pollutant object details. The collecting unit (130) is further configured to navigate along the navigable path to collect the pollutant objects based on the location co-ordinates and the pollutant object details.
In an embodiment, the collecting unit (130) comprises a second communication module (132), a control unit (134), and an actuation unit (136).
The second communication module (132) is configured to receive the navigable path, the location co-ordinates, and the pollutant object details corresponding to the location co-ordinates. The control unit (134) is configured to cooperate with the second communication module (132) to generate an actuation signal based on the navigable path and location co-ordinates of the pollutant object details. The actuation unit (136) is configured to cooperate with the control unit (134) to receive the actuation signal, and is further configured to provide mechanical drive to navigate the collecting unit (130) towards a nearest location co-ordinate along the navigable path, thereby facilitating the collecting unit (130) to collect the pollutant objects.
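By way of example, the actuation signal may carry the nearest target co-ordinate and a heading toward it, as sketched below; the dictionary format and the planar approximation are assumptions, not requirements of the disclosure.

    import math

    def actuation_signal(current, navigable_path):
        """Control unit (134): choose the nearest remaining co-ordinate on
        the navigable path and emit a heading for the actuation unit (136)."""
        target = min(navigable_path, key=lambda p: math.dist(current, p))
        heading = math.degrees(math.atan2(target[1] - current[1],
                                          target[0] - current[0]))
        return {"target": target, "heading_deg": heading}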
In another embodiment, the collecting unit (130) includes a positioning unit (138), a computation unit (140), and a trash collector (142).
The positioning unit (138) is configured to detect the present location of the collecting unit (130). The computation unit (140) is configured to cooperate with the positioning unit (138) to receive the present location co-ordinates of the collecting unit (130), and is further configured to generate an activation signal when the present location co-ordinates of the collecting unit (130) match with at least one location co-ordinate corresponding to the pollutant object details. The trash collector (142) is configured to receive the activation signal from the computation unit (140), and is further configured to collect the pollutant objects from that location co-ordinate. Further, the trash collector (142) is configured to generate a collection signal subsequent to the collection of the pollutant objects from the location co-ordinates.
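Real GPS fixes rarely match stored co-ordinates exactly, so the sketch below assumes a small tolerance radius and compares positions with the haversine distance; the 2-metre tolerance is illustrative.

    import math

    EARTH_RADIUS_M = 6_371_000

    def haversine_m(a, b):
        """Great-circle distance in metres between (latitude, longitude) pairs."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(h))

    def should_activate(present, pollutant_locations, tolerance_m=2.0):
        """Computation unit (140): generate the activation signal when the
        collecting unit is within tolerance of a pollutant co-ordinate."""
        return any(haversine_m(present, loc) <= tolerance_m
                   for loc in pollutant_locations)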
In an embodiment, the server (120) is configured to generate a verification command subsequent to the collection of the pollutant objects from each of the location co-ordinates. The generated verification command is sent to the unmanned aerial vehicle (UAV) (102) by the server (120), instructing the unmanned aerial vehicle (UAV) (102) to maneuver along the navigable path to verify the collection of pollutant objects from each of the location co-ordinates.
More specifically, upon receiving the verification command, the unmanned aerial vehicle (UAV) (102) is configured to activate the image capturing device (104) and confirm, in real time, whether the pollutant objects have been collected by the collecting unit (130). If the pollutant objects are still present at the location co-ordinates, the unmanned aerial vehicle (UAV) (102) sends the corresponding selected tagged image, i.e., an image containing the pollutant object together with its location co-ordinate, to the server (120). Conversely, if no tagged image is transmitted by the unmanned aerial vehicle (UAV) (102) to the server (120), the server (120) determines that the collecting unit (130) has successfully collected the pollutant object.
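On the server side, the verification exchange therefore reduces to a simple rule, sketched below: a returned tagged image means the pollutant object is still present, while the absence of a reply image means collection succeeded.

    def verify_collection(uav_reply):
        """Server (120): interpret the UAV's response to the verification
        command, per the behaviour described above."""
        if uav_reply is None:
            return "collected"       # no tagged image: collection verified
        return "still_present"       # tagged image returned: re-dispatch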
In one embodiment, the trash collector (142) includes a vacuum cleaner for collection of pollutant objects.
Typically, the processor is implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any device that manipulates signals based on operational instructions.
The present disclosure also envisages a method for detection and collection of pollutant objects.
Referring to Figure 2a and Figure 2b, the steps for detection and collection of pollutant objects, consolidated in the illustrative sketch following the list, include:
• Step 202: capturing at least one image of the vicinity, by an image capturing device (104) of an unmanned aerial vehicle, while maneuvering;
• Step 204: identifying location co-ordinates, by a location identification unit (106) of the unmanned aerial vehicle, upon capturing the image;
• Step 206: tagging, by the location identification unit (106), the identified location co-ordinates with the captured image;
• Step 208: receiving and segregating, by a segregating means (108) of the unmanned aerial vehicle, the tagged images by selecting images having the pollutant objects and discarding images not having pollutant objects;
• Step 210: communicating with the unmanned aerial vehicle, by a server (120), to receive the selected tagged images;
• Step 212: extracting, by the server (120), the pollutant object details and corresponding location co-ordinates from each of the tagged images;
• Step 214: generating a navigable path, by the server (120), based on the extracted location co-ordinates;
• Step 216: communicating with the server (120), by a collecting unit (130), to receive the navigable path, the pollutant object details, and location co-ordinates corresponding to the pollutant object details; and
• Step 218: navigating along the navigable path, by the collecting unit (130), to collect the pollutant objects based on the location co-ordinates and the pollutant object details.
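Under the assumptions of the earlier sketches, the listed steps chain together as follows; the uav, server, and collector objects and their methods are illustrative and are not defined by the disclosure.

    def detect_and_collect(uav, server, collector):
        """End-to-end flow of Figures 2a and 2b, reusing the earlier sketches."""
        frames = uav.capture_images()                              # step 202
        tagged = [tag_image(f, uav.gps_fix()) for f in frames]     # steps 204, 206
        selected = segregate(tagged, uav.has_pollutant)            # step 208
        records = server.extract_details(selected)                 # steps 210, 212
        path = plan_path(collector.position(),
                         [r["location"] for r in records])         # step 214
        collector.follow(path, records)                            # steps 216, 218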
The foregoing description of the embodiments has been provided for purposes of illustration and is not intended to limit the scope of the present disclosure. Individual components of a particular embodiment are generally not limited to that particular embodiment, but are interchangeable. Such variations are not to be regarded as a departure from the present disclosure, and all such modifications are considered to be within the scope of the present disclosure.
TECHNICAL ADVANCEMENTS
The present disclosure described herein above has several technical advantages including, but not limited to, the realization of a computer implemented system and method for detection and collection of pollutant objects which:
• does not require human intervention;
• is effective and efficient; and
• has a quick turnaround time.
The foregoing disclosure has been described with reference to the accompanying embodiment which does not limit the scope and ambit of the disclosure. The description provided is purely by way of example and illustration.
The embodiments herein and the various features and advantageous details thereof are explained with reference to the non-limiting embodiments in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein.
The foregoing description of the specific embodiments so fully reveals the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept, and, therefore, such adaptations and modifications should be, and are intended to be, comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
Throughout this specification the word “comprise”, or variations such as “comprises” or “comprising”, will be understood to imply the inclusion of a stated step, or group of steps, but not the exclusion of any other element, step, or group of steps.
While considerable emphasis has been placed herein on the components and component parts of the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the disclosure. These and other changes in the preferred embodiment as well as other embodiments of the disclosure will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the disclosure and not as a limitation.
CLAIMS
WE CLAIM:
1. A system (100) for detection and collection of pollutant objects, said system (100) comprising:
• an unmanned aerial vehicle (UAV) (102) having:
o an image capturing device (104) configured to periodically capture images of the vicinity while maneuvering;
o a location identification unit (106) configured to cooperate with said image capturing device (104), and further configured to identify location co-ordinates of said UAV (102) upon receiving said captured images, said location identification unit (106) configured to tag the co-ordinates of the identified location with said captured images;
o a segregating means (108) configured to cooperate with said location identification unit (106), and further configured to segregate tagged images by selecting images having said pollutant objects and discarding images not having pollutant objects; and
o a transceiver (110) configured to cooperate with said segregating means (108) to receive and transmit said selected tagged images;
• a server (120) configured to communicate with said unmanned aerial vehicle (UAV) (102) to receive said selected tagged images, and further configured to extract the pollutant object details and corresponding location co-ordinates from each of said tagged images, said server (120) configured to generate a navigable path based on said extracted location co-ordinates; and
• a collecting unit (130) configured to communicate with said server (120) to receive said navigable path, said pollutant object details, and location co-ordinates corresponding to said pollutant object details, said collecting unit (130) further configured to navigate along said navigable path to collect said pollutant objects based on said location co-ordinates and said pollutant object details.
2. The system (100) as claimed in claim 1, wherein said location identification unit (106) comprises:
a. a location identifier (112) configured to identify the location co-ordinates of said unmanned aerial vehicle (UAV) (102) upon receiving each of said captured images; and
b. a tagging unit (114) configured to cooperate with said location identifier (112) to receive the location co-ordinates, and further configured to tag the location co-ordinates with each of said captured images, wherein said tagging unit (114) is implemented using one or more processor(s).
3. The system (100) as claimed in claim 1, wherein said segregating means (108) includes:
a. a memory (116) configured to store a pre-determined set of processing rules;
b. an image processing unit (118) configured to cooperate with said memory (116), and further configured to perform real time image processing on said tagged image to identify the presence of pollutant objects in said tagged image based on said pre-determined processing rules; and
c. a separator (119) configured to cooperate with said image processing unit (118) to select tagged images having said pollutant objects and discard tagged images not having pollutant objects,
wherein said image processing unit (118) and said separator (119) are implemented using one or more processor(s).
4. The system (100) as claimed in claim 1, wherein said collecting unit (130) is robotically driven.
5. The system (100) as claimed in claim 1, wherein said server (120) includes:
a. a first communication module (122) configured to receive said selected tagged images from said unmanned aerial vehicle (UAV) (102);
b. an extractor (124) configured to cooperate with said first communication module (122) to receive said tagged images, and further configured to extract the location co-ordinates from each of said tagged images;
c. a pollutant object identifier (126) configured to cooperate with said first communication module (122) to receive said tagged images, and further configured to determine pollutant object details from each of said tagged images, wherein said pollutant object identifier (126) is implemented using one or more processor(s);
d. a database (128) configured to receive the location co-ordinates and pollutant object details from said extractor (124) and said pollutant object identifier (126) for each of said tagged images, and further configured to store a list of location co-ordinates and pollutant object details corresponding to each of said location co-ordinates in a lookup table; and
e. a navigation unit (129) configured to cooperate with said database (128) and said first communication module (122), said navigation unit (129) configured to fetch said list of location coordinates from said database (128), and further configured to generate said navigable path based on said list of location coordinates,
wherein said first communication module (122) is configured to transmit said navigable path, said location co-ordinates, and said pollutant object details corresponding to each of said location co-ordinates to said collecting unit (130).
6. The system (100) as claimed in claim 1, wherein said collecting unit (130) comprises:
• a second communication module (132) configured to receive said navigable path, said location co-ordinates, and pollutant object details corresponding to said location co-ordinates;
• a control unit (134) configured to cooperate with said second communication module (132) to generate an actuation signal based on said navigable path and location co-ordinates of said pollutant object details; and
• an actuation unit (136) configured to cooperate with said control unit (134) to receive said actuation signal, and further configured to provide mechanical drive to navigate said collecting unit (130) towards a nearest location co-ordinate along said navigable path, thereby facilitating said collecting unit (130) to collect said pollutant objects.
7. The system (100) as claimed in claim 1, wherein said collecting unit (130) includes:
a. a positioning unit (138) configured to detect the present location of said collecting unit (130);
b. a computation unit (140) configured to cooperate with said positioning unit (138) to receive the present location co-ordinates of said collecting unit (130), and further configured to generate an activation signal when the present location co-ordinates of said collecting unit (130) match with at least one location co-ordinate corresponding to said pollutant object details; and
c. a trash collector (142) configured to receive said activation signal from said computation unit (140), and further configured to collect said pollutant objects from said location co-ordinates, said trash collector (142) configured to generate a collection signal subsequent to collection of said pollutant objects from said location co-ordinates.
8. The system (100) as claimed in claim 1, wherein said server (120) is configured to generate a verification command subsequent to collection of said pollutant objects from each of said location co-ordinates, and further configured to transmit said verification command for instructing said unmanned aerial vehicle (UAV) (102) to provide tagged images of said location co-ordinates, thereby enabling said server (120) to verify the collection of pollutant objects from each of said location co-ordinates.
9. The system (100) as claimed in claim 7, wherein said trash collector (142) includes a vacuum cleaner for collection of said pollutant objects.
10. A method for detection and collection of pollutant objects, said method comprising the steps of:
a. capturing (202) at least one image of the vicinity, by an image capturing device (104) of an unmanned aerial vehicle (UAV) (102), while maneuvering;
b. identifying (204) location co-ordinates, by a location identification unit (106) of said unmanned aerial vehicle (UAV) (102), upon capturing said image;
c. tagging (206), by said location identification unit (106), the identified location co-ordinates with said captured image;
d. receiving and segregating (208), by a segregating means (108) of said unmanned aerial vehicle (UAV) (102), said tagged images by selecting images having said pollutant objects and discarding images not having pollutant objects;
e. communicating (210) with said unmanned aerial vehicle (UAV) (102), by a server (120), to receive said selected tagged images;
f. extracting (212), by said server (120), the pollutant object details and corresponding location co-ordinates from each of said tagged images;
g. generating (214) a navigable path, by said server (120), based on said extracted location co-ordinates;
h. communicating (216) with said server (120), by a collecting unit (130), to receive said navigable path, said pollutant object details, and location co-ordinates corresponding to said pollutant object details; and
i. navigating (218), by said collecting unit (130), along said navigable path to collect said pollutant objects based on said location co-ordinates and said pollutant object details.

Documents

Application Documents

# Name Date
1 201821010428-STATEMENT OF UNDERTAKING (FORM 3) [21-03-2018(online)].pdf 2018-03-21
2 201821010428-PROVISIONAL SPECIFICATION [21-03-2018(online)].pdf 2018-03-21
3 201821010428-PROOF OF RIGHT [21-03-2018(online)].pdf 2018-03-21
4 201821010428-POWER OF AUTHORITY [21-03-2018(online)].pdf 2018-03-21
5 201821010428-FORM 1 [21-03-2018(online)].pdf 2018-03-21
6 201821010428-DRAWINGS [21-03-2018(online)].pdf 2018-03-21
7 201821010428-DECLARATION OF INVENTORSHIP (FORM 5) [21-03-2018(online)].pdf 2018-03-21
8 201821010428-ENDORSEMENT BY INVENTORS [20-03-2019(online)].pdf 2019-03-20
9 201821010428-DRAWING [20-03-2019(online)].pdf 2019-03-20
10 201821010428-COMPLETE SPECIFICATION [20-03-2019(online)].pdf 2019-03-20
11 201821010428-Proof of Right (MANDATORY) [21-05-2019(online)].pdf 2019-05-21
12 201821010428-FORM 18 [25-10-2019(online)].pdf 2019-10-25
13 201821010428-ORIGINAL UR 6(1A) FORM 1-210519.pdf 2020-01-10
14 Abstract1.jpg 2020-07-15
15 201821010428-FER.pdf 2021-10-18
16 201821010428-CLAIMS [29-03-2022(online)].pdf 2022-03-29
17 201821010428-FER_SER_REPLY [29-03-2022(online)].pdf 2022-03-29
18 201821010428-FORM 13 [29-03-2022(online)].pdf 2022-03-29
19 201821010428-OTHERS [29-03-2022(online)].pdf 2022-03-29
20 201821010428-RELEVANT DOCUMENTS [29-03-2022(online)].pdf 2022-03-29
21 201821010428-US(14)-HearingNotice-(HearingDate-24-04-2024).pdf 2024-04-05
22 201821010428-Correspondence to notify the Controller [22-04-2024(online)].pdf 2024-04-22

Search Strategy

1 SearchStrategyMatrixE_06-08-2021.pdf