
Method, System & Apparatus For Machine Vision Drone Swarm Capable Of Doing Search, Rescue Operations And Firefighting

Abstract: According to one aspect, the machine vision drone swarm (MVDS) is a swarm of VTOL drones, specifically quadcopters, that is capable of vision-based decision making like humans and, beyond that, is equipped with both standard vision and thermal vision capabilities. Further, the swarm can be deployed in any congested area and can itself detect and recognize objects, even inside closed buildings, during search and rescue operations/missions; on the same principle it can deploy fire extinguisher capsules to clear escape paths. In addition, each drone is equipped with cameras, a GPU, an extinguisher capsule or other rescue items, and the capability to survive even extreme temperatures, which makes it more durable, powerful, intelligent, compact and impactful. The swarm takes collective decisions using deep learning in real time with onboard processing capabilities. The MVDS is operated and supervised via a ground control station (GCS).


Patent Information

Application #
Filing Date
28 July 2020
Publication Number
29/2021
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
advsyedasif@gmail.com
Parent Application

Applicants

ELIGHT SPM (OPC) PRIVATE LIMITED
D 77/C Thokar No 8, Shaheen Bagh, Jamia Nagar, Near Taiyad Masjid DELHI South Delhi DL 110025
Mohammad Anas
D 77/C Thokar No 8, Shaheen Bagh, Jamia Nagar, Near Taiyad Masjid DELHI South Delhi DL 110025

Inventors

1. Mohammad Anas
D 77/C Thokar No 8, Shaheen Bagh, Jamia Nagar, Near Taiyad Masjid DELHI South Delhi DL 110025

Specification

DESC:
FIELD OF INVENTION

[0001] Embodiments of the present disclosure relate to vision-based drone swarms that are capable of performing search and rescue missions and firefighting even in congested areas without relying on GNSS/GPS, i.e. the drones can fly in complex places even where the space for flying would not normally be sufficient.

BACKGROUND OF INVENTION

[0002] The conventional firefighting process is less effective, high-cost and time-consuming. In congested areas, which are common in India and in many parts of the world, the fire brigade often fails to reach the site on time due to ground traffic and narrow access, which leads to a high number of casualties during fire accidents.

[0003] Drones, or unmanned aerial vehicles (UAVs), which are remotely controlled systems, are rapidly gaining popularity due to their many applications across industries, such as last-mile delivery, healthcare, surveying and surveillance. With the advancement of other technologies such as AI, ML and DL, and the integration of swarming robotics with drones, the drone industry opens new doors to extended possibilities.

[0004] As deep learning (DL) hardware emerges in the form of low-power, high-performance GPUs for mobile robotics applications, its integration with drones greatly enhances their capabilities in real time.

[0005] Some of the cited prior art is as follows:

i. WO2016195320A1 discloses a fire-extinguishing firefighting drone which, in case of a fire at a house, a building and the like, can be rapidly deployed at the start of a fire to extinguish it early, and which connects to a central control centre to be operated remotely as an unmanned drone. However, the invention fails to disclose a capability for search and rescue to reduce casualties by prioritizing machine-learning-based signals to the drones so that fire is extinguished first in the area where fire victims are seeking help.
ii. US20130134254A1 discloses a UAV firefighting system: an unmanned aerial vehicle (UAV) designed to extinguish fires from the air while remaining tethered to the ground via a tether system fashioned to provide the UAV with power and extinguishant. However, the invention lacks swarm drone coordination based on vision capabilities rather than on GNSS/GPS.
iii. WO2014080385A2 discloses a firefighters' drone arrangement in which a set of drones is used to fight fires in high buildings or unreachable forests. However, the invention lacks a machine learning algorithm that would allow it to function in complex spaces and congested areas where a fire brigade would take more than the usual time.

[0006] The existing systems provide drone swarms working on GPS technology, and a few work on vision technology. The present solution is capable of vision-based search and rescue operations and firefighting. It uses extinguisher capsules to extinguish fire efficiently without human presence in the fire accident area. The existing systems are not capable of taking collective decisions to perform artificial intelligence (AI) based autonomous search and rescue missions.

OBJECTIVE OF INVENTION
[0007] It is an object of the present disclosure to provide a machine-vision-based drone swarm, which can also be realized as a small vision-based drone swarm, with an extinguisher capsule for extinguishing fire.
i. It is an object of the present disclosure to provide a vision-based drone swarm that can operate in congested areas where GPS does not work properly and large drones cannot enter due to space constraints.
ii. It is an object of the present disclosure to reduce firefighter presence in fire accident areas, and to reduce casualties by prioritizing machine-learning-based signals to the drone swarm so that fire is extinguished first in the area where fire victims are seeking help.
iii. It is an object of the present disclosure to provide drone swarms that are very efficient in search and rescue operations, as they work simultaneously on different accident spots.

[0008] Another main objective is to reduce risk to life by providing a GCS to monitor, deploy, and automatically and safely return the swarm to base.

SUMMARY OF INVENTION
[0009] Embodiments described herein relate to search and rescue operations; on the same principle, the drone can be a firefighting machine-vision-based drone swarm, which can also be realized as a small vision-based drone swarm.

In the present implementation the drone is specifically a quadcopter with multiple cameras, a thermal camera, an extinguisher capsule, a dropping mechanism, a high-temperature casing, a fixed base, landing gears and an on-board image processing GPU.

[0010] In the present implementation, multiple cameras give real-time 360° coverage around the quadcopter, and the collective image data is processed in real time to achieve a collision-free, vision-based swarm flight that is independent of GPS.

[0011] Additionally, in the present implementation, the swarm is deployed and controlled by the GCS, which is used to control and monitor the complete swarm. The GCS is capable of creating a virtual map of the incident location from a reported location or its in-built GPS; if the swarm loses the GPS signal at any point while travelling to the incident location, it automatically switches to the vision-based flight mechanism in real time.
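For illustration only (the specification discloses no source code; the function names and the satellite threshold below are hypothetical), the GPS-to-vision fallback described above can be sketched as:

```python
from enum import Enum

class NavMode(Enum):
    GPS = "gps"        # normal waypoint flight on the GCS virtual map
    VISION = "vision"  # camera-based navigation, independent of GNSS

def select_nav_mode(gps_fix_ok: bool, num_satellites: int,
                    min_satellites: int = 6) -> NavMode:
    """Switch to vision-based flight whenever the GNSS fix is lost or degraded."""
    if gps_fix_ok and num_satellites >= min_satellites:
        return NavMode.GPS
    return NavMode.VISION
```

In this sketch the switch is evaluated every control cycle, so a drone that regains a healthy fix mid-flight would revert to GPS navigation automatically.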

[0012] In the present implementation, once the swarm is at an incident location it can identify life in the fire and deploy an extinguisher capsule in such a manner that the exit path of a person is cleared of fire.

[0013] The swarm agents have the capability of coordinating with each other: for example, if a drone in a warm environment identifies a life at a certain location, it can call nearby drones to help it extinguish the fire, making the operation precise and time-effective.

BRIEF DESCRIPTION OF DRAWINGS

[0014] The present embodiments can be understood with reference to the accompanying drawings, in which:
Fig.1 shows the flowchart of the overall function of the system.

Fig.2 shows the block diagram of a unit drone's internal hardware and software system.

Fig.3 shows the initial deployment of a single drone from the base station to the target location.

Fig.4 shows complete swarm deployment from the base station to the target location.

Fig.5 shows the deployment of the MVDS inside a building/target location.

[0015] Fig.1 (refer to the drawing) shows the flowchart of the overall function of the system. In one aspect, we assume the system starts 101 when triggered by a user via the mobile application 102, which transfers a signal using wireless communication 103; the signal is received and a message is displayed on the GCS 104 with the required information, such as the incident location, the number of triggers/calls, etc. The GCS also performs predefined instructions 105 that trigger alarms and the security system. 106 indicates manual processes involving battery installation and pre-flight checkup. Based on the data received from the user's mobile application, the GCS creates a virtual offline map, considered as multiple processes 107, i.e. creating and uploading missions to the drone swarm. Only one drone is deployed from the base the first time, as mentioned in 108; this drone then reaches the target location 109. After reaching the target location the drone takes a decision 110 using artificial intelligence: if fire is not detected 111, the drone returns to the launch location (base station) 112 and the program stops 113.

[0016] Now considering the second condition: if fire is detected 114 by the first drone, the drone sends a signal 115 to perform two parallel processes. The first is the multiprocess 116 for complete swarm deployment, which connects to the GCS 104 via junction 118. The second is another multiprocess 119 that performs the search and rescue operation, creating 3D maps of the location by exploring the inside of the incident location and detecting people 120 in order to rescue them. The drone estimates and deploys an extinguisher capsule as mentioned in the predefined process 121; after deployment, a decision 122 is taken based on the situation: if the fire is still there 123, a message is displayed on the GCS 104 via junction 118. For the second condition, if fire is no longer detected 124, the swarm returns to launch (RTL) 112 and the program stops 113.
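The Fig.1 branch logic described above can be illustrated with a sketch (names are hypothetical, chosen only to mirror the numbered flowchart steps; this is not the claimed implementation):

```python
def scout_mission(fire_detected: bool) -> list[str]:
    """Fig.1 logic: a single scout drone (steps 108-110) either triggers full
    swarm deployment (114-119) or returns to launch (111-113)."""
    steps = ["deploy_scout", "reach_target", "ai_decision"]
    if fire_detected:
        # parallel branches 116/119 are listed sequentially for simplicity
        steps += ["signal_gcs", "deploy_swarm", "search_and_rescue", "deploy_capsule"]
    else:
        steps += ["return_to_launch", "stop"]
    return steps
```

The key property is that the full swarm is committed only after a single agent has verified the incident, which keeps the fleet available for other calls.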

[0017] Fig.2 (refer to the drawing) shows the block diagram of a unit drone's internal hardware and software system. The blocks within the dotted line 201 represent the GPU system used for on-board processing and for running the neural networks that make the drone AI-capable. The GPU system broadly consists of a hardware motherboard block 202 and software blocks 203. The motherboard 202, or any breakout board compatible with the GPU module 237, has an onboard processor as a sub-processing system 236 that connects with other systems via the input/output data port 237 and helps in regular processing. This hardware runs software 203 consisting of multiple layers: the operating system 222, below which is the application module 223 consisting of programs and features, connected with the database module 224, which keeps all data, including the mission module 225 that holds the flight plan and data. A contingency module is also required for monitoring and handling contingency events; for example, the contingency module may detect that a drone has flown, or is flying, out of VLOS of the ground operator and inform the flight controller module 204 to perform a contingency action. The ultimate objective of the GPU system 201 is to collect data from the camera(s) 207 and sensor(s), which may require an array module 206 for connecting with the GPU system 201 via the input/output system 237, and to feed that real-time data to the neural network for real-time on-board decisions, such as providing the flight controller module 204 with the flight path and the objects to avoid as the drone moves and circumstances change; the GPU system 201 thus computes trillions of instructions per second to help define the drone's operating parameters.
[0018] The flight controller 204 receives information from the GPU system 201 regarding the operation or action required to move the drone safely. The flight controller module 204 provides this information, including but not limited to path navigation and finding, obstacle avoidance and object recognition, to the processor 227 via the input/output system 228. The processor 227 also takes values from the IMU (inertial measurement unit) 229, which includes but is not limited to an accelerometer, gyroscope and barometer. The IMU 229 values help the drone establish its orientation and real-time kinematics; these values are combined with the data received from the GPU system 201 and may include data from the receiver module 220 and GPS module 221. After computing the data in the processor 227, the flight control system 204 gives commands to the ESCs 213 via the I/O system 228, which in turn drive the motor(s)/actuators 212 or controllers 217 to drive the propellers and bring the drone into motion. The drone is powered by the battery 208 via the power module 209 and PDB 210. The drone may have a VTX 218 for live video transmission of the mission and telemetry 219 for flight data and communication, as separate or combined modules, including an RX 220 for manual inputs via remote.
The flight control system is capable of integrating any external hardware, such as an NPNT (no permission, no take-off) GSM module, or any other hardware as directed by the DGCA.

[0019] Fig.3 (refer to the drawing) represents the initial action taken by the MVDS: deploying a single drone 307 to the target location 303 from the base station 304, as per the Fig.1 flowchart explained earlier. In this diagram we have represented buildings 301, 302 and 303 as residential towers for example, and we assume there is a fire in tower 303 and that a person 308 from tower 303 sends a signal via the system's mobile application 310, or an alarm comes from the fire safety system, or in some cases there is a manual targeted deployment by a person 305 at the base station. A single drone 307 from the swarm is then deployed from the base station 304 via a path 306 defined by the virtual map and performs the initial survey, as discussed earlier at flowchart step 108. In the initial survey, the single drone 307 visits the target location 303 defined by the GCS virtual map, ideally automatically or manually 305 in some cases. It performs detection and recognition autonomously using vision-based navigation in and around the building/target location. Once fire is detected by the drone, it sends a signal to the base station GCS for further deployment of the complete swarm to the target location 303.
[0020] Fig.4 (refer to the drawing) represents complete swarm deployment from the base station to the target location, in continuation of Fig.3, which shows single-drone deployment to the target location and verification of the incident. All drones in the swarm follow the same path covered by the first agent (drone) and defined by the GCS via the virtual offline map.

[0021] Fig.5 (refer to the drawing) represents the deployment of the MVDS inside a building/target location 501. The agents/drones in the swarm 502, 503, 504, 505, 506, 509 scatter around the location to perform the search and rescue mission. Agent/drone 506 autonomously flies and finds a life 508 in the fire 507 area using AI; once detected, it deploys/releases an extinguisher capsule into the fire 507 such that it clears the path of the person 508 stuck in the fire 507. After deployment, the drone analyzes the impact and feeds the data to the nearby agents 505, 509, 504 to repeat the procedure until the fire is completely extinguished.
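The call-for-help behaviour can be illustrated with a hypothetical nearest-neighbour selection (a sketch only; the disclosure does not specify the coordination algorithm, and the agent names here merely echo the Fig.5 numerals):

```python
def nearest_helpers(detector_xy: tuple, idle_agents: dict, k: int = 3) -> list:
    """When one agent finds a person in fire (Fig.5, 506/508), call the k
    nearest idle agents to converge on the same spot with capsules."""
    def dist_sq(name: str) -> float:
        x, y = idle_agents[name]
        return (x - detector_xy[0]) ** 2 + (y - detector_xy[1]) ** 2
    # squared distance avoids a needless sqrt when only ordering matters
    return sorted(idle_agents, key=dist_sq)[:k]
```

Choosing helpers by proximity keeps the response time-effective, matching the coordination described in paragraph [0013].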

DETAILED DESCRIPTION

Aspects of the present disclosure of the machine vision drone swarm consist of many components and are described herein with reference to the figures and diagrams.

[0022] The swarm is scalable; it can contain any number of drones as per the need. It will be understood that each drone in the swarm is a quadrotor in X-mode configuration, which comprises a conventional propulsion system combining BLDC motors, ESCs, propellers, a flight controller (FC), a battery, a receiver, telemetry and cameras, and also includes advanced hardware such as an on-board GPU, an interface module, a thermal camera, an extinguisher capsule, a dropping mechanism and a heat-resistant casing.

[0023] The dropping mechanism consists of extinguisher capsules and a mechanism capable of targeting and hitting the extinguisher capsules precisely and accurately on the target location. The mechanism can contain more than one capsule at a time and can be reloaded once empty. The capsule storage can be used differently in different operations, as the utility of the present system is not limited to firefighting but extends to other search and rescue operations following the process flow explained above.

[0024] Beyond flying quadcopters on swarm technology, the swarm can create a real-time 3D model of the area while exploring different locations at the same time and make it live on the GCS.

[0025] The swarm of drones ensures that the drones do not collide with each other or with any surrounding object while flying, using machine vision with the help of neural networks running on the GPU and real-time processing with high speed and accuracy.
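A minimal separation check of the kind described, under the assumption that each agent holds vision-estimated relative positions of its neighbours (the function name and the 2 m threshold are illustrative, not disclosed values):

```python
def too_close(a: tuple, b: tuple, min_sep: float = 2.0) -> bool:
    """True if two agents are within the minimum separation distance, using
    relative 3D positions estimated by the vision pipeline (no GPS involved)."""
    dx, dy, dz = (a[i] - b[i] for i in range(3))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 < min_sep
```

In a full system each agent would run this test against every neighbour each control cycle and hand violations to the avoidance planner.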

[0026] The structure of the quadcopter consists of a base and a casing, where the casing is made of high-temperature-resistant materials so that the drone can survive around fire.

[0027] The ground control station (GCS) is connected to the swarm via a telemetry link. It controls and coordinates the drone swarm and represents its data in real time on the ground.

CLAIMS:
WE CLAIM,
1. A swarm drone method, system and apparatus configured and comprising:
- search & rescue operations/missions;
- 360-degree vision cameras for coverage; [Fig.2 (207)]
- 3D maps for exploring different locations live with the GCS; [Fig.2 (205)]
- a speed of up to 100 km/h to cover the distance between the base station and the target location; [Fig.2 (212 & 213)]
- a structure constructed of high-temperature-resistant carbon fibre that provides extreme-temperature durability, wherein it allows the swarm drone to survive at high temperature;
- flying capability in congested areas, allowing the swarm drone to fly even in congested or complex areas where a simpler or any other drone cannot fly; [Fig.2 (201)]
- vision capabilities along with thermal vision capabilities; and [Fig.2 (206, 207, 215)]
- the capability of extinguishing fire in search and rescue operations. [Fig.2 (207)]

2. The swarm drone as claimed in claim 1, wherein the time-effective swarm works in a distributed manner and can fly at speeds of up to 100 km/h.

3. The swarm drone as claimed in claim 1, wherein it is built on VTOL technology and is capable of take-off and landing in complicated areas, such as congested or complex ones like malls, commercial areas, markets, etc.
4. The swarm drone as claimed in claim 1, wherein vision-based GPU-powered technology enables it to fly with high accuracy in congested areas where GPS fails, making the flight more stable and robust.
5. The swarm drone as claimed in claim 1, wherein multiple cameras may give 360-degree coverage around the quadcopter to collect image data in real time.
6. The swarm drone as claimed in claim 1, wherein it has thermal vision with object detection and recognition, whereby the swarm drone may quickly detect and recognize objects through its vision capabilities.
7. The swarm drone as claimed in claim 1, wherein it may find paths automatically from image data collected in real time using AI and neural network technology.
8. The swarm drone as claimed in claim 1, wherein it is independent of ground traffic and congested-area challenges.
9. The swarm drone as claimed in claim 1, wherein it is stable in flight and can survive even at high temperature.
10. The swarm drone as claimed in claim 1, wherein it may easily enter a small house or closed building where big drones or conventional methods cannot reach.
11. The swarm drone as claimed in claim 1, wherein it can create real-time 3D maps/models of an area and explore different locations while making them live on the GCS at the same time.
12. The swarm drone as claimed in claim 1, wherein it is effective in high-rise buildings where conventional methods take more time and are inefficient.
13. The swarm drone as claimed in claim 1, wherein it has the capability of carrying and dropping a small payload, i.e. fire extinguisher capsules, on targeted locations where it can create maximum impact to save a number of lives.

Documents

Application Documents

# Name Date
1 202011032262-CLAIMS [10-10-2024(online)].pdf 2024-10-10
2 202011032262-PROVISIONAL SPECIFICATION [28-07-2020(online)].pdf 2020-07-28
3 202011032262-FORM FOR STARTUP [28-07-2020(online)].pdf 2020-07-28
4 202011032262-COMPLETE SPECIFICATION [10-10-2024(online)].pdf 2024-10-10
5 202011032262-FORM FOR SMALL ENTITY(FORM-28) [28-07-2020(online)].pdf 2020-07-28
6 202011032262-FER_SER_REPLY [10-10-2024(online)].pdf 2024-10-10
7 202011032262-OTHERS [10-10-2024(online)].pdf 2024-10-10
8 202011032262-FORM 1 [28-07-2020(online)].pdf 2020-07-28
9 202011032262-FORM 3 [11-06-2024(online)].pdf 2024-06-11
10 202011032262-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [28-07-2020(online)].pdf 2020-07-28
11 202011032262-FER.pdf 2024-04-10
12 202011032262-DRAWING [05-04-2021(online)].pdf 2021-04-05
13 202011032262-COMPLETE SPECIFICATION [05-04-2021(online)].pdf 2021-04-05
14 202011032262-AMMENDED DOCUMENTS [27-11-2023(online)].pdf 2023-11-27
15 202011032262-FORM-9 [21-05-2021(online)].pdf 2021-05-21
16 202011032262-FORM 13 [27-11-2023(online)].pdf 2023-11-27
17 FORM 1 NEW.pdf 2021-10-18
18 202011032262-FORM 18A [27-11-2023(online)].pdf 2023-11-27
19 202011032262-FORM-26 [27-11-2023(online)].pdf 2023-11-27
20 202011032262-STARTUP [27-11-2023(online)].pdf 2023-11-27
21 202011032262-FORM28 [27-11-2023(online)].pdf 2023-11-27
22 202011032262-POA [27-11-2023(online)].pdf 2023-11-27
23 202011032262-US(14)-HearingNotice-(HearingDate-12-12-2025).pdf 2025-11-21

Search Strategy

1 SearchstrategyE_01-02-2024.pdf
2 202011032262_SearchStrategyAmended_E_SearchStrategyAE_19-11-2025.pdf