
Unmanned Aerial Vehicle (UAV) for Crisis Management

Abstract: The present disclosure provides an unmanned aerial vehicle (UAV) (100) that includes at least one chassis (102); a plurality of motors (104-1, 104-2, 104-3, 104-4), each of the plurality of motors (104-1, 104-2, 104-3, 104-4) equipped with an electronic speed controller (ESC) (106) that is configured on the chassis (102) for regulating motor speed; a gimbal (108) configured on the chassis (102), wherein the gimbal (108) includes one or more image acquisition units (108A) for capturing stabilized visual data; and a flight controller (110) configured on the chassis (102) and in communication with the above components. Further, the UAV (100) includes a power source (112) for the power requirements of the UAV (100), a Power Distribution Board for current distribution, and one or more sensors, such as GPS and telemetry, to ensure precise navigation and data transmission to the flight controller (110) for assessing damage, finding people, and observing post-disaster occurrences.


Patent Information

Application #: 202441077912
Filing Date: 14 October 2024
Publication Number: 21/2025
Publication Type: INA
Invention Field: MECHANICAL ENGINEERING
Status:
Parent Application:

Applicants

Amrita Vishwa Vidyapeetham
Amrita Vishwa Vidyapeetham, Bengaluru Campus, Kasavanahalli, Carmelaram P.O., Bengaluru - 560035, Karnataka, India.

Inventors

1. POLU, Sai Nadh Reddy
7-829-1, Mangamoor Road, Viekara Colony 1st Line, Ongole Prakasam, Andhra Pradesh - 523001, India.
2. MATHI, Sri Chaithanya
12-1345, Road Number 14, Mahanadu, Sundaraiah Nagar, Tadepalli, Andhra Pradesh - 522501, India.
3. JAMBULA, Snehith Reddy
302, 4th Floor, Mitra Apartment, Near Neeru Pragati Park, Srinagar Colony, Anantapur, Andhra Pradesh - 515001, India.
4. YADAV, Sudha
16, Kiran Vatika, 4th Cross, SGR College Road, Veerappa Road, Marathahalli, Bengaluru, Karnataka - 560037, India.
5. POTTY, Syama Sankaranarayanan
483, Omkar, 4th Cross, 1st Phase, RK Township, Yarandahalli, Jigani, Bengaluru - 560105, Karnataka, India.

Specification

TECHNICAL FIELD
[0001] The present disclosure relates to unmanned aerial vehicles. In particular, the present disclosure relates to an Unmanned Aerial Vehicle (UAV) for crisis management.

BACKGROUND
[0002] In today's context of crisis management, the use of Unmanned Aerial Vehicles (UAVs), or drones, has proven revolutionary. UAVs have evolved from mere entertainment and military-related applications into essential tools in several civil domains including, but not limited to, natural disasters, search and rescue missions, and policing, as they can navigate to areas that may otherwise be inaccessible, offer real-time information, and are capable of functioning in dangerous situations without endangering human life.
[0003] UAVs, commonly referred to as drones, have become indispensable tools in crisis management scenarios. While UAVs initially gained prominence in military applications and recreational activities, their role has rapidly expanded into critical civil functions. Their ability to access remote or dangerous areas, provide real-time data, and operate without risking human lives has revolutionized their application in various emergency situations.
[0004] UAVs are now being employed in natural disaster relief efforts, search and rescue missions, and law enforcement operations. Their versatility enables them to navigate through terrains that may otherwise be inaccessible due to environmental or structural conditions, such as in the aftermath of earthquakes, floods, or fires. The real-time information provided by UAVs allows for timely decision-making, aiding in faster response and more effective management of resources during a crisis.
[0005] However, despite their growing utility, current UAV technologies still face several limitations. For example, their operational range, battery life, and payload capacities are often restricted, which can hinder their performance in extended missions. Additionally, integration with existing emergency response systems, data management, and communication infrastructures poses challenges that need to be addressed.
[0006] Therefore, there is a need to mitigate the above limitations by providing a UAV for crisis management.

OBJECTS OF THE PRESENT DISCLOSURE
[0007] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0008] A general object of the present disclosure is to provide an unmanned aerial vehicle (UAV) that enhances disaster management by quickly assessing damage and aiding in response operations.
[0009] An object of the present disclosure is to provide a UAV capable of detecting and locating individuals during search and rescue missions under diverse environmental conditions.
[0010] An object of the present disclosure is to provide a UAV that improves surveillance capabilities, contributing to public safety and effective crime prevention.
[0011] An object of the present disclosure is to provide a UAV that enables efficient monitoring of post-disaster scenarios to support recovery and relief efforts.
[0012] An object of the present disclosure is to provide a UAV that ensures operational effectiveness by detecting heat signatures and identifying entities through barriers.
[0013] An object of the present disclosure is to provide a UAV that supports real-time data collection and analysis, significantly enhancing decision-making during critical situations.
[0014] An object of the present disclosure is to provide a UAV that minimizes risks faced by rescue teams by performing remote assessments in hazardous environments.
[0015] An object of the present disclosure is to provide a UAV capable of reducing casualties, injuries, and property damage during emergency response activities.
[0016] An object of the present disclosure is to provide a UAV that enhances efficiency in handling natural disasters, industrial accidents, and criminal events.
[0017] An object of the present disclosure is to provide a UAV integrated with advanced technologies for seamless operation in crisis management scenarios.
[0018] An object of the present disclosure is to provide a UAV that promotes safer and more effective policing and law enforcement operations.
[0019] An object of the present disclosure is to provide a UAV that paves the way for future improvements in automated control, extended flight durations, and broader operational ranges.
[0020] An object of the present disclosure is to provide a UAV capable of bridging communication gaps through real-time connectivity with ground control systems.
[0021] An object of the present disclosure is to provide a UAV that facilitates compliance with data privacy and security requirements in critical operations.

SUMMARY
[0022] Aspects of the present disclosure relate to the field of unmanned aerial vehicles. In particular, the present disclosure relates to an Unmanned Aerial Vehicle (UAV) equipped with advanced imaging, sensing, and control systems for applications such as disaster management, search and rescue operations, surveillance, and post-disaster recovery.
[0023] An aspect of the present disclosure pertains to a UAV that includes a chassis to which multiple motors are coupled, each equipped with an electronic speed controller to regulate motor speed. A gimbal is attached to the chassis, incorporating image acquisition units such as high-definition cameras for capturing visual data and thermal cameras for capturing infrared emissions. The gimbal is stabilized by brushless motors to ensure steady image acquisition during flight. In addition, the UAV includes a flight controller operatively connected to the motors, the gimbal, and the image acquisition units. The flight controller consists of a learning module and a microcontroller. The learning module processes visual and thermal data using machine learning techniques to identify targets, including entities, objects, human silhouettes, and heat emissions. Also, the learning module classifies the visual and thermal data to prioritize targets based on type and urgency. The microcontroller receives mission data, including flight path, GPS coordinates, altitude, and ground clearance, integrates the data with identified targets to generate navigation and operational commands, and transmits mission and target data to a remote server.
[0024] In an aspect, the UAV may include sensors such as GPS, telemetry, gyroscope, barometer, and accelerometer, which are attached to the chassis to acquire and transmit flight parameters to the flight controller.
[0025] In an aspect, the UAV may include a radio receiver that enables bidirectional communication with the remote server, facilitating transmission and receipt of data for operational purposes.
[0026] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWINGS
[0027] The following drawings form part of the present disclosure and are included to further illustrate aspects of the present disclosure. The disclosure may be better understood by reference to the drawings in combination with the detailed description of the specific embodiments presented herein.
[0028] FIG. 1A illustrates an exemplary block diagram of an Unmanned Aerial Vehicle (UAV) for crisis management, according to embodiments of the present disclosure.
[0029] FIG. 1B illustrates an exemplary schematic view of the UAV for crisis management, according to embodiments of the present disclosure.
[0030] FIG. 2 illustrates an exemplary schematic view of a flight controller of the UAV, according to embodiments of the present disclosure.
[0031] FIGs. 3A to 3C illustrate exemplary schematic views of a gimbal and image acquisition units of the UAV, according to embodiments of the present disclosure.
[0032] FIG. 4A illustrates an exemplary view showing the UAV identifying humans with a square box, according to embodiments of the present disclosure.
[0033] FIG. 4B illustrates an exemplary view showing the UAV identifying humans with a thermal camera of the image acquisition unit, according to embodiments of the present disclosure.
[0034] FIG. 5 illustrates an exemplary representation of a block diagram illustrating a computing system, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[0035] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0036] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0037] Aspects of the present embodiments describe an Unmanned Aerial Vehicle (UAV) for crisis management.
[0038] UAVs in crisis management provide a bird's-eye view that enables an evaluation of the crisis location. The UAVs are highly developed drones with cameras and sensors that capture high-definition images and videos, which are essential for evaluating damage, identifying potential risks, and searching for survivors, among other applications. The integration of thermal imaging systems improves the functioning of the UAVs by allowing them to detect heat signatures through haze, fog, and dust, or in building collapses and forest fires. Moreover, a learning module for imaging helps UAVs process visual data with high accuracy to distinguish objects and human silhouettes in real time.
[0039] The learning module improves the analytical functions of UAVs. This way, by providing large amounts of data to the learning module, the UAVs can be trained to effectively detect patterns, categorize objects, and make determinations based on image data for aiding in the process of assessment of the damages and the identification of victims, which significantly increases the speed of actions and the effectiveness of interventions. For example, after a natural disaster like an earthquake, using the UAV equipped with the learning module, wide areas can be quickly mapped, buildings most likely to collapse can be detected, and trapped persons’ location can be determined.
[0040] According to an aspect, the UAV includes at least one chassis, a plurality of motors such that each of the plurality of motors is equipped with an electronic speed controller (ESC) that is configured on the chassis for regulating motor speed, a gimbal configured on the chassis, wherein the gimbal includes one or more image acquisition units having high-definition and thermal cameras for capturing stabilized visual data, and a flight controller in communication with the above components and configured on the chassis. Further, the UAV includes a power source, where the power source may be a portable Li-Po battery for power requirements of the UAV, a Power Distribution Board for current distribution, ESCs to regulate motor speed, and one or more sensors like Global Positioning System (GPS) and telemetry to ensure precise navigation and data transmission to the control station for assessing the damage, finding people, and observing post-disaster occurrences. The flight controller includes a learning module configured to identify entities and people besides detecting heat through barriers.
[0041] One or more visual images and information are brought to the rescue teams using the UAVs; the allocated resources are then focused, and the rescue operation is well-organized. This can minimize the time spent finding and evacuating victims and, subsequently, delivering medical aid, which contributes to saving lives and reducing the overall consequences of the catastrophe. Further, the use of UAVs is effective in increasing the safety of the search and rescue teams by allowing the teams to evaluate the situation from a secure zone before they enter the risk areas.
[0042] Police services may utilize drones for surveillance, crowd monitoring, and assessing criminal scenes. The practical use of UAVs in incidents like riots, terrorist attacks, or search operations can be considered as a tactical bonus as it helps to regain control and preserve evidence without endangering officers’ lives. Furthermore, UAVs with night vision and thermal vision can be used in surveillance at night and in conditions of limited visibility. Moreover, the UAVs for crisis management provide situational awareness capability in near real-time, richer images, and advanced analysis making them important tools in the emergency response systems.
[0043] Referring to FIGs. 1A to 4B, a UAV (hereinafter referred to as “UAV 100”) is shown that identifies one or more targets in a region of interest (ROI). The one or more targets may include, but are not limited to, entities, objects, human silhouettes, and heat emissions, enabling the UAV 100 to effectively perform tasks such as surveillance, search and rescue, and disaster management. The UAV 100 includes at least one chassis 102 on which a plurality of motors 104-1, 104-2, 104-3, 104-4 (interchangeably referred to as “motors 104” hereinafter), a gimbal 108, and a flight controller 110 are mounted. The chassis 102 of the UAV 100 serves as a structural framework, providing support and stability to its components and ensuring optimal weight distribution and durability for efficient flight operations. Each of the plurality of motors 104 is equipped with an electronic speed controller (ESC) 106 mounted on the chassis 102. The ESC 106 is responsible for regulating the speed of the motors 104 by precisely controlling the power supplied to them, ensuring stable and efficient flight performance. This allows the UAV 100 to execute precise maneuvers, maintain balance, and adapt to varying flight conditions.
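As a rough illustration of the ESC-mediated speed regulation described above, the following sketch shows how a flight controller might mix a throttle setting and attitude corrections into four per-motor commands for a quad-X layout; the function name, motor ordering, and the 1000-2000 µs PWM range are illustrative assumptions, not details from the disclosure.

```python
# Illustrative quad-X motor mixer: converts a throttle setting and
# roll/pitch/yaw corrections into four ESC commands. Names, motor
# ordering, and PWM range are assumptions, not from the patent.

def mix_quad_x(throttle, roll, pitch, yaw, pwm_min=1000, pwm_max=2000):
    """Return PWM values for motors 104-1..104-4 in a quad-X layout."""
    # Standard quad-X mixing: each motor sees the throttle plus signed
    # contributions of the three attitude corrections.
    mix = [
        throttle + roll + pitch - yaw,  # motor 104-1 (front-left, CW)
        throttle - roll + pitch + yaw,  # motor 104-2 (front-right, CCW)
        throttle - roll - pitch - yaw,  # motor 104-3 (rear-right, CW)
        throttle + roll - pitch + yaw,  # motor 104-4 (rear-left, CCW)
    ]
    # Clamp each command to the valid ESC pulse-width range.
    return [max(pwm_min, min(pwm_max, int(m))) for m in mix]

# Example: hover throttle with a small roll correction.
print(mix_quad_x(throttle=1500, roll=25, pitch=0, yaw=0))
```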
[0044] In an embodiment, the gimbal 108 includes one or more image acquisition units 108A that include, but are not limited to, at least one conventional camera, at least one high-definition camera, and at least one thermal camera. The cameras work together to capture stabilized visual and thermal data during flight, as illustrated in FIGs. 3A to 3C. The conventional camera captures basic visual data in the form of pixels, typically recording videos or still images, which are valuable for visual surveillance and object identification. The disclosed high-definition camera enhances this capability by providing higher resolution imagery, allowing for more detailed analysis of objects in the UAV’s field of view. The thermal camera operates based on heat emitted from objects or surfaces, recording infrared radiation to generate thermal data. The thermal data is visually represented using a color-coding scheme, enabling the identification of temperature variations across different objects or surfaces. The combination of these imaging technologies allows for both standard and thermal visualization, providing the UAV 100 with the ability to detect heat signatures in low visibility conditions, such as smoke or darkness.
[0045] In an embodiment, the gimbal 108 includes one or more brushless motors 108B that are configured to stabilize the image acquisition units 108A during flight. These brushless motors 108B allow for precise control of the camera orientation, ensuring that the cameras remain steady and maintain their focus on the designated area of interest. The stabilization is essential for obtaining clear and accurate visual and thermal data, particularly during turbulent flight conditions or when the UAV 100 is performing complex maneuvers. The brushless motors 108B provide smooth motion control, minimizing vibrations and shakiness, which are essential for reliable image and video capture.
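A minimal sketch of the stabilization idea, assuming the gimbal's sensing reports a tilt error for each axis: a proportional-derivative loop drives the brushless motor against the measured disturbance. The gains and interfaces here are hypothetical, not values from the disclosure.

```python
# Minimal proportional-derivative stabilization loop for one gimbal
# axis, assuming the gimbal's IMU reports the camera's tilt error in
# degrees (sensor and actuator interfaces are hypothetical).

class GimbalAxisPD:
    def __init__(self, kp=4.0, kd=0.8):
        self.kp, self.kd = kp, kd
        self.prev_error = 0.0

    def update(self, tilt_error_deg, dt):
        """Return a brushless-motor correction that opposes the error."""
        derivative = (tilt_error_deg - self.prev_error) / dt
        self.prev_error = tilt_error_deg
        # Drive the motor against the measured disturbance so the
        # camera holds its commanded orientation.
        return -(self.kp * tilt_error_deg + self.kd * derivative)

pitch_axis = GimbalAxisPD()
print(pitch_axis.update(tilt_error_deg=2.5, dt=0.01))
```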
[0046] In an embodiment, the flight controller 110 is in communication with the above components including the motors 104, the gimbal 108, and the image acquisition unit 108A. The visual and thermal data acquired by the image acquisition unit 108A is received by the flight controller 110 for further processing. The data is analyzed using a learning module 110A within the flight controller 110, which applies advanced mathematical algorithms and machine learning techniques to extract valuable information from the captured images. This allows the UAV 100 to identify objects, detect heat signatures, and recognize patterns, enabling more effective decision-making.
[0047] In an embodiment, the learning module 110A is configured to classify the received visual and thermal data, differentiating between the identified objects and prioritizing the targets based on type and urgency. This classification enables the UAV 100 to focus on the most critical targets first, enhancing its response capabilities. In an exemplary embodiment, the UAV 100 may include two or more of the disclosed cameras. This configuration increases the probability of obtaining accurate visual data by providing redundancy: if one camera malfunctions, the other can take over, ensuring uninterrupted video feeds and continuous monitoring during critical missions. This redundancy improves the reliability and performance of the UAV 100, especially in high-stakes or emergency situations.
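The classify-and-prioritize step could be sketched as follows, assuming an urgency table keyed by target type; the categories and weights are illustrative choices, not values from the disclosure.

```python
# Sketch of the classify-and-prioritize step: detected targets are
# ranked by an assumed urgency table so the most critical are
# handled first (categories and weights are illustrative).

from dataclasses import dataclass

URGENCY = {"human_silhouette": 3, "heat_emission": 2, "object": 1, "entity": 1}

@dataclass
class Target:
    kind: str          # e.g. "human_silhouette"
    confidence: float  # detector confidence in [0, 1]
    location: tuple    # (lat, lon) of the detection

def prioritize(targets):
    """Sort targets by urgency class, then by detector confidence."""
    return sorted(targets,
                  key=lambda t: (URGENCY.get(t.kind, 0), t.confidence),
                  reverse=True)

queue = prioritize([
    Target("object", 0.9, (12.90, 77.60)),
    Target("human_silhouette", 0.8, (12.91, 77.61)),
])
print([t.kind for t in queue])  # human_silhouette first
```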
[0048] In addition, a microcontroller 110B is attached to the flight controller 110, and the microcontroller 110B may include one or more processors that may be configured to execute machine-readable program instructions. Execution of the machine-readable program instructions by the processors may enable receiving the visual and thermal data to predict body temperatures of objects that may be identified as human, enhancing the UAV’s ability to conduct search and rescue operations. Among other capabilities, the processor may fetch and execute machine-readable/processor-executable instructions in a memory (not shown) operationally coupled with the flight controller 110 for performing tasks such as data processing, input/output processing, feature extraction, and/or any other functions. Any reference to a task in the present disclosure may refer to an operation being performed, or that may be performed, on data.
[0049] The memory may store one or more machine-readable/processor-executable instructions or routines, which may be fetched and executed to control the UAV 100. In some embodiments, the memory may include any non-transitory storage device including, for example, volatile memory such as Random Access Memory (RAM), or non-volatile memory such as an Erasable Programmable Read-Only Memory (EPROM), flash memory, and the like.
[0050] The microcontroller 110B may be configured to receive mission data to control the UAV 100. This mission data may include any or a combination of flight path, GPS coordinates, altitude information, ground clearance parameters, or the like. Once the mission data is received, the microcontroller 110B processes the mission data with the identified one or more targets to generate navigation and operational commands to move the UAV 100 to a location of the one or more targets. These commands ensure that the UAV 100 moves efficiently to the location of the identified targets, optimizing its path and conserving energy for critical tasks. Further, the microcontroller 110B transmits the received visual and thermal data of the identified one or more objects, and the mission data to a remote server (not shown). This transmission enables remote servers or control systems to access real-time data for analysis, decision-making, and coordination of subsequent actions.
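As an illustration of integrating mission data with an identified target, the sketch below computes the great-circle distance and initial bearing from the UAV's current GPS fix to a target fix using the standard haversine formulas; the resulting "command" format is an assumption.

```python
# Hedged sketch of how the microcontroller might turn a target's GPS
# fix into a navigation command: great-circle distance and initial
# bearing from the UAV's current position (formulas are standard;
# the command format is an assumption).

import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Haversine distance (m) and initial bearing (deg) between fixes."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

# Example: fly from the current fix to an identified target.
d, b = distance_and_bearing(12.9000, 77.6000, 12.9050, 77.6080)
print(f"fly {d:.0f} m on heading {b:.0f} deg")
```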
[0051] In an embodiment, one or more sensors 114 can be attached to the at least one chassis 102 of the UAV 100, configured to acquire one or more parameters. The sensors 114 may include, but are not limited to, at least one GPS sensor, at least one telemetry sensor, at least one gyroscope sensor, at least one barometer, and at least one accelerometer. These parameters encompass a range of flight-critical and environmental factors that are essential for the UAV’s operation. Depending on the mission or application, the sensors 114 acquire specific parameters that ensure the UAV 100 can adapt to its surroundings, maintain stability, and perform its tasks effectively.
[0052] For example, the GPS sensor provides parameters such as geographic coordinates and altitude, enabling precise navigation. The telemetry sensor gathers communication-related parameters, including signal strength and data transmission rates, to facilitate effective interaction between the UAV 100 and ground control. The gyroscope sensor measures angular velocity, and the accelerometer provides data on acceleration forces acting on the UAV 100. The barometer records atmospheric pressure to determine altitude changes.
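A minimal sketch of how such readings might be fused, assuming a simple complementary filter that blends the gyroscope's integrated rate with the accelerometer's gravity-referenced angle to estimate pitch; the blend gain is an assumed value.

```python
# Minimal complementary filter: fuses the gyroscope's angular rate
# with the accelerometer's gravity-referenced angle to estimate
# pitch, as a flight controller typically does (gain assumed).

import math

def complementary_pitch(prev_pitch, gyro_rate_dps, ax, az, dt, alpha=0.98):
    """Blend integrated gyro rate with the accelerometer pitch angle."""
    accel_pitch = math.degrees(math.atan2(ax, az))   # angle from gravity
    gyro_pitch = prev_pitch + gyro_rate_dps * dt     # short-term integration
    # Trust the gyro over short intervals, the accelerometer long-term.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

pitch = 0.0
pitch = complementary_pitch(pitch, gyro_rate_dps=1.2, ax=0.05, az=0.99, dt=0.01)
print(round(pitch, 3))
```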
[0053] These parameters are continuously transferred to the flight controller 110, which processes the data to make real-time adjustments. This ensures smooth and steady flight, allowing the UAV 100 to adapt dynamically to changing conditions or mission requirements. The ability to monitor and respond to these parameters enhances the UAV’s reliability, efficiency, and effectiveness in various applications.
[0054] In an embodiment, the UAV 100 can further include a radio receiver configured on the chassis 102 of the UAV 100 for demodulating signals from the microcontroller 110B or a remote server on the ground and passing them to the ESCs 106 to control the motion of the UAV 100; the signals can be further analyzed in the ArduPilot software to enable control of the UAV 100.
[0055] In an embodiment, the flight controller 110 can act as a control center for the UAV 100 as shown in FIG. 1A. The flight controller 110 can include an Inertial Measurement Unit (IMU) (not shown), that processes parameters from sensors 114 such as gyroscopic and accelerometer sensors to determine position, orientation, and speed of the UAV 100. The flight controller 110 can sense, communicate, and control the flight of the UAV 100. The flight controller 110 can include a Pixhawk Processor as shown in FIG. 2 that can be in communication with all the sensors 114 of the UAV 100 and telemetry sub-systems. The UAV 100 can include magnetometers and barometers among sensors 114 in communication with the flight controller 110. The flight controller 110 can have an additional advantage of having a fold-out compass incorporated into the graphic on the front. Further, the flight controller 110 can be capable of handling telemetry systems gathering data automatically from the sensors 114, and returning the acquired data to the processor using communication protocols.
[0056] In an embodiment, the image acquisition unit 108A fixed on the gimbal 108 can acquire optical data as video or record a particular area of interest, as shown in FIGs. 4A and 4B. The flight controller 110 can receive data from the image acquisition unit 108A, which can be analyzed by the learning module 110A to determine objects. The learning module 110A can predict the precise location of humans upon analyzing the feed from the captured visual data. The pixels from the video can be transformed through calculations to provide system data that can be broken down and used for various applications. The stabilizing mechanism of the gimbal 108 provides a smooth capturing angle. The UAV 100 includes sensors 114 to eliminate additional, undesirable, but inevitable mechanical movements that disturb video capture during flight. The UAV 100 can receive inputs from the gimbal 108 on the movements of the image acquisition unit 108A to distinguish intended movements from vibrations that may occur. The gimbal 108 includes one or more brushless motors 108B to move the camera in response to flight conditions for capturing stabilized aerial videos. The microcontroller 110B can be configured to control various camera features including focus, exposure, and the like. Further, the integration of the YOLO module with OpenCV displays information on temperature, structure, and the identity of detected objects to the user of the UAV 100, such that the user, upon considering the displayed information, can opt for protective actions or deploy government special forces in case of suspicious activity.
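The YOLO-with-OpenCV step described above might look like the following sketch, which uses OpenCV's DNN module with Darknet-format YOLO files to outline detected people with green boxes; the file names and thresholds are assumptions (class 0 in the COCO dataset is "person").

```python
# Sketch of the YOLO-over-OpenCV pipeline described above: detect
# people in a frame and outline them with green boxes. The weight,
# config, and image file names are assumptions.

import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
frame = cv2.imread("aerial_frame.jpg")
h, w = frame.shape[:2]

blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())

# Each output row is [cx, cy, w, h, objectness, 80 class scores].
for row in np.vstack(outputs):
    scores = row[5:]
    if np.argmax(scores) == 0 and scores[0] > 0.5:  # confident "person"
        cx, cy, bw, bh = row[:4] * [w, h, w, h]
        x, y = int(cx - bw / 2), int(cy - bh / 2)
        cv2.rectangle(frame, (x, y), (x + int(bw), y + int(bh)), (0, 255, 0), 2)

cv2.imwrite("detections.jpg", frame)
```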
[0057] In an embodiment, the thermal camera of the image acquisition unit 108A can detect and record infrared emissions from objects targeted during capture. Objects with higher temperatures can be depicted in the warmer spectrum of colours, more specifically shades of yellow and orange, and objects with lower temperatures in blue and purple; the intensity of the colours can depend on the temperature of the object. The thermal camera is compact and lightweight, which benefits the design of the UAV 100 without compromising its weight. The raw data obtained from the thermal camera images is analysed by the learning module 110A.
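A small sketch of this colour-coding, assuming the thermal camera's raw output is available as a per-pixel temperature array: the values are normalized and rendered with an OpenCV colour map that runs from blue (cool) through yellow and red (warm). The random array stands in for real sensor output.

```python
# Illustrative rendering of thermal data with the colour scheme
# described above: warmer pixels toward yellow/red, cooler ones
# toward blue. The temperature array is a stand-in for the thermal
# camera's raw output.

import cv2
import numpy as np

temps_c = np.random.uniform(15.0, 40.0, size=(120, 160)).astype(np.float32)

# Normalize temperatures to 0-255 so OpenCV can map them to colours.
norm = cv2.normalize(temps_c, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

# COLORMAP_JET runs blue -> green -> yellow -> red with temperature.
colour = cv2.applyColorMap(norm, cv2.COLORMAP_JET)
cv2.imwrite("thermal_view.png", colour)
```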
[0058] In an embodiment, the microcontroller 110B of the flight controller 110 can be equipped with mission planner software for controlling the UAV 100, where the flight path of the UAV 100 or the mission data can be downloaded/analyzed for the generation of autopilot commands and fed to the UAV 100. The mission data can include several parameters like pitch angle, GPS coordinates, altitude, ground clearance, and speed. The microcontroller 110B additionally includes a GPS module (not shown) and a telemetry sensor (not shown), where the telemetry sensor can gather positional and electrical data and transmit it wirelessly to the microcontroller 110B to aid in the generation of autopilot commands.
[0059] In an embodiment, the flight controller 110 can be configured to enable the UAV 100 to navigate automatically without constant supervision using the ArduPilot software. The user or host administrator of the UAV 100 can set the flight path, comprising survey points, an end point, and autopilot commands, as inputs to the flight controller 110. Using the image acquisition unit 108A, the UAV 100 can perform reconnaissance and send all the findings to the microcontroller 110B for better decision-making by the user. The microcontroller 110B can be controlled by a joystick, the remote server, or the control station on the ground by the user of the UAV 100.
[0060] In an embodiment, the UAV 100 includes a power source 112 that is a portable lithium-polymer (Li-Po) battery, chosen for its high energy density and lightweight nature. This works in conjunction with a Power Distribution Board (PDB) to ensure efficient distribution of electrical current to all UAV components. The ESCs 106, which are powered by the PDB, regulate the speed of the motors 104 for precise control of the UAV's movement.
[0061] In an embodiment, the Li-Po battery can be a portable voltaic source of electric current resulting from chemical energy stored within its cells. The electric current can be channeled through the Power Distribution Board (PDB), depending on the voltage and electric current required, to all the above components of the UAV 100. The ESCs 106, which are powered by the PDB, control the speed and functionality of the motors 104. The electrical energy from the PDB is transformed into mechanical energy for movement of the propellers, allowing the UAV 100 to pitch, throttle, yaw, and roll during flight.
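For a back-of-envelope sense of the power budget, the sketch below estimates flight time from pack capacity and average current draw through the PDB; all figures here are illustrative assumptions, not values from the disclosure.

```python
# Back-of-envelope flight-time estimate from the Li-Po pack and the
# current drawn through the PDB; all numbers are illustrative
# assumptions, not figures from the disclosure.

def flight_time_minutes(capacity_mah, avg_current_a, usable_fraction=0.8):
    """Usable capacity divided by average draw, in minutes."""
    usable_ah = (capacity_mah / 1000.0) * usable_fraction
    return usable_ah / avg_current_a * 60.0

# e.g. a 5200 mAh pack feeding four motors drawing ~20 A total:
print(round(flight_time_minutes(5200, 20.0), 1), "minutes")  # ~12.5
```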
[0062] In an exemplary implementation, in an event of a large-scale natural disaster, such as an earthquake in a densely populated urban area, the proposed UAV 100 can be deployed to enhance response efficiency and effectiveness. The UAV 100 captures visual and thermal data while flying over the disaster zone. The learning module 110A processes this data in real-time, applying machine learning to identify critical targets, such as trapped individuals, damaged structures, or areas of high thermal activity. By prioritizing these targets based on urgency, the UAV 100 ensures that life-threatening situations are addressed promptly. The microcontroller 110B integrates mission data, including flight paths and GPS coordinates, enabling the UAV 100 to autonomously navigate to critical locations while adapting to evolving conditions.
[0063] The UAV 100 transmits real-time data to the remote server, enabling emergency teams to analyze and act swiftly. Thermal imagery aids in locating survivors, while high-definition visuals assess structural damage and identify safe zones. Acting as a communications relay when infrastructure is down, the UAV 100 ensures seamless connectivity. Further, sensors 114 and reliable power source 112 support extended operations, reducing risks to human responders and expediting resource allocation. This UAV 100 exemplifies a transformative tool in disaster management, minimizing casualties and property damage while enhancing the speed and efficiency of emergency response.

EXPERIMENTAL RESULTS
[0064] In an embodiment, the working of the image acquisition unit 108A and the YOLO module can be assessed by receiving one or more inputs from the high-resolution real-time video footage captured by the Unmanned Aerial Vehicle 100 in the course of its mission. An input is produced in the form of videos captured using the conventional and thermal cameras, and the videos are then transferred to a computer that can run the YOLO module. A test computer can be used for the evaluations; in a real-life scenario, it would be replaced by the flight controller 110. The YOLO module searches the footage for objects and draws outlines over them, as shown in FIGs. 4A and 4B.
[0065] In an embodiment, the YOLO module can be capable of object recognition based on the OpenCV library. The CNNs automatically discern patterns in the video feed from the cameras and overlay the potential subjects with bounding boxes. The YOLO module can be trained with identification datasets; the identification software can be fed with human forms from the conventional camera fixed on the gimbal 108, using features such as texture, shape/structure, joints, and faces, among others. The learning module 110A is further trained with thermal datasets to analyze the temperature obtained from the thermal camera and then checks for the presence of a human by verifying whether the temperature lies between 36 degrees Celsius and 37 degrees Celsius, allowing for a slight local variation within a range of about 2 degrees Celsius.
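The temperature test described above might be sketched as follows, assuming a per-pixel temperature array and a detection bounding box are available; the tolerance value is an assumption.

```python
# Sketch of the thermal check described above: the mean temperature
# inside a detection's bounding box is tested against the nominal
# human range, with a small tolerance for local variation.

import numpy as np

def looks_human(temps_c, box, low=36.0, high=37.0, tolerance=1.0):
    """True if the mean box temperature falls in the widened range."""
    x, y, w, h = box
    region = temps_c[y:y + h, x:x + w]
    mean_t = float(np.mean(region))
    return (low - tolerance) <= mean_t <= (high + tolerance)

temps = np.full((120, 160), 22.0)            # ambient scene
temps[40:60, 50:70] = 36.6                   # warm, person-sized region
print(looks_human(temps, (50, 40, 20, 20)))  # True
```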
[0066] The YOLO module can divide the input image into a grid and predict bounding boxes and class probabilities for objects in each grid area. The output can be a set of bounding boxes and associated class probabilities. A normal image can be converted into grayscale before being analyzed by the YOLO module to find or identify the object.
[0067] Additionally, the image acquisition unit 108A can classify the objects as human or non-human based on the identification of human-like traits using its pre-existing, as well as continually updated, databases, where the image acquisition unit 108A marks humans identified using the standard camera with a green square box. The same detection and analysis can be performed by the learning module 110A on the input from the thermal camera’s footage, the output creating a bounding box on the objects as shown in FIGs. 4A and 4B.
[0068] The learning module 110A can categorize all the objects distinguished by their temperature. If the temperature of an object falls within the above-specified range, the captured object can be classified as a human being; objects whose emitted heat falls outside this range are not. Additionally, a double-proofing method is used to verify the presence of people: the system not only detects the presence of a human based on his or her physical appearance, but also uses the average temperature range of a human being, thus increasing the accuracy of detection.
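A minimal version of this double-proofing step, assuming the appearance detector's confidence and the region's mean temperature are available as inputs: a detection counts as a person only when both checks agree. The thresholds are assumptions.

```python
# Minimal version of the double-proofing step: a detection counts as
# a person only when both the appearance-based detector and the
# thermal range test agree (inputs and thresholds are assumed).

def confirm_person(visual_conf, mean_temp_c,
                   conf_threshold=0.5, t_low=35.0, t_high=38.0):
    """Require visual confidence AND a human-like temperature."""
    visually_human = visual_conf >= conf_threshold
    thermally_human = t_low <= mean_temp_c <= t_high
    return visually_human and thermally_human

print(confirm_person(0.82, 36.5))  # True: both checks agree
print(confirm_person(0.82, 55.0))  # False: too hot (e.g. fire debris)
```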
[0069] In an embodiment, one or more grayscale images captured by a normal camera, annotated with a square box as shown in FIG. 4A, can confirm human presence as well as initiate further actions if necessary. The grayscale images are used for the detection of humans. Computer vision and image processing can be employed for pre-processing, where the pixel values of a normal image are normalized to obtain a grayscale image. The two operations of edge detection and texture analysis are then employed to enhance the main areas of interest.
[0070] Further, object detection and recognition techniques can be applied to identify humans. The dataset is trained using the learning module 110A along with pre-trained deep learning models to handle variation and improve accuracy. The data is filtered, thresholding is applied, noise is eliminated, and the data is segmented into relevant regions. Subsequently, the captured image or video is acquired via OpenCV and processed using computer vision. The image processing works by matching, identifying, and comparing features extracted from the input against a database of human characteristics, or by using machine learning algorithms. Thus, grayscale images, together with a suitably trained learning module 110A, can be effective in detecting humans.
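The pre-processing chain described above (grayscale conversion, normalization, edge detection) might be sketched with OpenCV as follows; the input file name and Canny thresholds are assumptions.

```python
# Sketch of the pre-processing chain described above: convert to
# grayscale, normalize pixel values, then run edge detection to
# enhance regions of interest (parameter values are assumptions).

import cv2

frame = cv2.imread("aerial_frame.jpg")

# Convert the colour frame to a single-channel grayscale image.
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Normalize the pixel values to use the full 0-255 range.
gray = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX)

# Canny edge detection highlights structural outlines for the
# downstream matching and recognition stages.
edges = cv2.Canny(gray, threshold1=100, threshold2=200)
cv2.imwrite("edges.png", edges)
```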
[0071] Referring to FIG. 5, an exemplary computer system 500 is disclosed that includes an external storage device 510, a bus 520, a main memory 530, a read-only memory 540, a mass storage device 550, a communication port 560, and a processor 570. Those skilled in the art will appreciate that a computer system can include more than one processor and communication port. Processor 570 can include various modules associated with embodiments of the present disclosure. Communication port 560 can be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. Communication port 560 can be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computer system connects.
[0072] Memory 530 can be Random Access Memory (RAM) or any other dynamic storage device commonly known in the art. Read-only memory 540 can be any static storage device(s), e.g., but not limited to, Programmable Read-Only Memory (PROM) chips for storing static information, e.g., start-up or BIOS instructions for processor 570. Mass storage 550 can be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces), one or more optical discs, Redundant Array of Independent Disks (RAID) storage, etc.
[0073] Bus 520 communicatively couples processor(s) 570 with the other memory, storage, and communication blocks. Bus 520 can be, e.g., a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), USB, or the like, for connecting expansion cards, drives, and other subsystems, as well as other buses, such as a front side bus (FSB), which connects processor 570 to the software system.
[0074] Optionally, operator and administrative interfaces, e.g., a display, keyboard, and a cursor control device, can also be coupled to bus 520 to support direct operator interaction with the computer system. Other operator and administrative interfaces can be provided through network connections connected through communication port 560. External storage device 510 can be any kind of external hard drive, floppy drive, IOMEGA® Zip Drive, Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), or Digital Video Disk - Read Only Memory (DVD-ROM). The components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.
[0075] While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the scope of the disclosure.

ADVANTAGES OF THE PRESENT DISCLOSURE
[0076] The present disclosure provides an unmanned aerial vehicle (UAV) designed to enhance disaster management by rapidly assessing damage and assisting in response operations.
[0077] The present disclosure provides a UAV capable of efficiently detecting and locating individuals during search and rescue missions across diverse environmental conditions.
[0078] The present disclosure provides a UAV that strengthens surveillance operations, contributing to improved public safety and effective crime prevention.
[0079] The present disclosure provides a UAV that enables precise monitoring of post-disaster scenarios to aid in recovery and relief efforts.
[0080] The present disclosure provides a UAV that achieves operational effectiveness by detecting heat signatures and identifying targets through barriers.
[0081] The present disclosure provides a UAV that supports real-time data acquisition and analysis, enabling enhanced decision-making during critical situations.
[0082] The present disclosure provides a UAV that reduces risks to rescue teams by conducting remote assessments in hazardous environments.
[0083] The present disclosure provides a UAV capable of minimizing casualties, injuries, and property damage during emergency responses.
[0084] The present disclosure provides a UAV that improves efficiency of managing natural disasters, industrial accidents, and criminal incidents.
[0085] The present disclosure provides a UAV integrated with advanced technologies for streamlined operation in crisis management scenarios.
[0086] The present disclosure provides a UAV that facilitates safer and more effective policing and law enforcement activities.
[0087] The present disclosure provides a UAV that sets the foundation for future advancements in automated control, extended operational ranges, and improved flight durations.
CLAIMS
1. An unmanned aerial vehicle (UAV) (100) comprising:
at least one chassis (102);
a plurality of motors (104-1, 104-2, 104-3, 104-4), wherein each motor (104-1, 104-2, 104-3, 104-4) is equipped with an electronic speed controller (ESC) (106) and coupled to the at least one chassis (102) to regulate motor speed;
a gimbal (108) coupled to the at least one chassis (102), the gimbal (108) comprising one or more image acquisition units (108A); and
a flight controller (110) attached to the at least one chassis (102) and operatively coupled to the plurality of motors (104-1, 104-2, 104-3, 104-4), the gimbal (108), and the one or more image acquisition units (108A), the flight controller (110) comprising:
a learning module (110A) configured to:
receive visual data and thermal data from the one or more image acquisition units (108A); and
apply one or more machine learning techniques to the received visual and thermal data to identify one or more targets; and
a microcontroller (110B) configured to:
receive mission data to control the UAV (100);
integrate the mission data with the identified one or more targets to generate navigation and operational commands to move the UAV (100) to a location of the one or more targets; and
transmit the received visual and thermal data of the identified one or more targets, and the mission data to a remote server.
2. The UAV (100) as claimed in claim 1, wherein the one or more targets comprise any or a combination of: entities, objects, human silhouettes, and heat emissions.

3. The UAV (100) as claimed in claim 1, wherein the mission data comprises any or a combination of flight path, GPS coordinates, altitude information, and ground clearance parameters.

4. The UAV (100) as claimed in claim 1, wherein the learning module (110A) is further configured to:
classify the received visual and thermal data to differentiate between the identified one or more targets, and correspondingly prioritize the identified one or more targets based on at least one of type and urgency.

5. The UAV (100) as claimed in claim 1, wherein the one or more image acquisition units (108A) comprise at least one high-definition camera to capture the visual data.

6. The UAV (100) as claimed in claim 1, wherein the one or more image acquisition units (108A) comprise at least one thermal camera to capture infrared emissions from the one or more targets.

7. The UAV (100) as claimed in claim 1, wherein the gimbal (108) comprises one or more brushless motors (108B) to stabilize the image acquisition units (108A) during flight.

8. The UAV (100) as claimed in claim 1, further comprising one or more sensors (114) attached to the at least one chassis (102) and configured to acquire one or more parameters, wherein the acquired one or more parameters are transmitted to the flight controller (110).

9. The UAV (100) as claimed in claim 8, wherein the one or more sensors (114) comprise any or a combination of a Global Positioning System (GPS) sensor, a telemetry sensor, a gyroscope sensor, a barometer, and an accelerometer.

10. The UAV (100) as claimed in claim 1, further comprising a radio receiver operatively coupled to the flight controller (110) and configured to enable bidirectional communication between the UAV (100) and the remote server to transmit the data.

Documents

Application Documents

# Name Date
1 202441077912-STATEMENT OF UNDERTAKING (FORM 3) [14-10-2024(online)].pdf 2024-10-14
2 202441077912-PROVISIONAL SPECIFICATION [14-10-2024(online)].pdf 2024-10-14
3 202441077912-FORM FOR SMALL ENTITY(FORM-28) [14-10-2024(online)].pdf 2024-10-14
4 202441077912-FORM 1 [14-10-2024(online)].pdf 2024-10-14
5 202441077912-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [14-10-2024(online)].pdf 2024-10-14
6 202441077912-EVIDENCE FOR REGISTRATION UNDER SSI [14-10-2024(online)].pdf 2024-10-14
7 202441077912-EDUCATIONAL INSTITUTION(S) [14-10-2024(online)].pdf 2024-10-14
8 202441077912-DRAWINGS [14-10-2024(online)].pdf 2024-10-14
9 202441077912-DECLARATION OF INVENTORSHIP (FORM 5) [14-10-2024(online)].pdf 2024-10-14
10 202441077912-Proof of Right [08-11-2024(online)].pdf 2024-11-08
11 202441077912-FORM-26 [13-01-2025(online)].pdf 2025-01-13
12 202441077912-FORM-5 [12-05-2025(online)].pdf 2025-05-12
13 202441077912-DRAWING [12-05-2025(online)].pdf 2025-05-12
14 202441077912-CORRESPONDENCE-OTHERS [12-05-2025(online)].pdf 2025-05-12
15 202441077912-COMPLETE SPECIFICATION [12-05-2025(online)].pdf 2025-05-12
16 202441077912-OTHERS [14-05-2025(online)].pdf 2025-05-14
17 202441077912-FORM-9 [14-05-2025(online)].pdf 2025-05-14
18 202441077912-FORM-8 [14-05-2025(online)].pdf 2025-05-14
19 202441077912-FORM 18 [14-05-2025(online)].pdf 2025-05-14
20 202441077912-EDUCATIONAL INSTITUTION(S) [14-05-2025(online)].pdf 2025-05-14