
System And Method Implemented In An Unmanned Aerial Vehicle For Automated Emergency Assistance And Evacuation

Abstract: The present disclosure relates to a system (100) implemented in an Unmanned Aerial Vehicle (UAV) (102) for automated emergency assistance and evacuation. The system (100) includes an image acquisition unit (112) to capture images of injured individuals, a colour sensor (114) to detect wound colouration and environmental colour, and at least one LED strip (116) that adjusts emitted light to blend with surroundings, enhancing UAV (102) stealth. A controller (130) analyzes images using a pre-trained machine learning model to classify injuries based on type and location, further assessing severity by combining classification results with wound colouration data. Upon determining that the severity exceeds a predefined threshold, a stretcher mechanism (118) is activated and lowered to evacuate the injured. The system (100) enables prioritized evacuation by categorizing injuries into critical, moderate, or minor conditions, offering rapid, intelligent, and autonomous response capabilities in battlefield, disaster, or remote rescue operations.


Patent Information

Application # 202541046921
Filing Date: 15 May 2025
Publication Number: 22/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

Amrita Vishwa Vidyapeetham
Amrita Vishwa Vidyapeetham, Bengaluru Campus, Kasavanahalli, Carmelaram P.O., Bengaluru - 560035, Karnataka, India.

Inventors

1. GEORGE, Angelina
116, 4th Cross, Sundar Nagar, Gokula, Bengaluru - 560054, Karnataka, India.
2. JOSE, Alphonsa
#238, Phase 2, Daadys Southberg Layout, Kammasandra, Electronic City - Phase II, Bengaluru - 560100, Karnataka, India.
3. VIJJAPU, Aditya
#26, Sankalp, 12th Main, Muthyalanagar, Bengaluru - 560054, Karnataka, India.
4. D K, Niranjan
No. 578, Sri Virupaksha Nilaya, 15th Main, 4th Block, Nandini Layout, Bengaluru - 560096, Karnataka, India.
5. ANGAMUTHU ARULMANI, Nippun Kumaar
Flat No. 401, Aishwarya Bangalore Homes, 2nd Main, KPC Layout, Kasavanahalli, Bengaluru - 560035, Karnataka, India.

Specification

Description:

TECHNICAL FIELD
[0001] The present invention relates to the field of unmanned aerial systems, and more particularly to a system and method implemented in an unmanned aerial vehicle (UAV) for providing automated emergency assistance and evacuation.

BACKGROUND
[0002] The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Unmanned aerial vehicles (UAVs), commonly referred to as drones, have increasingly been utilized across diverse fields such as surveillance, mapping, disaster management, and search-and-rescue operations. In military and emergency response settings, UAVs have demonstrated significant potential for operations such as real-time surveillance, health monitoring, and most notably, casualty evacuation from high-risk or inaccessible zones.
[0004] Recent technological advancements have enabled UAVs to support complex tasks including object detection, autonomous navigation, collision avoidance, and distributed task management. These capabilities have enhanced the utility of UAVs in critical scenarios, offering faster response times and minimizing human exposure to danger.
[0005] Despite these advancements, several challenges persist in the effective use of UAVs for emergency assistance and casualty evacuation. One major issue lies in the lack of integration of stealth mechanisms such as adaptive digital camouflage, which is crucial for operations in hostile or sensitive areas where visibility could compromise mission success. Furthermore, existing systems often lack sophisticated methods to analyze the condition of injured individuals and to prioritize evacuation based on injury severity, which can lead to suboptimal resource utilization and delayed medical response.
[0006] Additionally, the mechanical designs for current UAV-based evacuation systems often fail to consider optimized payload deployment mechanisms, especially in terms of real-time environmental adaptation and stable descent control for carrying injured individuals.
[0007] There is, therefore, a need for an improved UAV-based system that overcomes the aforementioned limitations, enhances operational stealth, assesses injury severity more accurately, and provides efficient casualty evacuation in diverse emergency scenarios.

OBJECTS OF THE PRESENT DISCLOSURE
[0008] A general object of the present disclosure is to provide a UAV-based system capable of autonomously detecting injured individuals in emergency environments.
[0009] An object of the present disclosure is to provide a solution for accurately assessing injury severity using image data and wound colouration.
[0010] Another object of the present disclosure is to provide an automated stretcher deployment mechanism for efficient casualty evacuation without manual input.
[0011] Another object of the present disclosure is to provide precise ground distance measurement using ultrasonic sensing to ensure safe stretcher deployment.
[0012] Another object of the present disclosure is to provide dynamic environmental camouflage through LED colour adjustment based on sensed surroundings.
[0013] Another object of the present disclosure is to provide machine learning-based classification of injuries for faster and more reliable triage decisions.
[0014] Another object of the present disclosure is to categorize injured individuals for prioritized evacuation.
[0015] Another object of the present disclosure is to provide emergency medical assistance in remote or hazardous areas where human access is limited.
[0016] Another object of the present disclosure is to provide an integrated UAV platform combining sensing, decision-making, and physical aid delivery.
[0017] Another object of the present disclosure is to provide improved response time and survival rates during battlefield, disaster, or natural calamity scenarios.

SUMMARY
[0018] An aspect of the present disclosure pertains to the field of unmanned aerial systems, and more particularly to a system and method implemented in an unmanned aerial vehicle (UAV) for providing automated emergency assistance and evacuation. The disclosure identifies injured individuals, assesses injury severity, and assists in their evacuation in critical environments such as disaster zones, warfields, or remote locations.
[0019] An aspect of the present disclosure pertains to a system implemented in an unmanned aerial vehicle for delivering emergency assistance and facilitating evacuation. The system includes an image acquisition unit configured to capture one or more images of an injured individual within a designated area. A colour sensor is integrated to detect wound colouration that indicates the severity of injuries and also to receive environmental colour data from the surroundings. At least one LED strip is operatively connected to the colour sensor and adjusts its emitted light to match the environmental colour for visual blending with the surroundings. The system also comprises a controller communicatively linked to the image acquisition unit and the colour sensor. This controller is operatively connected to a memory storing processor-executable instructions. Upon execution, these instructions enable the controller to analyze the captured images using a pre-trained machine learning model, classify the injuries by type and location, and determine the severity of the injuries based on both the classification results and wound colouration data. If the severity exceeds a defined threshold, the controller activates a stretcher mechanism to descend from the unmanned aerial vehicle and assist in evacuating the injured individual.
[0020] In an aspect, the pre-trained machine learning model is further configured to identify specific injury patterns and classify the injuries into various severity levels based on spatial wound features.
[0021] In an aspect, the controller also analyzes tonal variations in the wound colouration data to determine indicators such as bleeding intensity, bruising, or oxygen deficiency. These variations are compared against a predefined dataset of colour-severity mappings stored in memory. Based on this comparison and the injury classification, a severity score is computed using weighted scoring. The injured individual is further categorized into critical, moderate, or minor conditions for prioritized evacuation.
[0022] In an aspect, to enhance camouflage, the controller receives environmental colour data and controls at least one LED strip to dynamically adjust the emitted colour to match the surroundings, thereby achieving visual concealment.
[0023] In an aspect, the system includes an ultrasonic sensor connected to the controller. This sensor measures the distance between the unmanned aerial vehicle and the ground to aid in the safe deployment of the stretcher mechanism.
[0024] In an aspect, a gear motor is used to activate and control the vertical movement of the stretcher mechanism. The controller regulates this motor based on distance data from the ultrasonic sensor, ensuring accurate and stable deployment of the stretcher.
[0025] Another aspect of the present disclosure relates to a method implemented in an unmanned aerial vehicle (UAV) for emergency assistance and evacuation. The method includes capturing one or more images of an injured individual in a designated location using an onboard image acquisition unit. A colour sensor detects the colouration of the wound, which serves as an indicator of injury severity, and also captures environmental colour data from the surrounding area. Based on the environmental colour data, at least one LED strip adjusts its emitted colour to visually match the surroundings for camouflage purposes. The captured images are analyzed by a controller using a pre-trained machine learning model stored in memory. The controller classifies the injuries shown in the images into categories based on their type and location. Using the classification results and the wound colouration data, the controller determines the severity level of the injury. If this severity exceeds a predetermined threshold, the controller activates a stretcher mechanism attached to the unmanned aerial vehicle to descend and assist in evacuating the injured person.

BRIEF DESCRIPTION OF DRAWINGS
[0026] The accompanying drawings are included to provide a further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The drawings are for illustration only and thus do not limit the present disclosure.
[0027] FIG. 1 illustrates an exemplary network architecture of the proposed system implemented in an unmanned aerial vehicle (UAV) for emergency assistance and evacuation, in accordance with an embodiment of the present invention.
[0028] FIG. 2 illustrates an exemplary block diagram of the proposed system implemented in the UAV for emergency assistance and evacuation, in accordance with an embodiment of the present invention.
[0029] FIG. 3 illustrates functional units of a processor associated with the proposed system for emergency assistance and evacuation, in accordance with an embodiment of the present invention.
[0030] FIG. 4 illustrates a flow diagram of the proposed method for emergency assistance and evacuation, in accordance with an embodiment of the present invention.
[0031] FIG. 5 illustrates an exemplary computer system in which or with which embodiments of the present disclosure are utilized in accordance with some embodiments of the present disclosure.

DETAILED DESCRIPTION OF THE PRESENT INVENTION
[0032] The following is a detailed description of embodiments of the disclosure represented in the accompanying drawings. The disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0033] An embodiment of the present disclosure pertains to the field of unmanned aerial systems, and more particularly to a system and method implemented in an unmanned aerial vehicle (UAV) for providing automated emergency assistance and evacuation.
[0034] An embodiment of the present disclosure pertains to a system implemented in an unmanned aerial vehicle for delivering emergency assistance and facilitating evacuation. The system includes an image acquisition unit configured to capture one or more images of an injured individual within a designated area. A colour sensor is integrated to detect wound colouration that indicates the severity of injuries and also to receive environmental colour data from the surroundings. At least one LED strip is operatively connected to the colour sensor and adjusts its emitted light to match the environmental colour for visual blending with the surroundings. The system also comprises a controller communicatively linked to the image acquisition unit and the colour sensor. This controller is operatively connected to a memory storing processor-executable instructions. Upon execution, these instructions enable the controller to analyze the captured images using a pre-trained machine learning model, classify the injuries by type and location, and determine the severity of the injuries based on both the classification results and wound colouration data. If the severity exceeds a defined threshold, the controller activates a stretcher mechanism to descend from the unmanned aerial vehicle and assist in evacuating the injured individual.
[0035] In an embodiment, the pre-trained machine learning model is further configured to identify specific injury patterns and classify the injuries into various severity levels based on spatial wound features.
[0036] In an embodiment, the controller also analyzes tonal variations in the wound colouration data to determine indicators such as bleeding intensity, bruising, or oxygen deficiency. These variations are compared against a predefined dataset of colour-severity mappings stored in memory. Based on this comparison and the injury classification, a severity score is computed using weighted scoring. The injured individual is further categorized into critical, moderate, or minor conditions for prioritized evacuation.
[0037] In an embodiment, to enhance camouflage, the controller receives environmental colour data and controls the LED strip to dynamically adjust the emitted colour to match the surroundings, thereby achieving visual concealment.
[0038] In an embodiment, the system includes an ultrasonic sensor connected to the controller. This sensor measures the distance between the unmanned aerial vehicle and the ground to aid in the safe deployment of the stretcher mechanism.
[0039] In an embodiment, a gear motor is used to activate and control the vertical movement of the stretcher mechanism. The controller regulates this motor based on distance data from the ultrasonic sensor, ensuring accurate and stable deployment of the stretcher.
[0040] Another aspect of the present disclosure relates to a method for emergency assistance and evacuation implemented in the UAV, as carried out by the system described above.
[0041] Referring to FIGs. 1 and 2, an exemplary network architecture of the proposed system 100 implemented in an unmanned aerial vehicle (UAV) 102, and an exemplary view of the UAV 102, respectively, for emergency assistance and evacuation are disclosed. The system 100 assists during emergency situations such as on the battlefield, in disaster zones, or in remote rescue locations. The UAV 102 includes a first propeller 104-1, a second propeller 104-2, a third propeller 104-3, and a fourth propeller 104-4, as shown in FIG. 2. Each propeller is individually attached to a corresponding direct current (DC) motor 106-1, 106-2, 106-3, and 106-4. These DC motors generate the required thrust for flight, enabling the UAV 102 to hover, maneuver, and maintain stability during operations. The system 100 further includes a set of rechargeable batteries 108-1, 108-2, 108-3, and 108-4, with each battery supplying power to its respective motor. This configuration allows for balanced power distribution and enhances operational reliability, especially during critical missions such as battlefield casualty evacuation or rescue in disaster-stricken areas.
[0042] The proposed system 100 includes an image acquisition unit 112 configured on the UAV 102 to acquire one or more images of injured individuals within a target location. The image acquisition unit 112 may include at least one camera, such as a visible spectrum camera, infrared camera, or multispectral imaging device, capable of capturing real-time visual data of the environment and injured individual.
[0043] In addition, the system 100 further includes a colour sensor 114 attached to the UAV 102. The colour sensor 114 is configured to perform dual functions. First, colour sensor 114 detects wound colouration on the injured individual, which is indicative of injury severity. This may include identifying shades associated with bleeding, bruising, or discolouration caused by oxygen deficiency. Such tonal variations in wound colouration serve as indicators for medical triage and assist in estimating the urgency of medical intervention. Second, the colour sensor 114 is configured to receive environmental colour data from surroundings of the UAV 102. This data reflects dominant colour tones of the environment in which the UAV is operating, such as terrain, vegetation, battlefield, or urban structures.
[0044] In an embodiment, at least one LED strip 116 is attached to the UAV 102 and operatively connected to the colour sensor 114. The LED strip 116 is configured to adjust its emitted colour in real time to match the detected surrounding environmental colour. This capability allows the UAV 102 to visually blend into its environment, enhancing stealth and reducing visibility during emergency operations. Such dynamic camouflage is particularly advantageous in combat zones, disaster sites, or remote rescue missions where concealment of UAV presence is desirable. For instance, the UAV 102 may camouflage itself on the battlefield by dynamically changing its colour, such as adapting to shades of green, brown, or grey, using the LED strip 116 controlled by readings from a TCS3200 colour sensor 114, allowing the UAV 102 to blend seamlessly with the surrounding environment.
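By way of illustration only, a minimal Python sketch of the colour-matching behaviour described above is given below. It assumes a TCS3200-style sensor exposing an RGB reading and an RGB LED strip driven from the controller 130; the helper functions read_tcs3200_rgb and set_led_strip_colour are hypothetical placeholders (simulated here), not part of the disclosed hardware interface.

```python
import time

def read_tcs3200_rgb():
    """Hypothetical stand-in for the TCS3200 colour sensor (114).

    A real implementation would select the red/green/blue photodiode filters
    via the sensor's control pins and time the output frequency per channel.
    A fixed sandy tone is returned here purely for illustration.
    """
    return (194, 178, 128)

def set_led_strip_colour(rgb):
    """Hypothetical stand-in for driving the LED strip (116), e.g. via PWM."""
    print(f"LED strip set to RGB {rgb}")

def camouflage_loop(cycles=3, poll_interval_s=1.0):
    """Periodically match the emitted LED colour to the sensed surroundings."""
    for _ in range(cycles):
        environment_rgb = read_tcs3200_rgb()   # dominant colour of the environment
        set_led_strip_colour(environment_rgb)  # blend the UAV 102 with the scene
        time.sleep(poll_interval_s)

if __name__ == "__main__":
    camouflage_loop()
```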
[0045] In an embodiment, the system 100 includes a stretcher mechanism 118 that is activated by a gear motor 120, both attached to the UAV 102. The activation depends on the distance measured by an ultrasonic sensor 122. The ultrasonic sensor 122 is also attached to the UAV 102 and measures the distance between the UAV 102 and the ground. This measured distance assists the system 100 in deciding when it is safe to deploy the stretcher mechanism 118, ensuring that the stretcher mechanism 118 is released only when the UAV 102 is at the correct altitude for safe and effective deployment.
[0046] In an embodiment, the system 100 includes a controller 130 operatively connected to the image acquisition unit 112, the colour sensor 114, the LED strip 116, the gear motor 120, and the ultrasonic sensor 122. The image acquisition unit 112 provides high-resolution input to the controller 130, enabling detection, tracking, and classification of injuries. The controller 130 analyses the images using a pre-trained machine learning model to assess the type, location, and severity of injuries based on both visual data and wound colouration received from the colour sensor 114. If the severity exceeds a predefined threshold, the controller 130 activates the stretcher mechanism 118 to descend from the UAV 102 for evacuation of the injured individual. Further, the controller 130 is configured to display the detected colour of the surroundings and the accuracy of the machine learning model on a display device 124. The display device 124 is attached to the UAV 102 and is operatively coupled to the controller 130. The display device 124 can be a liquid crystal display (LCD), light-emitting diode (LED) screen, organic LED (OLED) screen, or any other suitable visual interface configured to present real-time data including environmental colour information, wound severity classification, UAV status, and machine learning model performance metrics.
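The disclosure does not specify the architecture of the pre-trained machine learning model. Purely as a sketch of how the controller 130 might classify an acquired image by injury type, the example below uses a generic torchvision backbone with a replaced classification head; the class labels and model choice are assumptions for illustration, not the model actually claimed.

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Hypothetical injury categories; the disclosure does not enumerate the classes.
INJURY_CLASSES = ["laceration", "burn", "bruise", "no_injury"]

# Standard ImageNet-style preprocessing for the backbone chosen below.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_injury(image_path: str, model: torch.nn.Module) -> str:
    """Return the predicted injury class for one acquired image."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)   # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return INJURY_CLASSES[int(logits.argmax(dim=1))]

# A MobileNetV2 backbone stands in for the unspecified pre-trained model; its
# final layer is replaced to emit one logit per assumed injury class.
model = models.mobilenet_v2(weights="IMAGENET1K_V1")
model.classifier[1] = torch.nn.Linear(model.last_channel, len(INJURY_CLASSES))
model.eval()
```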
[0047] Referring to FIG. 3, exemplary functional units of the controller 130 associated with the proposed system 100 are disclosed. The controller 130 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the controller 130 may be configured to fetch and execute computer-readable instructions stored in a memory 304. The memory 304 may store one or more computer-readable instructions or routines, which may be fetched and executed for analysing images and providing automated emergency assistance. The memory 304 may include any non-transitory storage device including, for example, volatile memory such as Random Access Memory (RAM), or non-volatile memory such as an Erasable Programmable Read-Only Memory (EPROM), flash memory, and the like.
[0048] In an embodiment, the controller 130 may also include an interface(s) 306. The interface(s) 306 may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as Input/Output (I/O) devices, storage devices, and the like. The interface(s) 306 may provide a communication pathway for one or more components of the UAV 102. Examples of such components include, but are not limited to, processing engine(s) 308 and a database 310.
[0049] In an embodiment, the processing engine(s) 308 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 308. In other embodiments, the processing engine(s) 308 may be implemented by electronic circuitry. The database 310 may include data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 308. In some embodiments, the processing engine(s) 308 may include an image analysis module 312, a camouflage adaptation module 314, a stretcher deployment and control module 316, and other module(s) 318. The other module(s) 318 may implement functionalities that supplement applications/functions performed by the system 100.
[0050] In an embodiment, the image analysis module 312 is configured to receive high-resolution images of the injured individual within the target location, captured by the image acquisition unit 112. These images are analysed using a pre-trained machine learning model configured to detect injury patterns and classify them based on spatial features such as size, shape, and location of the wound. Simultaneously, the colour sensor 114 detects wound colouration, which is indicative of injury severity. The colour sensor 114 also captures environmental colour data to support camouflage functionality. The wound colouration data is analysed to identify tonal variations that may reflect bleeding intensity, bruising, or signs of oxygen deficiency. The image analysis module 312 compares these tonal variations with a predefined colour-severity mapping dataset stored in the memory 304. Based on this analysis, the image analysis module 312 computes a severity score by combining the classified injury type and wound colouration using a weighted scoring method. This severity score is further used to categorise the injured individual into one of three severity levels: critical, moderate, or minor, enabling prioritised and informed evacuation decisions.
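A minimal Python sketch of the weighted scoring and triage categorisation described for the image analysis module 312 is given below. The per-type scores, colour-severity mappings, weights, and thresholds are illustrative assumptions only; the disclosure does not specify their values.

```python
# Illustrative lookups standing in for the colour-severity mapping dataset
# stored in the memory 304; actual values are not disclosed.
INJURY_TYPE_SCORE = {"laceration": 0.7, "burn": 0.8, "bruise": 0.3}
COLOUR_SEVERITY_MAP = {"dark_red": 0.9, "purple_blue": 0.6, "pink": 0.2}

W_TYPE, W_COLOUR = 0.6, 0.4  # assumed weighted-scoring coefficients

def severity_score(injury_type: str, wound_colour: str) -> float:
    """Combine the classified injury type with wound colouration by weighted sum."""
    return (W_TYPE * INJURY_TYPE_SCORE.get(injury_type, 0.0)
            + W_COLOUR * COLOUR_SEVERITY_MAP.get(wound_colour, 0.0))

def triage_category(score: float, critical: float = 0.75, moderate: float = 0.4) -> str:
    """Map a severity score onto the three disclosed categories."""
    if score >= critical:
        return "critical"
    if score >= moderate:
        return "moderate"
    return "minor"

score = severity_score("laceration", "dark_red")   # 0.6*0.7 + 0.4*0.9 = 0.78
print(score, triage_category(score))               # prints approximately 0.78 and "critical"
```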
[0051] For instance, when the proposed system 100 is utilized on the battlefield, the UAV 102 hovers over a wounded soldier lying on rocky terrain. The image acquisition unit 112 captures detailed visuals of the injury, such as a deep leg wound. The colour sensor 114 detects dark red colouration, indicating active bleeding. The image analysis module 312 analyses the image and wound tone, classifies the injury as severe, and assigns a high severity score.
[0052] In an embodiment, the camouflage adaptation module 314 is configured to receive environmental colour data from the colour sensor 114. Based on this data, the camouflage adaptation module 314 controls the LED strip 116 to dynamically adjust the emitted colour to closely match the surrounding environment. This real-time colour adaptation helps visually camouflage the UAV 102, making it less detectable during sensitive operations. For instance, during a battlefield mission in a desert region, the UAV may hover over sandy terrain while assessing injured soldiers. The colour sensor 114 detects shades of beige and light brown in the surroundings. In response, the camouflage adaptation module 314 processes this data and instructs the LED strip to emit similar sandy hues. This blending allows the UAV 102 to operate discreetly, avoiding visual detection by enemy forces and increasing mission safety and effectiveness.
[0053] In an embodiment, the stretcher deployment and control module 316 is configured to activate the stretcher mechanism 118 attached to the UAV 102 when the severity level of the detected injury (identified by the image analysis module 312) exceeds a predefined threshold. The stretcher mechanism 118 is configured to descend from the UAV 102 to enable safe and effective evacuation of the injured individual. The deployment process is managed by the gear motor 120, which controls the lowering of the stretcher. The stretcher deployment and control module 316 regulates the gear motor 120 based on real-time distance measurements provided by the ultrasonic sensor 122. This ensures that the stretcher is only deployed when the UAV 102 is at a safe and appropriate altitude above the ground, preventing damage to the mechanism or injury to the individual.
[0054] For instance, the UAV 102 may identify a critically injured soldier lying on rough terrain after an explosion. The controller 130 assesses the injury as life-threatening, and the severity score exceeds the threshold for evacuation. The controller 130 then activates the stretcher mechanism 118. As the UAV 102 descends, the ultrasonic sensor 122 continuously measures the distance from the ground. When the UAV 102 reaches a suitable height, the gear motor 120 is triggered to lower the stretcher gently, allowing the casualty to be secured and evacuated safely.
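Purely as a sketch, the descent-gated deployment logic described above could look as follows in Python; the safe-deployment altitude, the simulated sensor readings, and the helper function names are assumptions for illustration rather than disclosed values or interfaces.

```python
SAFE_DEPLOY_ALTITUDE_M = 1.5   # assumed threshold; not specified in the disclosure

def simulated_ultrasonic_readings():
    """Hypothetical stand-in for the ultrasonic sensor (122): yields the
    ground distance in metres as the UAV 102 descends."""
    for distance_m in (4.0, 3.2, 2.4, 1.6, 1.2):
        yield distance_m

def lower_stretcher():
    """Hypothetical stand-in for driving the gear motor (120) that lowers the stretcher."""
    print("Gear motor engaged; stretcher mechanism (118) descending")

def deploy_when_safe():
    """Release the stretcher only once the measured ground distance is safe."""
    for distance_m in simulated_ultrasonic_readings():
        print(f"Ground distance: {distance_m:.1f} m")
        if distance_m <= SAFE_DEPLOY_ALTITUDE_M:
            lower_stretcher()
            return True
    return False   # a safe altitude was never reached; do not deploy

if __name__ == "__main__":
    deploy_when_safe()
```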
[0055] Referring to FIG. 4, a method 400 for emergency assistance and evacuation, implemented in an unmanned aerial vehicle (UAV) 102 is disclosed. At step 402, the method 400 includes acquiring, by an image acquisition unit 112 configured on the UAV 102, one or more images of an injured individual within a target location.
[0056] Continuing further, at step 404, the method 400 includes detecting, by a colour sensor 114 configured on the UAV 102, wound colouration data indicative of the severity of the injury.
[0057] Continuing further, at step 406, the method 400 includes receiving, by the colour sensor 114, environmental colour data from the surroundings of the UAV 102.
[0058] Continuing further, at step 408, the method 400 includes adjusting, by at least one LED strip 116 operatively connected to the colour sensor 114, an emitted colour to match the surrounding environmental colour based on the received data, thereby enabling the UAV 102 to visually camouflage with the environment.
[0059] Continuing further, at step 410, the method 400 includes analysing, by a controller 130 configured in the UAV 102, the one or more acquired images using a pre-trained machine learning model stored in a memory 304.
[0060] Continuing further, at step 412, the method 400 includes classifying, by the controller 130, the acquired images into one or more categories based on the type and location of the injury. The controller 130 further analyses the wound colouration data to detect tonal variations indicative of bleeding intensity, bruising, or oxygen deficiency. The detected tonal variations are compared with a predefined colour-severity mapping dataset stored in memory. A severity score is computed by combining the classified injury type and the wound colouration analysis using a weighted scoring mechanism. Based on the severity score, the injured individual is categorised into critical, moderate, or minor conditions for prioritised evacuation.
[0061] Continuing further, at step 414, the method 400 includes determining, by the controller 130, whether the computed severity level exceeds a predefined threshold.
[0062] Continuing further, at step 416, the method 400 includes activating, by the controller 130, a stretcher mechanism 118 attached to the UAV 102 if the threshold is exceeded. An ultrasonic sensor 122 configured on the UAV 102 measures the distance between the UAV 102 and the ground. Based on the measured distance, the controller 130 assists in deploying the stretcher mechanism 118 at an appropriate height. The method 400 further includes activating a gear motor 120 configured to control the stretcher mechanism 118, and regulating the operation of the gear motor 120 based on the distance measured by the ultrasonic sensor 122.
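The individual steps above map onto the component behaviours sketched earlier. As a purely illustrative summary of the control flow of method 400 (the callable names, threshold, and dummy values below are assumptions, not disclosed interfaces), the overall sequence could be expressed as:

```python
def method_400(acquire_image, read_wound_colour, read_environment_colour,
               set_led_colour, classify, score, deploy_stretcher,
               severity_threshold=0.75):
    """High-level orchestration of steps 402-416; each callable stands in for
    the corresponding component (all names here are illustrative)."""
    image = acquire_image()                        # step 402: image acquisition
    wound_colour = read_wound_colour()             # step 404: wound colouration
    set_led_colour(read_environment_colour())      # steps 406-408: camouflage
    injury_type = classify(image)                  # steps 410-412: ML classification
    severity = score(injury_type, wound_colour)    # step 412: weighted severity score
    if severity > severity_threshold:              # step 414: threshold check
        deploy_stretcher()                         # step 416: stretcher deployment
    return severity

# Minimal demo with dummy callables, purely to show the control flow.
severity = method_400(
    acquire_image=lambda: "frame.jpg",
    read_wound_colour=lambda: "dark_red",
    read_environment_colour=lambda: (120, 110, 80),
    set_led_colour=lambda rgb: None,
    classify=lambda img: "laceration",
    score=lambda injury, colour: 0.78,
    deploy_stretcher=lambda: print("stretcher mechanism (118) deployed"),
)
print("severity:", severity)
```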
[0063] FIG. 5 illustrates a block diagram of an example computer system 500 in which or with which embodiments of the present disclosure may be implemented.
[0064] As shown in FIG. 5, the computer system 500 may include an external storage device 510, a bus 520, a main memory 530, a read-only memory 540, a mass storage device 550, communication port(s) 560, and a processor 570. A person skilled in the art will appreciate that the computer system 500 may include more than one processor and communication ports. The processor 570 may include various modules associated with embodiments of the present disclosure. The communication port(s) 560 may be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fibre, a serial port, a parallel port, or other existing or future ports. The communication port(s) 560 may be chosen depending on a network, such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which the computer system 500 connects. The main memory 530 may be random access memory (RAM), or any other dynamic storage device commonly known in the art. The read-only memory 540 may be any static storage device(s) including, but not limited to, Programmable Read-Only Memory (PROM) chips for storing static information, e.g., start-up or basic input/output system (BIOS) instructions for the processor 570. The mass storage device 550 may be any current or future mass storage solution, which may be used to store information and/or instructions.
[0065] The bus 520 communicatively couples the processor 570 with the other memory, storage, and communication blocks. The bus 520 can be, e.g., a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), universal serial bus (USB), or the like, for connecting expansion cards, drives, and other subsystems, as well as other buses, such as a front side bus (FSB), which connects the processor 570 to the computer system 500.
[0066] Optionally, operator and administrative interfaces, e.g. a display, keyboard, and cursor control device, may also be coupled to the bus 520 to support direct operator interaction with the computer system 500. Other operator and administrative interfaces may be provided through network connections connected through the communication port(s) 560. In no way should the aforementioned exemplary computer system 500 limit the scope of the present disclosure.
[0067] Thus, the present disclosure provides the system 100 and method 400 for UAV-based emergency assistance and evacuation, enabling injury detection, severity assessment, adaptive camouflage, and safe stretcher deployment based on real-time sensor data and machine learning analysis.
[0068] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions, or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE PRESENT DISCLOSURE
[0069] The present disclosure provides a UAV-based system capable of autonomously detecting injured individuals in emergency environments.
[0070] The present disclosure provides a solution for accurately assessing injury severity using image data and wound colouration.
[0071] The present disclosure provides an automated stretcher deployment mechanism for efficient casualty evacuation without manual input.
[0072] The present disclosure provides precise ground distance measurement using ultrasonic sensing to ensure safe stretcher deployment.
[0073] The present disclosure provides dynamic environmental camouflage through LED colour adjustment based on sensed surroundings.
[0074] The present disclosure provides machine learning-based classification of injuries for faster and more reliable triage decisions.
[0075] The present disclosure categorizes injured individuals for prioritized evacuation.
[0076] The present disclosure provides emergency medical assistance in remote or hazardous areas where human access is limited.
[0077] The present disclosure provides an integrated UAV platform combining sensing, decision-making, and physical aid delivery.
[0078] The present disclosure provides improved response time and survival rates during battlefield, disaster, or natural calamity scenarios.
Claims:

1. A system (100) implemented in an unmanned aerial vehicle (UAV) (102) for emergency assistance and evacuation, the system (100) comprising:
an image acquisition unit (112) configured on the UAV (102) to acquire one or more images of an injured individual within a target location;
a colour sensor (114) configured on the UAV (102) to:
detect wound colouration indicative of injury severity, and
receive environmental colour data from surroundings;
at least one LED strip (116) operatively connected to the colour sensor (114), configured to adjust emitted colour to match the surrounding environmental colour based on the environmental colour data received from the colour sensor (114);
a controller (130) in communication with the image acquisition unit (112) and the colour sensor (114), and the controller (130) is operatively coupled to a memory (304), wherein the memory (304) comprises one or more processor-executable instructions which, when executed, cause the controller (130) to:
analyze the received one or more images using a pre-trained machine learning model;
classify the received one or more images into one or more categories based on type and location of injury;
determine a severity level of the injury based on the classification and the wound colouration data received from the colour sensor; and
activate a stretcher mechanism (118) attached to the UAV (102) upon determining that the severity level of the injury exceeds a predefined threshold, wherein the stretcher mechanism (118) is configured to descend from the UAV (102) to facilitate evacuation of the injured individual.
2. The system (100) as claimed in claim 1, wherein the pre-trained machine learning model is configured to identify patterns of the injury and classify the injuries into severity levels based on spatial wound features.
3. The system (100) as claimed in claim 1, wherein the controller (130) is further configured to:
analyze the wound colouration data to detect tonal variations indicative of at least one of: bleeding intensity, bruising, or oxygen deficiency;
compare the detected tonal variations with a predefined colour-severity mapping dataset stored in the memory;
compute a severity score by combining the classified injury type and the wound colouration analysis using weighted scoring; and
categorize the injured individual into critical, moderate, or minor condition, for prioritized evacuation.
4. The system (100) as claimed in claim 1, wherein the controller (130) is further configured to receive the environmental colour data from the colour sensor (114) and control the at least one LED strip (116) to dynamically adjust the emitted colour to match the detected surrounding environmental colour, thereby enabling visual camouflage of the UAV (102).
5. The system (100) as claimed in claim 1, further comprising an ultrasonic sensor (122) connected to the controller (130) and configured to measure distance between the UAV (102) and ground to assist in deployment of the stretcher mechanism (118).
6. The system (100) as claimed in claim 5, wherein the stretcher mechanism (118) is activated by a gear motor (120), and the controller (130) regulates the gear motor (120) for deployment of the stretcher mechanism (118), based on the distance measured by the ultrasonic sensor (122).
7. A method (400) implemented in an unmanned aerial vehicle (UAV) for emergency assistance and evacuation, the method comprising:
acquiring (402), by an image acquisition unit configured on the UAV, one or more images of an injured individual within a target location;
detecting (404), by a colour sensor configured on the UAV, wound colouration data indicative of severity of the injury of the injured individual;
receiving (406), by the colour sensor, environmental colour data from surroundings;
adjusting (408), by at least one LED strip operatively connected to the colour sensor, an emitted colour to match surrounding environmental colour based on the environmental colour data received;
analyzing (410), by a controller, the received one or more images using a pre-trained machine learning model stored in a memory;
classifying (412), by the controller, the received one or more images into one or more categories based on type and location of injury;
determining (414), by the controller, a severity level of the injury based on the classification and the wound colouration data received from the colour sensor; and
activating (416), by the controller, a stretcher mechanism attached to the UAV upon determining that the severity level of the injury exceeds a predefined threshold, wherein the stretcher mechanism is configured to descend from the UAV to facilitate evacuation of the injured individual.
8. The method (400) as claimed in claim 7, further comprising:
analyzing, by the controller, the wound colouration data to detect tonal variations indicative of at least one of: bleeding intensity, bruising, or oxygen deficiency;
comparing, by the controller, the detected tonal variations with a predefined colour-severity mapping dataset stored in a memory;
computing, by the controller, a severity score by combining the classified injury type and the wound colouration analysis using a weighted scoring mechanism; and
categorizing, by the controller, the injured individual into one of a critical, moderate, or minor condition for prioritized evacuation.
9. The method (400) as claimed in claim 7, further comprising:
measuring, by an ultrasonic sensor connected to the controller, a distance between the UAV and ground; and
assisting, by the controller, in deployment of the stretcher mechanism based on the measured distance.
10. The method (400) as claimed in claim 7, further comprising:
activating, by the controller, a gear motor configured to control the deployment of the stretcher mechanism; and
regulating, by the controller, operation of the gear motor based on the distance between the UAV and the ground measured by the ultrasonic sensor.

Documents

Application Documents

# Name Date
1 202541046921-STATEMENT OF UNDERTAKING (FORM 3) [15-05-2025(online)].pdf 2025-05-15
2 202541046921-REQUEST FOR EXAMINATION (FORM-18) [15-05-2025(online)].pdf 2025-05-15
3 202541046921-REQUEST FOR EARLY PUBLICATION(FORM-9) [15-05-2025(online)].pdf 2025-05-15
4 202541046921-FORM-9 [15-05-2025(online)].pdf 2025-05-15
5 202541046921-FORM FOR SMALL ENTITY(FORM-28) [15-05-2025(online)].pdf 2025-05-15
6 202541046921-FORM 18 [15-05-2025(online)].pdf 2025-05-15
7 202541046921-FORM 1 [15-05-2025(online)].pdf 2025-05-15
8 202541046921-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [15-05-2025(online)].pdf 2025-05-15
9 202541046921-EVIDENCE FOR REGISTRATION UNDER SSI [15-05-2025(online)].pdf 2025-05-15
10 202541046921-EDUCATIONAL INSTITUTION(S) [15-05-2025(online)].pdf 2025-05-15
11 202541046921-DRAWINGS [15-05-2025(online)].pdf 2025-05-15
12 202541046921-DECLARATION OF INVENTORSHIP (FORM 5) [15-05-2025(online)].pdf 2025-05-15
13 202541046921-COMPLETE SPECIFICATION [15-05-2025(online)].pdf 2025-05-15
14 202541046921-FORM-26 [13-08-2025(online)].pdf 2025-08-13
15 202541046921-Proof of Right [05-09-2025(online)].pdf 2025-09-05