
Autonomous Aerial Vehicle For Determining One Or More Adverse Events And Method Thereof

Abstract: The present invention discloses an autonomous aerial vehicle for determining one or more adverse events and method thereof. The autonomous aerial vehicle (100) comprises one or more sensors (102), a control unit (104), a docking unit (106), a docking station (108), and one or more cameras (110). The one or more sensors (102) are configured to generate sensor data to identify adverse events. The control unit (104) processes the sensor data and navigates the autonomous aerial vehicle (100) to the location of the detected one or more adverse events. The one or more cameras (110) are adapted to capture visual information at the event location. The control unit (104) communicates with the docking station (108), a communication device (124), and a cloud database (126) for alerts, data storage, and analysis. The autonomous aerial vehicle (100) enables real-time monitoring, efficient navigation, and communication with remote individuals or command centres. FIG. 1


Patent Information

Filing Date: 11 July 2023
Publication Number: 05/2025
Publication Type: INA
Invention Field: COMMUNICATION

Applicants

Amerald Care Private limited
162, First Floor, Ferns Habitat, Mahadevpura ORR, Bangalore -560037, Karnataka, India

Inventors

1. Ravi Chandar Vangara
162, First Floor, Ferns Habitat, Mahadevpura ORR, Bangalore -560037, Karnataka, India
2. Vipin Vangara
162, First Floor, Ferns Habitat, Mahadevpura ORR, Bangalore -560037, Karnataka, India
3. Neeli Sridevi
162, First Floor, Ferns Habitat, Mahadevpura ORR, Bangalore -560037, Karnataka, India

Specification

Description: FIELD OF INVENTION

[0001]Embodiments of the present invention relate to aerial vehicles, and more particularly relate to an autonomous aerial vehicle for determining one or more adverse events within a monitoring area.

BACKGROUND

[0002]There is a growing need for effective monitoring and detection of adverse events within specific areas in various industries and applications. These adverse events can include fires, smoke, gas leaks, vehicular accidents and other hazardous situations requiring immediate attention and response. Timely detection and response to such events are crucial to ensure the safety of human beings, animals, and property.

[0003]Traditionally, manual monitoring and response mechanisms have been employed, which are often time-consuming, labour-intensive, and have limitations in terms of coverage and accuracy. To overcome these limitations, unmanned aerial vehicles have been developed for monitoring and surveillance purposes. These vehicles are equipped with sensors, cameras, and intelligent systems to autonomously navigate through designated areas and detect potential adverse events.

[0004]Existing unmanned aerial vehicles have made significant advancements in terms of sensor integration, navigation capabilities, and data processing. However, they still have several drawbacks and limitations that hinder their effectiveness and efficiency in detecting and responding to adverse events. One of the drawbacks of existing unmanned aerial vehicles is the limited range and variety of sensors integrated into their systems. Many unmanned aerial vehicles rely on basic sensors such as cameras, GPS, and obstacle avoidance sensors. While these sensors provide valuable information, they may not be sufficient for comprehensive event detection and monitoring which limits their ability to accurately identify and assess adverse events.

[0005]Another limitation of existing unmanned aerial vehicles is the reliance on pre-defined algorithms or simple rule-based systems for data analysis and decision-making. These unmanned aerial vehicles lack sophisticated analysis capabilities to interpret sensor data and visual information effectively. As a result, there are inaccuracies in event detection and severity assessment, leading to delays or false alarms. The response capabilities of existing unmanned aerial vehicles are also limited. While these vehicles are able to detect adverse events, their ability to navigate, under a remote controller, to the specific destinations corresponding to the detected events is inadequate. This limitation reduces their effectiveness in providing targeted interventions and timely responses.

[0006]Another drawback of existing unmanned aerial vehicles is the lack of provisions for connecting with external sensors positioned within the monitoring area. These external sensors could provide valuable additional data for accurate event detection and response. The absence of integration with external sensors restricts the overall monitoring capabilities of existing autonomous aerial vehicles. Safety considerations are crucial when operating unmanned aerial vehicles in populated areas. Existing systems also have limited provisions for supporting human beings involved in vehicular accidents, for example by alerting police, ambulance services, and insurance companies, and for navigating safely around them. These limitations compromise the safety of both the vehicle and the surrounding environment, especially in complex or crowded scenarios.

[0007]In the existing technology, an unmanned aerial vehicle (UAV) early-warning system for fire rescue is disclosed. The system consists of an unmanned aerial vehicle equipped with a rotational pan-tilt mechanism. A fire monitoring device is mounted on the rotational pan tilt, which includes a processing unit, a flame sensor, a flying control unit, a smoke sensor, and a positioning unit. The flame sensor detects infrared radiation emitted by flames within a fire zone and converts it into a far infrared signal. The processing unit analyses the intensity of the far infrared signal to determine the behaviour trend of the fire. The processing unit then activates the flying control unit to manoeuvre the UAV to hover above the fire zone. The system also utilizes the smoke sensor to measure smoke concentration and direction within the fire zone. The processing unit uses this information to determine the spreading direction of the fire. Additionally, the positioning unit is employed to determine the geographic position information of the fire zone.

[0008]However, there are some disadvantages associated with this UAV. The prior art primarily focuses on flame and smoke detection and does not mention integration with other sensors that could enhance the system's overall effectiveness. The rotational pan-tilt mechanism in the UAV relies on maintaining a direct line of sight between the UAV and the fire zone. Obstacles or terrain may obstruct the line of sight and hinder the system's effectiveness.

[0009]Similarly, a wireless charging system for unmanned aerial vehicles (UAVs) is disclosed. The system consists of an electric energy transmission coil, an electric energy transmitting circuit, an electric energy receiving coil, an electric energy receiving circuit, a camera, a surface marker, and a ground satellite station. The electric energy transmission coil and circuit are installed at the satellite station, while the electric energy receiving coil, circuit, and camera are mounted on the UAV. The system enables wireless charging of the UAV by utilizing the electric energy transmission and receiving coils. The UAV can automatically continue its journey while being charged, increasing its flight range and overall practicality.

[0010]However, the described system focuses solely on wireless charging capabilities and does not mention autonomous navigation or response to adverse events. The wireless charging system does not mention integration with sensors for detecting adverse events such as fire, smoke, or gas. The wireless charging system does not mention communication capabilities or data analysis for real-time monitoring or transmission of sensor data.

[0011]There are various technical problems with the autonomous aerial vehicles in the prior art. Existing autonomous aerial vehicles have limited sensor integration. While they are able to detect adverse events, their response capabilities are limited: they lack the ability to autonomously navigate to specific destinations corresponding to detected events, and they are unable to gather additional sensor data from on-site sensors positioned within the monitoring area.

[0012]Therefore, there is a need for an improved autonomous aerial vehicle that overcomes the aforementioned drawbacks and limitations of existing systems.

SUMMARY

[0013]This summary is provided to introduce a selection of concepts, in a simple manner, which is further described in the detailed description of the disclosure. This summary is neither intended to identify key or essential inventive concepts of the subject matter nor to determine the scope of the disclosure.

[0014]In order to overcome the above deficiencies of the prior art, the present disclosure provides an autonomous aerial vehicle for determining one or more adverse events within a monitoring area.

[0015]In accordance with an embodiment of the present disclosure, the autonomous aerial vehicle is disclosed. The autonomous aerial vehicle comprises one or more sensors, a control unit, a docking unit, a docking station, and one or more cameras. The autonomous aerial vehicle is adapted to autonomously detect and respond to one or more adverse events within a monitoring area.

[0016]The one or more sensors are operatively positioned in the autonomous aerial vehicle. The one or more sensors are adapted to generate sensor data for determining the one or more adverse events within the monitoring area. The one or more sensors comprise a smoke detection sensor, a fire detection sensor, a motion sensor, a proximity sensor, a Global Positioning System (GPS) sensor, an acoustic sensor, and a gas detection sensor.

[0017]The smoke detection sensor is operatively positioned at one or more positions of the autonomous aerial vehicle. The smoke detection sensor is adapted to determine the presence of smoke within the monitoring area. The smoke detection sensor is selected from a group comprising one of an ionization smoke detector, photoelectric smoke detector, gas detector, and aspirating smoke detector. The fire detection sensor is operatively positioned beside the smoke detection sensor. The fire detection sensor is adapted to determine the presence of fire within the monitoring area. The fire detection sensor is selected from a group comprising one of a flame detector, heat detector, linear heat detector, spark detector, heat-activated smoke detector, and ember detector.

[0018]The motion sensor is operatively positioned at one or more positions of the autonomous aerial vehicle. The motion sensor is adapted to determine movement in the monitoring area to assist the autonomous aerial vehicle in navigation. The motion sensor is selected from a group comprising one of an infrared motion sensor, ultrasonic motion sensor, microwave motion sensor, passive infrared (PIR) motion sensor, and video-based motion sensor.

[0019]The proximity sensor is operatively positioned near the docking unit and extreme ends of the autonomous aerial vehicle. The proximity sensor is adapted to assist in establishing a secure connection with the docking station and enable collision avoidance, obstacle detection, and navigation in the monitoring area corresponding to the detected one or more adverse events. The proximity sensor is selected from a group comprising one of an ultrasonic proximity sensor, infrared proximity sensor, capacitive proximity sensor, magnetic proximity sensor, optical proximity sensor and LIDAR (Light Detection and Ranging) sensor.

[0020]The Global Positioning System (GPS) sensor is operatively positioned inside the autonomous aerial vehicle. The Global Positioning System (GPS) sensor is adapted to allow the autonomous aerial vehicle to determine a precise location and navigate accurately within the monitoring area. The acoustic sensor is operatively positioned near the one or more cameras. The acoustic sensor is adapted to capture audio signals within the monitoring area. The acoustic sensor is selected from a group comprising one of a microphone, sound pressure sensor, spectrum analyser, directional microphone, and audio recognition sensor.

[0021]The gas detection sensor is operatively positioned beside the smoke detection sensor. The gas detection sensor is adapted to determine hazardous gases within the monitoring area. The gas detection sensor is selected from a group comprising one of a semiconductor gas sensor, electrochemical gas sensor, or infrared gas sensor, adapted to detect and measure the concentration of hazardous gases within the monitored area. The hazardous gases comprise one of a carbon monoxide (CO) gas, nitrogen oxide (NOx) gas, methane (CH4) gas, hydrogen sulphide (H2S) gas, and volatile organic compounds (VOCs) gas.

[0022]The control unit is operatively connected to the one or more sensors. The control unit is adapted to process the sensor data from the one or more sensors for identifying one or more adverse events within the monitoring area. The control unit is adapted to navigate the autonomous aerial vehicle to a destination within the monitoring area corresponding to the detected one or more adverse events. The control unit is operatively connected to a communication device and a cloud database through a communication network. The control unit is configured to send alerts to the communication device associated with remote individuals and designated recipients. The control unit is configured to store the sensor data from the one or more sensors and the visual information captured by the one or more cameras in the cloud database. The control unit is configured to communicate with the docking station for engaging and disengaging the autonomous aerial vehicle.

[0023]The cloud database is configured with an artificial intelligence (AI) engine. The artificial intelligence (AI) engine is adapted to analyse and process the sensor data and visual information stored in the cloud database for identifying the severity of one or more adverse events. The control unit is further adapted to connect with external sensors operatively positioned in the monitored area for receiving additional sensor data to trigger the autonomous aerial vehicle.

[0024]The docking unit is operatively positioned in the autonomous aerial vehicle. The docking unit is adapted to engage with a docking station. The docking unit is adapted to disengage with the docking station upon identifying the one or more adverse events by the control unit. The docking station is operatively coupled to a fixed surface within the monitoring area, adapted to hold the autonomous aerial vehicle. The docking unit and the docking station are equipped with electromagnets and a magnetic coupling mechanism.

[0025]The magnetic coupling mechanism is operatively connected to the control unit and configured to enable magnetic forces between the docking unit and the docking station to hold the autonomous aerial vehicle against the docking station. The magnetic coupling mechanism is configured to disable the magnetic forces and enable the autonomous aerial vehicle to swiftly respond and navigate to the destination corresponding to the detected one or more adverse events. The magnetic coupling mechanism is configured to provide electric power for charging the autonomous aerial vehicle upon establishing a connection with the docking station.

[0026]The one or more cameras are operatively connected to the control unit. The one or more cameras are adapted to capture visual information within the monitoring area upon reaching the destination corresponding to the detected one or more adverse events. The one or more cameras comprise an Infrared (IR) camera and a fish-eye camera. The Infrared (IR) camera is adapted to detect infrared radiation emitted by objects, human beings, and animals in the monitoring area corresponding to the detected one or more adverse events and capture thermal information to identify heat signatures. The fish-eye camera is adapted to capture visual information from a wide angle in the monitoring area. The one or more cameras are configured to determine one or more attributes associated with the objects, human beings, and animals. The one or more attributes comprise at least one of a height, age, gender, temperature, emotion, and physical condition.

[0027]The autonomous aerial vehicle further comprises an electroacoustic transducer. The electroacoustic transducer is adapted to produce audible alerts to notify human beings and animals about the presence of one or more adverse events within the monitoring area. The autonomous aerial vehicle is configured in a flying-bee shape. The flying-bee comprises a first end, a first side, and a second side. The first end of the flying-bee is configured with the one or more cameras and the acoustic sensor. The first side and the second side of the flying-bee are configured with one or more propeller units and the proximity sensor.

[0028]In accordance with another embodiment of the present disclosure, a method for determining one or more adverse events by an autonomous aerial vehicle is disclosed. In the first step, the method includes generating, by one or more sensors, sensor data for determining the one or more adverse events within a monitoring area. In the next step, the method includes processing, by a control unit, the sensor data from the one or more sensors for identifying the one or more adverse events within the monitoring area. In the next step, the method includes communicating, by the control unit, with a docking station for disconnecting the autonomous aerial vehicle from the docking station upon identifying the one or more adverse events.

[0029]In the next step, the method includes navigating, by the control unit, the autonomous aerial vehicle to a destination within the monitoring area corresponding to the detected one or more adverse events. In the next step, the method includes capturing, by one or more cameras, visual information within the monitoring area upon reaching the destination corresponding to the detected one or more adverse events. In the next step, the method includes sending, by the control unit, the visual information and the sensor data to a communication device and a cloud database through a communication network.

[0030]To further clarify the advantages and features of the present invention, a more particular description of the invention will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the invention and are therefore not to be considered limiting in scope. The invention will be described and explained with additional specificity and detail with the appended figures.

BRIEF DESCRIPTION OF THE DRAWINGS

[0031]The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:

[0032]FIG. 1 illustrates an exemplary block diagram of an autonomous aerial vehicle, in accordance with an embodiment of the present disclosure;

[0033]FIG. 2A illustrates an exemplary top view of a flying-bee shaped autonomous aerial vehicle, in accordance with an embodiment of the present disclosure;

[0034]FIG. 2B illustrates an exemplary bottom view of a flying-bee shaped autonomous aerial vehicle, in accordance with an embodiment of the present disclosure;

[0035]FIG. 3 illustrates an exemplary schematic view of docking station attached to a fixed surface, in accordance with an embodiment of the present disclosure; and

[0036]FIG. 4 illustrates a flow chart of a method for determining one or more adverse events by an autonomous aerial vehicle, in accordance with an embodiment of the present disclosure.

[0037]Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the method steps, chemical compounds, equipment and parameters used herein may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.

DETAILED DESCRIPTION OF THE PRESENT INVENTION

[0038]For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.

[0039]The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more components, compounds, and ingredients preceded by "comprises... a" does not, without more constraints, preclude the existence of other components or compounds or ingredients or additional components. Appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.

[0040]Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.

[0041]In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.

[0042]Embodiments of the present disclosure relate to an autonomous aerial vehicle for determining one or more adverse events within a monitoring area.

[0043]As used herein the term “autonomous” refers to the capability of a system or vehicle to operate and make decisions without direct human intervention or control. In the context of the invention, an autonomous aerial vehicle refers to an aerial vehicle that is able to independently detect and respond to one or more adverse events within a monitoring area.

[0044]FIG. 1 illustrates an exemplary block diagram of an autonomous aerial vehicle, in accordance with an embodiment of the present disclosure.

[0045]According to an exemplary embodiment of the present disclosure, the autonomous aerial vehicle 100 comprises one or more sensors 102, a control unit 104, a docking unit 106, a docking station 108, and one or more cameras 110. The autonomous aerial vehicle 100 is adapted to autonomously detect and respond to the one or more adverse events within the monitoring area. In another exemplary embodiment, the one or more adverse events are, but not limited to, fire and explosions, smoke, gas leaks, movement, or intrusion within the monitoring area, accidents such as automobile vehicle accidents, industrial accidents, search, and rescue operations and the like.

[0046]In an exemplary embodiment, the one or more sensors 102 are operatively positioned in the autonomous aerial vehicle 100. The one or more sensors 102 are adapted to generate sensor data for determining the one or more adverse events within the monitoring area. The one or more sensors 102 comprise, but are not limited to, a smoke detection sensor 112, a fire detection sensor 114, a motion sensor 116, a proximity sensor 118, a Global Positioning System (GPS) sensor 120, an acoustic sensor 122, and a gas detection sensor 134. These one or more sensors 102 provide valuable data that enables the autonomous aerial vehicle 100 to identify different types of events and assess their severity.

[0047]In an exemplary embodiment, the smoke detection sensor 112 is strategically positioned at one or more locations on the autonomous aerial vehicle 100 to ensure effective coverage of the monitoring area. The smoke detection sensor 112 is adapted to identify the presence of smoke within the monitoring area, which is often an early indication of a fire or other hazardous conditions. The smoke detection sensor 112 is selected from a group comprising, but not limited to, one of an ionization smoke detector, photoelectric smoke detector, and aspirating smoke detector.

[0048]In an exemplary embodiment, the fire detection sensor 114 is positioned alongside the smoke detection sensor 112 and works in conjunction with it to identify the presence of fire within the monitoring area. The fire detection sensor 114 is selected from a group comprising, but not limited to, one of a flame detector, heat detector, linear heat detector, spark detector, heat-activated smoke detector, ember detector and the like. The smoke detection sensor 112 and the fire detection sensor 114 provide early warning and enable timely responses to mitigate potential fire-related hazards. Once the fire detection sensor 114 detects fire or fire-related phenomena, the control unit 104 analyses the collected data or signals to confirm the presence of the fire. The fire detection sensor 114 employs algorithms, signal processing techniques, or pattern recognition to distinguish genuine fire events from false alarms or environmental noise.
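
By way of illustration only, the following is a minimal sketch of such false-alarm filtering: a fire event is confirmed only when the flame-sensor reading stays above a threshold for several consecutive samples. The function name, threshold, and window size are assumptions for illustration and are not specified by the present disclosure.

```python
from collections import deque

# Illustrative sketch: confirm a fire event only when the flame-sensor reading
# stays above a threshold for several consecutive samples, so that a single
# spike (sunlight glint, sensor noise) does not raise a false alarm.
# The threshold and window size below are assumed values, not from the patent.

FLAME_THRESHOLD = 0.7      # normalised intensity above which flame is suspected
CONFIRMATION_WINDOW = 5    # consecutive samples required to confirm the event


def confirm_fire(readings, threshold=FLAME_THRESHOLD, window=CONFIRMATION_WINDOW):
    """Return True once `window` consecutive readings exceed `threshold`."""
    recent = deque(maxlen=window)
    for value in readings:
        recent.append(value > threshold)
        if len(recent) == window and all(recent):
            return True
    return False


if __name__ == "__main__":
    noisy_spike = [0.1, 0.9, 0.2, 0.1, 0.1, 0.2]          # single spike: no alarm
    sustained_fire = [0.2, 0.8, 0.85, 0.9, 0.88, 0.92]    # sustained rise: alarm
    print(confirm_fire(noisy_spike))      # False
    print(confirm_fire(sustained_fire))   # True
```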

[0049]In an exemplary embodiment, the motion sensor 116 is operatively positioned at one or more positions of the autonomous aerial vehicle 100 to effectively detect movement in the monitoring area. The motion sensor is adapted to provide input regarding the presence and patterns of motion and to assist the autonomous aerial vehicle 100 in navigating and responding to dynamic environmental conditions. The motion sensor 116 is selected from a group comprising, but not limited to, one of an infrared motion sensor, ultrasonic motion sensor, microwave motion sensor, passive infrared (PIR) motion sensor, video-based motion sensor and the like. By incorporating the motion sensor 116, the autonomous aerial vehicle 100 is able to perceive and respond to movement in the monitoring area. This capability enhances its navigation abilities and enables it to adapt to dynamic situations effectively. Once the motion sensor 116 detects motion or changes in the environment, the control unit 104 analyses the captured data or signals to determine the presence and characteristics of the movement. The motion sensor 116 may use various algorithms, such as pattern recognition, threshold detection, or signal processing techniques, to differentiate between desired motion and false triggers.

[0050]In an exemplary embodiment, the proximity sensor 118 is operatively positioned near the docking unit 106 and the extreme ends of the autonomous aerial vehicle 100. The proximity sensor 118 is adapted to assist in establishing a secure connection with the docking station 108. Further, the proximity sensor 118 is adapted to enable collision avoidance, obstacle detection, and navigation in the monitoring area corresponding to the detected one or more adverse events. The proximity sensor 118 is selected from a group comprising, but not limited to, one of an ultrasonic proximity sensor, infrared proximity sensor, capacitive proximity sensor, magnetic proximity sensor, optical proximity sensor and LIDAR (Light Detection and Ranging) sensor. The proximity sensor 118 is adapted to emit a signal or energy, which could be in the form of ultrasonic waves, infrared light, electromagnetic fields, or laser beams, depending on the sensor type. This emitted signal is used to detect the presence or proximity of objects. The proximity sensor 118 utilizes a sensing method to detect changes in the emitted signal caused by nearby objects. The specific sensing method differs based on the sensor type. Once the proximity sensor 118 detects a change in the emitted signal or a certain threshold is crossed, it provides an output signal or triggers a response. This output signal is used for initiating specific actions or providing information to other components of the system, such as collision avoidance, navigation, or establishing a secure connection.
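
Purely as an illustration of the threshold behaviour described above, the following sketch maps a single range reading to a simple navigation response; the safety distance and the command vocabulary are assumed values and not part of the claimed implementation.

```python
# Illustrative sketch: when a range reading (e.g. from an ultrasonic or LIDAR
# proximity sensor) drops below an assumed safety distance, emit an avoidance
# command; otherwise continue on the current heading. Values are placeholders.

SAFE_DISTANCE_M = 1.5   # assumed minimum clearance in metres


def avoidance_command(range_m: float, heading_deg: float) -> dict:
    """Map one proximity reading to a simple navigation response."""
    if range_m >= SAFE_DISTANCE_M:
        return {"action": "continue", "heading_deg": heading_deg}
    # Obstacle too close: stop forward motion and yaw away from it.
    return {"action": "avoid", "heading_deg": (heading_deg + 90) % 360,
            "reason": f"obstacle at {range_m:.2f} m"}


if __name__ == "__main__":
    print(avoidance_command(3.2, heading_deg=45))   # clear path: continue
    print(avoidance_command(0.6, heading_deg=45))   # obstacle: yaw away
```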

[0051]In an exemplary embodiment, the Global Positioning System (GPS) sensor 120 is operatively positioned inside the autonomous aerial vehicle 100. The GPS sensor 120 is adapted to receive signals from a network of satellites orbiting the Earth. The GPS sensor 120 allows the autonomous aerial vehicle 100 to determine its precise geographic location and navigate accurately within the monitoring area. The GPS sensor receives signals from multiple GPS satellites that are constantly transmitting signals containing precise timing and positioning information. Using the distances obtained from multiple satellites, the GPS sensor 120 applies trilateration algorithms to calculate the precise position of the autonomous aerial vehicle 100. Once the GPS sensor 120 determines the precise location of the autonomous aerial vehicle 100, it can use this information for navigation and tracking purposes. The GPS sensor 120 continuously receives signals from multiple satellites and updates the position information of the autonomous aerial vehicle 100 in real-time. This allows the autonomous aerial vehicle 100 to maintain accurate navigation even during movement or changes in the monitoring area.
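
To illustrate the geometric idea behind such positioning, the following is a deliberately simplified two-dimensional sketch that estimates a position from distances to three known reference points using a linearised least-squares solution. A real GPS fix is three-dimensional, uses at least four satellites, and additionally solves for the receiver clock bias; none of those details are shown here.

```python
import numpy as np

# Simplified 2-D trilateration: subtracting the range equation of the first
# reference point from the others yields a linear system in (x, y), which is
# solved by least squares. This only illustrates the geometry; it is not a
# GPS receiver implementation.


def trilaterate_2d(anchors, ranges):
    """anchors: list of (x, y) reference positions; ranges: measured distances."""
    (x1, y1), r1 = anchors[0], ranges[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        rows.append([2 * (xi - x1), 2 * (yi - y1)])
        rhs.append(r1**2 - ri**2 + xi**2 - x1**2 + yi**2 - y1**2)
    solution, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return solution  # estimated (x, y)


if __name__ == "__main__":
    anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
    ranges = [5.0, 65 ** 0.5, 45 ** 0.5]     # distances measured from point (3, 4)
    print(trilaterate_2d(anchors, ranges))   # approximately [3. 4.]
```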

[0052]In an exemplary embodiment, the acoustic sensor 122 is operatively positioned near the one or more cameras 110. The acoustic sensor 122 is adapted to capture audio signals within the monitoring area. The acoustic sensor 122 is positioned near the one or more cameras 110 to complement the visual data with audio information. The acoustic sensor 122 is selected from a group comprising one of a microphone, sound pressure sensor, spectrum analyser, directional microphone, and audio recognition sensor.

[0053]The acoustic sensor 122, which may be a microphone or any other suitable audio-capturing device, captures sound waves in its vicinity. When sound waves reach the acoustic sensor 122, they cause vibrations or variations in air pressure, which are then converted into electrical signals. The acoustic sensor 122 converts the analogue audio signals into digital format for further processing and analysis. This conversion may involve analogue-to-digital conversion techniques to represent the captured sound as a digital signal. The digital audio signal is processed by the control unit 104. This processing may involve various techniques, such as filtering, amplification, noise reduction, or equalization, to enhance the quality and extract relevant information from the captured audio. The processed audio signal is further analysed to extract meaningful data. This analysis may involve techniques such as spectral analysis, pattern recognition, or audio classification algorithms.

[0054]By analysing the audio, the acoustic sensor 122 is able to identify specific sounds or patterns of interest within the monitoring area. The audio data captured by the acoustic sensor 122 is correlated with the visual data obtained from the one or more cameras 110 positioned nearby. This integration allows for a more comprehensive understanding of the monitored area, as audio cues provide additional context and insights alongside visual information. The captured audio signals are utilized in various ways within the autonomous aerial vehicle 100. For instance, they may be used for event detection, such as detecting specific sounds related to adverse events or anomalies. Additionally, the audio data contributes to the situational awareness of human responders, helping to identify and assess potential risks or hazards in the monitoring area. This audio data enables the rescue teams to make informed decisions and take appropriate actions based on both visual and audio inputs.
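
The following is a minimal sketch of such audio analysis, assuming the digitised signal arrives as an array of normalised samples: a frame is flagged as being of interest when its loudness exceeds a threshold and its dominant frequency falls inside an assumed alarm-tone band. The sampling rate, threshold, and band edges are illustrative assumptions.

```python
import numpy as np

# Illustrative audio-frame analysis: compute loudness (RMS) and the dominant
# frequency via an FFT, then flag frames that are both loud and inside an
# assumed "alarm tone" band. All numeric values below are placeholders.

SAMPLE_RATE = 16_000            # Hz, assumed ADC sampling rate
RMS_THRESHOLD = 0.2             # normalised loudness threshold
ALARM_BAND_HZ = (2_000, 4_000)  # assumed band for siren/alarm-like tones


def analyse_frame(samples: np.ndarray) -> dict:
    """Return loudness, dominant frequency, and an 'of interest' flag."""
    rms = float(np.sqrt(np.mean(samples ** 2)))
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    dominant = float(freqs[int(np.argmax(spectrum))])
    in_band = ALARM_BAND_HZ[0] <= dominant <= ALARM_BAND_HZ[1]
    return {"rms": rms, "dominant_hz": dominant,
            "of_interest": rms > RMS_THRESHOLD and in_band}


if __name__ == "__main__":
    t = np.arange(0, 0.1, 1.0 / SAMPLE_RATE)
    alarm_tone = 0.5 * np.sin(2 * np.pi * 3_000 * t)   # loud 3 kHz tone
    quiet_hum = 0.05 * np.sin(2 * np.pi * 120 * t)     # quiet low-frequency hum
    print(analyse_frame(alarm_tone)["of_interest"])    # True
    print(analyse_frame(quiet_hum)["of_interest"])     # False
```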

[0055]In an exemplary embodiment, the gas detection sensor 134 is operatively positioned beside the smoke detection sensor 112. The gas detection sensor 134 is adapted to determine hazardous gases within the monitoring area. The gas detection sensor 134 is selected from a group comprising, but not limited to, one of a semiconductor gas sensor, electrochemical gas sensor, infrared gas sensor and the like. The hazardous gases comprise, but are not limited to, one of a carbon monoxide (CO) gas, nitrogen oxide (NOx) gas, methane (CH4) gas, hydrogen sulphide (H2S) gas, and volatile organic compounds (VOCs) gas. These gases are commonly associated with fire, pollution, industrial processes, or other potentially dangerous situations.

[0056]Once the gas detection sensor 134 identifies the presence of a hazardous gas, it measures the concentration of that gas in the surrounding environment. The concentration is typically represented in parts per million (ppm) or percentage (%), indicating the relative amount of the gas present. When the gas detection sensor detects hazardous gas concentrations above a predefined threshold, it can trigger an alarm or alert system. This alert can be transmitted to a command centre, and human operators, integrated with an overall monitoring system, or used to activate specific response actions. The autonomous aerial vehicle 100 is able to take appropriate measures such as adjusting its flight path, notifying emergency services and the command centre, or providing real-time information to responders on the ground.
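
The threshold-and-alert behaviour described above can be sketched as follows; the ppm limits shown are illustrative placeholders only and are not regulatory values or values taken from the present disclosure.

```python
# Illustrative sketch: each gas has an assumed ppm limit, and a reading above
# its limit produces an alert record that could be forwarded to the command
# centre. The limits are placeholders, not specified or regulatory values.

PPM_LIMITS = {
    "CO": 50,      # carbon monoxide
    "NOx": 25,     # nitrogen oxides
    "CH4": 1000,   # methane
    "H2S": 10,     # hydrogen sulphide
    "VOC": 200,    # volatile organic compounds
}


def check_gas_levels(readings_ppm: dict) -> list:
    """Return an alert entry for every gas whose reading exceeds its limit."""
    alerts = []
    for gas, value in readings_ppm.items():
        limit = PPM_LIMITS.get(gas)
        if limit is not None and value > limit:
            alerts.append({"gas": gas, "ppm": value, "limit": limit,
                           "action": "notify_command_centre"})
    return alerts


if __name__ == "__main__":
    print(check_gas_levels({"CO": 12, "CH4": 300}))   # [] - all within limits
    print(check_gas_levels({"CO": 85, "H2S": 4}))     # CO alert only
```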

[0057]In an exemplary embodiment, the control unit 104 is operatively connected to the one or more sensors 102 and responsible for processing the sensor data received from these one or more sensors 102. The control unit 104 analyses the data to identify and determine the presence of one or more adverse events within the monitoring area. The specific algorithms and logic implemented in the control unit 104 allow for accurate event detection based on the information provided by the one or more sensors 102. Once an adverse event is detected, the control unit 104 takes charge of navigating the autonomous aerial vehicle 100 to a destination within the monitoring area that corresponds to the detected adverse event. The control unit 104 uses the sensor data and other relevant information to determine the appropriate destination for the autonomous aerial vehicle 100. This navigation capability enables the autonomous aerial vehicle 100 to quickly respond to adverse events and reach the designated areas for further investigation or action.

[0058]The control unit 104 is connected to a communication device 124 and a cloud database 126 through a communication network 128. The control unit 104 with the communication network 128 is adapted to send alerts and notifications to remote individuals and designated recipients. These alerts provide real-time information about the detected adverse events, their severity, and any necessary actions to be taken. The communication device 124 is selected from a group comprising, but not limited to, a smartphone, tablet, computer, or any other suitable device capable of receiving and displaying alerts. The communication network 128 comprises, but not limited to, an open area network, a personal area network (PAN), a local area network (LAN), a wireless local area network (WLAN), a wide area network (WAN), a virtual private network (VPN), campus area network (CAN), local interconnect network (LIN), wireless fidelity (Wi-Fi), Ethernet, Bluetooth, Near Field Communication (NFC), global system for mobile communications (GSM) and the like.

[0059]The "remote individuals" refer to people who are located at a distance from the monitoring area or the autonomous aerial vehicle 100. These individuals may include, but are not limited to, emergency responders, command centre authorities, security personnel, or any other relevant parties who need to be notified about the detected adverse events. The "designated recipients" are specific individuals or entities who have been predetermined or assigned to receive alerts and notifications regarding the detected one or more adverse events. These recipients could be, but are not limited to, supervisors, managers, decision-makers, family members, police, emergency services, ambulance services, insurance companies, or any other individuals who need to be informed about the ongoing situation in real-time. The designated recipients may have direct responsibility or authority to take necessary actions based on the received information. The purpose of notifying both remote individuals and designated recipients is to ensure that the relevant stakeholders are promptly informed about the one or more adverse events detected by the autonomous aerial vehicle 100. This allows for quick response, coordinated actions, and effective management of the situation.
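
As a purely illustrative sketch of how such an alert might be composed and dispatched, the snippet below builds a JSON payload and hands it to a placeholder transport function; the field names and the send_alert() stub are assumptions, not the claimed communication protocol.

```python
import json
from datetime import datetime, timezone

# Illustrative alert composition for remote individuals and designated
# recipients. The payload fields and the transport stub are assumptions; a
# real system would push the message over the communication network 128
# (e.g. GSM, Wi-Fi) to the communication device 124.


def build_alert(event_type: str, severity: str, location: tuple, recipients: list) -> str:
    payload = {
        "event": event_type,
        "severity": severity,
        "location": {"lat": location[0], "lon": location[1]},
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "recipients": recipients,
    }
    return json.dumps(payload)


def send_alert(payload: str) -> None:
    # Placeholder transport: print instead of transmitting over a real network.
    print("ALERT ->", payload)


if __name__ == "__main__":
    send_alert(build_alert("fire", "high", (12.9716, 77.5946),
                           ["command_centre", "fire_brigade"]))
```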

[0060]In an exemplary embodiment, the autonomous aerial vehicle 100 includes a docking unit 106 that is positioned within the autonomous aerial vehicle 100 itself. Accordingly, the docking unit 106 is also referred to as the autonomous aerial vehicle (AAV) docking unit 106. This docking unit 106 is designed to engage with the docking station 108. The docking station 108 is operatively coupled to a fixed surface within the monitoring area. The purpose of the docking unit 106 and the docking station 108 is to securely hold the autonomous aerial vehicle 100 when it is idle or being charged.

[0061]While idle or during charging, the autonomous aerial vehicle 100 operates in a monitoring state. This means that the one or more sensors 102 installed on the autonomous aerial vehicle 100 are actively monitoring the designated monitoring area to detect and determine any potential one or more adverse events. While the autonomous aerial vehicle 100 is in the monitoring state, the one or more sensors 102 continuously generate sensor data by observing their surroundings. The sensor data includes information about various environmental factors such as smoke, fire, motion, proximity, gas concentration, and other relevant parameters depending on the specific sensors installed. The control unit 104 of the autonomous aerial vehicle 100 receives and processes the sensor data from the one or more sensors 102. The control unit 104 analyses the data to identify and assess any potential adverse events occurring within the monitoring area. During the charging process, the autonomous aerial vehicle 100 remains vigilant and continues its monitoring activities to ensure the safety and security of the monitored area. The charging process does not interfere with the ability of the autonomous aerial vehicle 100 to detect and respond to adverse events, as the one or more sensors 102 remain operational and actively contribute to the monitoring efforts.

[0062]The docking unit 106 and the docking station 108 are equipped with electromagnets and a magnetic coupling mechanism. The magnetic coupling mechanism is connected to the control unit 104 of the autonomous aerial vehicle 100. The control unit 104 is configured to enable the generation of magnetic forces between the docking unit 106 and the docking station 108, which holds the autonomous aerial vehicle 100 firmly in place against the docking station 108. This ensures a stable connection between the autonomous aerial vehicle 100 and the docking station 108. However, when the control unit 104 identifies the one or more adverse events within the monitoring area, it disengages the docking unit 106 from the docking station 108. This is achieved by disabling the magnetic forces generated by the magnetic coupling mechanism. By virtue of the automatic release of the magnetic connection, the autonomous aerial vehicle 100 is able to swiftly respond and navigate to the destination corresponding to the detected one or more adverse events.

[0063]Additionally, when the autonomous aerial vehicle 100 is connected to the docking station 108, the magnetic coupling mechanism also serves the purpose of providing electric power for charging the autonomous aerial vehicle 100. The connection between the docking unit 106 and the docking station 108 enables the transfer of electric power, ensuring that batteries in the autonomous aerial vehicle 100 are charged and ready for operation.
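
A minimal sketch of this engage/release behaviour is given below as a small state holder: the coupling stays energised (holding and charging the vehicle) until an adverse event is reported, at which point it is released and the vehicle is handed off to navigation. Class and method names are illustrative assumptions.

```python
# Illustrative docking behaviour: keep the electromagnetic coupling engaged
# while docked and charging; release it when an adverse event is reported so
# the vehicle can depart, and re-engage it on return. Names are placeholders.

class DockingController:
    def __init__(self):
        self.docked = True
        self.magnet_engaged = True
        self.charging = True

    def on_adverse_event(self, destination: tuple) -> str:
        """Release the magnetic coupling and hand off to navigation."""
        if self.docked:
            self.magnet_engaged = False   # disable the magnetic holding force
            self.charging = False         # stop drawing power from the station
            self.docked = False
            return f"released; navigating to {destination}"
        return "already airborne"

    def on_return(self) -> str:
        """Re-engage the coupling once the vehicle aligns with the station."""
        self.docked = True
        self.magnet_engaged = True
        self.charging = True
        return "docked and charging"


if __name__ == "__main__":
    dock = DockingController()
    print(dock.on_adverse_event((12.97, 77.59)))
    print(dock.on_return())
```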

[0064]In one exemplary embodiment, the docking station 108 is secured in an enclosure. The enclosure is adapted to safeguard the docking station 108 and the autonomous aerial vehicle 100 from external elements such as weather conditions (e.g., rain, snow, and extreme temperatures), dust, debris, and physical impacts. The enclosure is constructed using durable materials that ensure structural integrity and longevity. The enclosure is configured with features such as a sealed design, reinforced walls, and a secure locking mechanism that may be controlled by the control unit 104 to prevent unauthorized access. Additionally, the enclosure may include ventilation systems or cooling mechanisms to regulate the temperature inside and prevent overheating of the docking station 108 while charging. The enclosure provides a safe and controlled environment for the docking station 108 and the autonomous aerial vehicle 100, ensuring optimal functioning and protection against potential damage or interference.

[0065]In an exemplary embodiment, the autonomous aerial vehicle 100 is equipped with the one or more cameras 110 that are connected to the control unit 104. These one or more cameras 110 serve the purpose of capturing visual information within the monitoring area once the autonomous aerial vehicle 100 reaches the destination corresponding to the detected one or more adverse events. The one or more cameras 110 included in this embodiment are an Infrared (IR) camera 130 and a fish-eye camera 132.

[0066]The Infrared (IR) camera 130 is adapted to detect infrared radiation emitted by objects, human beings, and animals in the monitoring area. By capturing thermal information, the IR camera 130 is able to identify heat signatures, which are indicative of potential hazards or anomalies related to the detected adverse events. The fish-eye camera 132, on the other hand, is capable of capturing visual information from a wide angle in the monitoring area. This wide-angle view allows for broader coverage and a more comprehensive understanding of the situation at the destination. The wide-angle range is between, but not limited to, 180° and 360°. This wide-angle coverage allows the one or more cameras 110 to capture a broad perspective of the surroundings, minimizing blind spots and providing a comprehensive view of the monitoring area.

[0067]The one or more cameras 110 are configured to determine various attributes associated with the objects, human beings, and animals present in the monitoring area, along with a thermal image indicating their temperature. These attributes may include height, age, gender, temperature, emotion, and physical condition. By analysing the visual information captured by the cameras, the control unit 104 is able to extract and assess these attributes, providing additional contextual information about the situation.
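
As an illustration of how thermal frames might be reduced to simple heat-signature counts, the sketch below thresholds an array of per-pixel temperatures; the temperature bands and the coarse labelling rule are assumptions only.

```python
import numpy as np

# Illustrative heat-signature screening of an IR frame: count pixels inside an
# assumed body-temperature band and pixels above an assumed fire threshold.
# The temperature values are placeholders, not specified by the disclosure.

HUMAN_BODY_RANGE_C = (30.0, 40.0)   # assumed surface-temperature band for people
FIRE_THRESHOLD_C = 60.0             # assumed lower bound for flames/hot objects


def classify_hot_pixels(thermal_frame_c: np.ndarray) -> dict:
    """Count pixels in the body-temperature band and above the fire threshold."""
    body = np.logical_and(thermal_frame_c >= HUMAN_BODY_RANGE_C[0],
                          thermal_frame_c <= HUMAN_BODY_RANGE_C[1])
    fire = thermal_frame_c >= FIRE_THRESHOLD_C
    return {"possible_person_pixels": int(body.sum()),
            "possible_fire_pixels": int(fire.sum())}


if __name__ == "__main__":
    frame = np.full((4, 4), 22.0)      # ambient background
    frame[1, 1] = 36.5                 # warm spot in the body-temperature band
    frame[3, 3] = 250.0                # very hot spot, likely flame
    print(classify_hot_pixels(frame))  # 1 possible person pixel, 1 fire pixel
```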

[0068]The control unit 104 is responsible for storing the sensor data from the one or more sensors 102 and the visual information captured by the one or more cameras 110 in the cloud database 126. This storage allows for easy access and retrieval of the data for further analysis, historical reference, or collaboration with external stakeholders. Storing the data in the cloud database 126 ensures its availability and reliability and facilitates seamless integration with other systems or applications. The cloud database 126 is configured with an artificial intelligence (AI) engine 136. The control unit 104 leverages this AI engine 136 to analyse and process the stored sensor data and visual information. The AI engine 136 applies advanced algorithms and machine learning techniques to extract meaningful insights, identify patterns, and assess the severity of the detected adverse events. This AI-enabled analysis enhances the decision-making capabilities of the autonomous aerial vehicle 100 and enables more efficient and effective responses to the identified one or more adverse events.
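
By way of illustration, a cloud-side analysis could reduce the stored observations to a severity score along the lines sketched below; the feature names, weights, and score bands are assumptions, as the disclosure does not specify a particular model.

```python
# Illustrative severity scoring from boolean observations derived from sensor
# data and visual detections. Weights and bands are placeholder assumptions.

SEVERITY_WEIGHTS = {
    "fire_detected": 0.4,
    "smoke_detected": 0.2,
    "gas_over_limit": 0.2,
    "people_present": 0.2,
}


def severity_score(observations: dict) -> tuple:
    """Combine boolean observations into a score and a coarse severity label."""
    score = round(sum(weight for key, weight in SEVERITY_WEIGHTS.items()
                      if observations.get(key, False)), 2)
    if score >= 0.6:
        label = "critical"
    elif score >= 0.3:
        label = "elevated"
    else:
        label = "low"
    return score, label


if __name__ == "__main__":
    print(severity_score({"fire_detected": True, "people_present": True}))  # (0.6, 'critical')
    print(severity_score({"smoke_detected": True}))                         # (0.2, 'low')
```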

[0069]The control unit 104 is able to connect with external sensors that are operatively positioned in the monitored area. This integration allows the autonomous aerial vehicle 100 to receive additional sensor data from these external sensors. The control unit 104 is able to use this data to trigger specific actions or enhance its understanding of the situation. By collaborating with external sensors, the autonomous aerial vehicle 100 is able to gather comprehensive information about the one or more adverse events and make more informed decisions.

[0070]The control unit 104 also facilitates communication with the docking station 108 for engaging and disengaging the autonomous aerial vehicle 100. This interaction ensures seamless docking of the autonomous aerial vehicle 100 for charging, maintenance, or other purposes. The control unit 104 coordinates the docking process and ensures a secure and efficient connection between the autonomous aerial vehicle 100 and the docking station 108.

[0071]The autonomous aerial vehicle 100 further comprises an electroacoustic transducer. The electroacoustic transducer in the autonomous aerial vehicle 100 serves the purpose of producing audible alerts to notify human beings and animals about the presence of one or more adverse events within the monitoring area. This electroacoustic transducer is designed to convert electrical signals into sound waves, allowing it to generate audible alerts or alarm signals. When the control unit 104 of the autonomous aerial vehicle 100 detects the occurrence of the one or more adverse events, such as fires, smoke, or other hazards, it triggers the electroacoustic transducer to produce sound signals. These audible alerts serve as a warning mechanism, notifying human beings in the vicinity about the presence of the adverse events and prompting them to take necessary actions or precautions. The specific characteristics of the audible alerts, such as the sound pattern, intensity, and duration, are designed to effectively capture the attention of human beings and animals in the monitoring area.

[0072]The electroacoustic transducer is adapted to communicate between human beings and remote individuals or command centres. By incorporating the electroacoustic transducer, remote individuals or command centres are able to send instructions or audio signals to the autonomous aerial vehicle 100, allowing for real-time communication and coordination during emergencies or operational tasks.

[0073]In an exemplary embodiment, the autonomous aerial vehicle 100 is made of carbon fibre. Carbon fibre is known for its exceptional strength-to-weight ratio. It provides high tensile strength, stiffness, and resistance to impacts, making the autonomous aerial vehicle 100 robust and capable of withstanding rugged terrain, extreme temperatures, and harsh weather conditions. The carbon fibre is significantly lighter than traditional materials such as aluminium or steel. The lightweight nature of the material allows for increased payload capacity, extended flight times, and improved manoeuvrability of the autonomous aerial vehicle 100. It also reduces energy consumption and enhances overall efficiency.

[0074]The autonomous aerial vehicle 100 with carbon fibre is highly resistant to corrosion, making it ideal for applications in environments with moisture, rain, or snow. It does not rust or degrade when exposed to water, preventing structural damage, and ensuring the longevity of autonomous aerial vehicle 100. The carbon fibre allows for versatile and complex designs, enabling aerodynamic shapes and optimized structures for the autonomous aerial vehicle 100. This flexibility in design enhances flight performance, stability, and manoeuvrability, especially in challenging terrain or adverse weather conditions.

[0075]FIG. 2A illustrates an exemplary top view of a flying-bee 200 shaped autonomous aerial vehicle 100, in accordance with an embodiment of the present disclosure.

[0076]FIG. 2B illustrates an exemplary bottom view of a flying-bee 200 shaped autonomous aerial vehicle 100, in accordance with an embodiment of the present disclosure.

[0077]According to another exemplary embodiment of the present disclosure, the shape of the autonomous aerial vehicle 100 is configured in a flying-bee 200 shape, which consists of a first end 202, a first side 204, and a second side 206. The first end 202 of the flying-bee 200 is equipped with the one or more cameras 110, positioned in place of a pair of eyes of the flying-bee 200, and with the acoustic sensor 122. The one or more cameras 110 are responsible for capturing visual information within the monitoring area. The acoustic sensor 122 at the first end 202 is used to capture audio signals. This configuration allows the autonomous aerial vehicle 100 to gather both visual and auditory data related to the monitored environment. Further, at the first end 202, in between the one or more cameras 110, the IR camera 130 is positioned to detect infrared radiation emitted by objects, human beings, and animals in the monitoring area corresponding to the detected one or more adverse events and to capture thermal information to identify heat signatures.

[0078]The first end 202 is configured with the electroacoustic transducer 212. The electroacoustic transducer 212 serves as an interaction device between human beings and remote individuals or command centres. It enables bidirectional communication, allowing remote individuals or command centres to send instructions or audio signals to the autonomous aerial vehicle 100. This feature enhances the capabilities of the autonomous aerial vehicle 100 by facilitating real-time communication and coordination during emergencies or operational tasks.

[0079]Additionally, the flying-bee 200 comprises a plurality of antennas 210. The plurality of antennas 210 is designed to establish a secure and reliable connection with the communication network 128 and the cloud database 126. They enable the autonomous aerial vehicle 100 to access and exchange data with external systems, such as remote monitoring stations or centralized databases. This connectivity ensures that the autonomous aerial vehicle 100 is able to transmit and receive information efficiently, enhancing its functionality and enabling seamless integration into the existing communication infrastructure.

[0080]In an exemplary embodiment, the first side 204 and the second side 206 of the flying-bee 200, are configured with one or more propeller units 208 and the proximity sensor 118. The one or more propeller units 208 are adapted to generate the necessary lift and propulsion to enable the autonomous flight of the autonomous aerial vehicle 100. The proximity sensor 118, on the other hand, assists in establishing the secure connection with the docking station 108 and enables collision avoidance, obstacle detection, and navigation in the monitoring area.

[0081]FIG. 3 illustrates an exemplary schematic view 300 of docking station 108 attached to the fixed surface 302, in accordance with an embodiment of the present disclosure.

[0082]According to another exemplary embodiment of the present disclosure, the docking station 108 is attached to the fixed surface 302. The fixed surface 302 for installing the docking station 108 refers to a stable and secure location within the monitoring area where the docking station 108 is permanently installed. This fixed surface 302 could be a building rooftop, a platform, a pole, or any other suitable structure that can provide a stable base for the docking station 108. The docking station 108 is designed to securely hold and support the autonomous aerial vehicle 100 when it is not actively navigating or responding to adverse events. The docking station 108 serves as a designated monitoring position and charging point for the autonomous aerial vehicle 100. The fixed surface 302 needs to be chosen strategically to ensure optimal coverage of the monitoring area and convenient access for the autonomous aerial vehicle 100. It should be located in a position that allows the autonomous aerial vehicle 100 to reach different parts of the monitoring area quickly and easily when needed.

[0083]When the autonomous aerial vehicle 100 returns to the monitoring area and requires charging or rest, it aligns itself with the docking station 108 and engages with it. The docking unit 106 of the autonomous aerial vehicle 100 and the docking station 108 are equipped with electromagnets and the magnetic coupling mechanism, which enable the secure connection between the docking unit 106 of the autonomous aerial vehicle 100 and the docking station 108.

[0084]The installation of the docking station 108 on the fixed surface 302 requires careful planning and adherence to safety guidelines. The docking station 108 is operatively coupled to the fixed surface 302 by appropriate mounting hardware. The mounting hardware may comprise, but is not limited to, brackets, anchors, screws, adhesive mounts, mounting plates, boxes, or other fasteners. In one embodiment, an L-shaped bracket 304 is used, where a pair of docking stations 108 is installed on the top and bottom of the L-shaped bracket 304. This facilitates the charging of two autonomous aerial vehicles 100 at once.

[0085]FIG. 4 illustrates a flow chart of a method 400 for determining one or more adverse events by an autonomous aerial vehicle, in accordance with an embodiment of the present disclosure.

[0086]According to another exemplary embodiment of the present disclosure, at step 402, the method 400 includes the one or more sensors within the autonomous aerial vehicle generating sensor data related to the monitoring area for determining the one or more adverse events. In another embodiment, the sensor data is generated by the external sensors to trigger the autonomous aerial vehicle. These one or more sensors and the external sensors capture information such as smoke detection, fire detection, motion detection, proximity detection, GPS location, acoustic signals, and gas detection, among others.

[0087]At step 404, the method 400 includes the control unit of the autonomous aerial vehicle receiving the sensor data and processing it to identify the one or more adverse events within the monitoring area. The control unit analyses the sensor data to detect and categorize events such as fires, smoke, movement, proximity obstacles, hazardous gases and the like. At step 406, upon identifying the one or more adverse events, the control unit establishes communication with the docking station. This communication allows the autonomous aerial vehicle to disconnect itself from the docking station, indicating its readiness to respond to the detected events.

[0088]At step 408, the method 400 includes the control unit determining the destination within the monitoring area corresponding to the detected adverse events. It autonomously navigates the autonomous aerial vehicle to this destination, using its flight capabilities and navigation systems. At step 410, upon reaching the destination, the one or more cameras installed on the autonomous aerial vehicle capture visual information within the monitoring area. The one or more cameras include infrared cameras for detecting heat signatures and fish-eye cameras for wide-angle views. Further, the control unit sends both the captured visual information and the sensor data to a communication device, such as a remote control station or a mobile device, as well as to the cloud database. This data transmission occurs over the communication network, ensuring that the information reaches designated recipients and is securely stored for further analysis or retrieval.
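
The overall flow of steps 402 to 410 can be summarised in the following sketch, in which every function body is a placeholder standing in for the subsystems described above; the names and data shapes are assumptions, not the claimed implementation.

```python
# End-to-end sketch of method 400: sense, identify an event, undock, navigate,
# capture imagery, and transmit. All values and names below are placeholders.

def generate_sensor_data():
    # Step 402: the one or more sensors (or external sensors) produce data.
    return {"smoke": True, "gas_ppm": {"CO": 85}, "gps": (12.9716, 77.5946)}


def identify_adverse_event(sensor_data):
    # Step 404: the control unit processes the sensor data.
    return "fire_or_smoke" if sensor_data.get("smoke") else None


def run_mission():
    sensor_data = generate_sensor_data()
    event = identify_adverse_event(sensor_data)
    if event is None:
        return "monitoring"
    print("step 406: disengaging from the docking station")
    destination = sensor_data["gps"]
    print(f"step 408: navigating to {destination}")
    visual = {"ir_frame": "<thermal image>", "fisheye_frame": "<wide-angle image>"}
    print("step 410: transmitting to communication device and cloud database:",
          {"event": event, "sensor_data": sensor_data, "visual": visual})
    return "responded"


if __name__ == "__main__":
    print(run_mission())
```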

[0089]Numerous advantages of the present disclosure may be apparent from the discussion above. In accordance with the present disclosure, the autonomous aerial vehicle determines one or more adverse events within a monitoring area. The use of one or more sensors, such as smoke detectors, fire detectors, motion sensors, and gas detectors, allows the autonomous aerial vehicle to detect adverse events promptly. This enables timely response and mitigation measures, reducing potential damages.

[0090]The control unit of the autonomous aerial vehicle processes sensor data and navigates the autonomous aerial vehicle to the destination corresponding to the detected adverse events. This ensures a swift and targeted response, optimizing the efficiency of emergency operations or surveillance tasks. The inclusion of cameras, including infrared and fish-eye cameras, enables the capture of visual information within the monitoring area. This visual data enhances situational awareness, aiding in decision-making and assessment of the severity of adverse events.

[0091]The electroacoustic transducer facilitates communication between the autonomous aerial vehicle and remote individuals or command centres. This real-time communication enables the exchange of instructions, audio signals, and coordination during emergencies or operational tasks. The control unit sends the sensor data and visual information to a cloud database. This allows for centralized storage, analysis, and retrieval of the collected data. The integration of an artificial intelligence (AI) engine in the cloud database further enhances the analysis capabilities, aiding in identifying the severity of adverse events.
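
Purely as an illustration of the severity analysis attributed to the AI engine, the placeholder below maps detected events and an optional peak IR temperature to a coarse severity label; a real deployment would more likely rely on a trained model, and all weights and cut-offs here are assumptions.

```python
# Illustrative placeholder for the cloud-side severity analysis; the weights
# and thresholds are assumptions, not part of the disclosed AI engine.

from typing import Optional

SEVERITY_WEIGHTS = {"fire": 5, "hazardous_gas": 4, "smoke": 3, "intrusion": 2}


def severity_score(events: list[str], max_heat_c: Optional[float] = None) -> str:
    """Map detected events (plus an optional IR peak temperature) to a severity label."""
    score = sum(SEVERITY_WEIGHTS.get(e, 1) for e in events)
    if max_heat_c is not None and max_heat_c > 150:
        score += 3                        # strong heat signature from the IR camera
    if score >= 7:
        return "critical"
    if score >= 4:
        return "high"
    return "moderate" if score > 0 else "none"
```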

[0092]The autonomous aerial vehicle can be deployed in various monitoring areas, adapting to different environments and scenarios. Its ability to navigate autonomously, detect multiple types of adverse events, and capture visual information makes it a versatile tool for applications such as emergency response, surveillance, and environmental monitoring.

[0093]For instance, deploying the autonomous aerial vehicle for border security offers significant advantages in enhancing border surveillance and reducing the need for a large number of border personnel. The borders are equipped with external sensors such as perimeter Infrared (IR)/Radio Frequency (RF) detection systems that sense intrusions or breaches along the border and detect movements or disturbances within the monitored area. When an intrusion is detected by the perimeter IR/RF detection system, it triggers the release of the autonomous aerial vehicle from a nearby tower or base station. This release may be automated based on predefined parameters or manually activated by border security personnel.
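
The following is a minimal sketch, under assumptions, of the automated trigger path: perimeter alerts arrive on an in-process queue and each alert releases a docked vehicle toward the affected zone. The alert format and the release_vehicle hook are hypothetical; an actual system would use its own messaging layer.

```python
# Assumed sketch of the automated trigger path: perimeter IR/RF alerts arrive
# on a queue, and each alert releases a docked vehicle toward the alerted zone.

import queue
import threading


def perimeter_trigger_loop(alerts: "queue.Queue[dict]", release_vehicle) -> None:
    """Block on perimeter alerts and dispatch a vehicle for each one."""
    while True:
        alert = alerts.get()                 # e.g. {"zone": "north-3", "kind": "ir_breach"}
        if alert is None:                    # sentinel to stop the loop
            break
        release_vehicle(alert["zone"])       # hand off to the docking/flight control


# Example wiring: a background thread watches the queue while the detection
# system (not shown here) pushes alerts into it.
if __name__ == "__main__":
    alerts: "queue.Queue[dict]" = queue.Queue()
    worker = threading.Thread(
        target=perimeter_trigger_loop,
        args=(alerts, lambda zone: print(f"releasing vehicle toward {zone}")),
        daemon=True,
    )
    worker.start()
    alerts.put({"zone": "north-3", "kind": "ir_breach"})
    alerts.put(None)                         # stop after the demo alert
    worker.join()
```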

[0094]The autonomous aerial vehicle, equipped with IR and night vision cameras, is deployed to the location of the detected intrusion. It can quickly reach the scene, providing a bird's-eye view and capturing detailed visual information. The captured pictures and video footage from the IR and night vision cameras are transmitted in real-time to the border security command centre or designated personnel. This enables an instant visual assessment of the intruder's location, movement, and any accompanying activities. By utilizing autonomous aerial vehicles, the need for a large number of border personnel to physically man the entire border is reduced. The aerial vehicles can cover large areas efficiently and effectively, providing enhanced surveillance capabilities.

[0095]The autonomous aerial vehicle's swift response to intrusions allows for timely intervention and appropriate deployment of border security resources. It enhances the overall border security posture by providing real-time situational awareness and enabling a faster and more targeted response to potential threats. Deploying autonomous aerial vehicles for border surveillance may optimize costs and resources associated with border security operations. It allows for the efficient allocation of personnel, reducing the need for constant physical presence along the entire border while maintaining effective surveillance capabilities.

[0096]For instance, autonomous aerial vehicles may patrol and monitor the large premises of high-security industrial and defence establishments. Equipped with sensors such as cameras, motion detectors, and thermal imaging technology, the autonomous aerial vehicle is able to detect and identify any unauthorized individuals or suspicious activities across various locations within the premises. When an intrusion or suspicious activity is detected, the autonomous aerial vehicle can swiftly respond to the location. Its real-time monitoring capabilities enable security personnel to assess the situation promptly and take appropriate actions to address any potential threats. The autonomous aerial vehicle is able to cover large areas and provide a comprehensive view of the premises from an elevated perspective. This enables more precise intrusion detection, as it can monitor areas that may be challenging to access for ground-based security systems. It can also identify the exact location of the intruder, allowing security personnel to respond effectively. The presence of an autonomous aerial vehicle itself acts as a deterrent to potential intruders. Knowing that the premises are under constant aerial surveillance can discourage unauthorized individuals from attempting to breach security and help maintain a secure environment. The autonomous aerial vehicle may be integrated with existing security systems, such as access control systems, perimeter alarms, and video surveillance networks. This integration enables seamless information sharing and enhances the overall security infrastructure of the establishment.

[0097]Similarly, deploying autonomous aerial vehicles in automotive vehicles can provide valuable assistance during vehicular accidents and enhance safety and communication capabilities. When an impact or accident occurs, the autonomous aerial vehicle is released from the docking station upon being triggered by external sensors such as impact sensors. The autonomous aerial vehicle is able to quickly survey the accident site and capture still images and video footage of the automotive vehicle and its surroundings. This allows for a rapid assessment of the accident scene, providing crucial information for traffic police, ambulance services, insurance claims, emergency response, and accident investigations.

[0098]The aerial vehicle can capture detailed visual evidence of the accident, including the position of the vehicles involved, damages, and road conditions. This documentation can serve as valuable evidence for insurance claims, ensuring a fair and accurate assessment of the incident and expediting the claims process. By sending sensor data to designated emergency contacts, such as emergency services or pre-programmed next of kin, the autonomous aerial vehicle provides real-time visual information about the accident. This aids emergency responders in understanding the severity of the situation and making informed decisions regarding the required resources and assistance.

[0099]The autonomous aerial vehicle can establish a communication link between the occupants of the automotive vehicle and designated contacts. This allows passengers or drivers to communicate their status, location, and any additional information about the accident, even if they are unable to use traditional communication methods. To ensure safety-related redundancy, the autonomous aerial vehicle may be mounted at multiple locations on the automotive vehicle, such as on the top or bottom, in a secured container that meets environmental, reliability, and safety requirements. This redundancy ensures that the autonomous aerial vehicle can still be deployed even if one mounting location is obstructed or damaged. It is crucial to ensure that privacy concerns and data security are addressed when deploying autonomous aerial vehicles in automotive vehicles. Robust data encryption, secure communication protocols, and strict access controls are implemented to protect sensitive information captured by the aerial vehicle. The stored sensor data is assigned a unique identification number and can be accessed only by authorised persons. The autonomous aerial vehicle may also be integrated with the existing safety systems of the automotive vehicle, such as airbags and crash detection sensors. This integration ensures that the deployment of the autonomous aerial vehicle is synchronized with the activation of these safety mechanisms, further enhancing the overall safety and response capabilities of the automotive vehicle.
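
As a toy illustration of the unique identification number and restricted access described above, the sketch below attaches a UUID and an integrity tag to each stored record and returns it only to an authorised requester; the key handling and the authorised-user list are placeholders, not the claimed security design.

```python
# Illustrative handling of the unique record identifier and access check
# mentioned above; the key management shown here is a toy placeholder.

import hashlib
import hmac
import json
import uuid

SECRET_KEY = b"replace-with-managed-key"          # assumed to come from a key store

AUTHORISED_USERS = {"traffic_police", "insurer", "emergency_services"}


def store_record(sensor_data: dict) -> dict:
    """Attach a unique ID and an integrity tag before the record is uploaded."""
    record_id = str(uuid.uuid4())
    body = json.dumps(sensor_data, sort_keys=True).encode("utf-8")
    tag = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return {"id": record_id, "data": sensor_data, "hmac": tag}


def fetch_record(record: dict, requester: str) -> dict:
    """Return the record only to authorised persons (simplified access control)."""
    if requester not in AUTHORISED_USERS:
        raise PermissionError(f"{requester} is not authorised to access this record")
    return record
```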

[0100]While specific language has been used to describe the invention, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.

[0101]The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and is not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts need to be necessarily performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.

Claims: I/We Claim:
1. An autonomous aerial vehicle (100), comprising:
one or more sensors (102) operatively positioned in the autonomous aerial vehicle (100), adapted to generate sensor data for determining one or more adverse events within a monitoring area;
a control unit (104) operatively connected to the one or more sensors (102), adapted to process the sensor data from the one or more sensors (102) for identifying one or more adverse events within the monitoring area,
the control unit (104) adapted to navigate the autonomous aerial vehicle (100) to a destination within the monitoring area corresponding to the detected one or more adverse events;
a docking unit (106) operatively positioned in the autonomous aerial vehicle (100), adapted to engage with a docking station (108),
the docking unit (106) adapted to disengage with the docking station (108) upon identifying the one or more adverse events by the control unit (104);
the docking station (108) operatively coupled to a fixed surface (302) within the monitoring area, adapted to hold the autonomous aerial vehicle (100); and
one or more cameras (110) operatively connected to the control unit (104), adapted to capture visual information within the monitoring area upon reaching the destination corresponding to the detected one or more adverse events,
whereby the autonomous aerial vehicle (100) is adapted to autonomously detect and respond to the one or more adverse events within the monitoring area.
2. The autonomous aerial vehicle (100) as claimed in claim 1, wherein the one or more sensors (102) comprises
a smoke detection sensor (112) operatively positioned at one or more positions of the autonomous aerial vehicle (100), adapted to determine presence of smoke within the monitoring area;
a fire detection sensor (114) operatively positioned beside the smoke detection sensor (112), adapted to determine presence of fire within the monitoring area;
a motion sensor (116) operatively positioned at one or more positions of the autonomous aerial vehicle (100), adapted to determine movement in the monitoring area to assist the autonomous aerial vehicle (100) in navigation;
a proximity sensor (118) operatively positioned near the docking unit (106) and extreme ends of the autonomous aerial vehicle (100), adapted to assist in establishing a secure connection with the docking station (108) and enable collision avoidance, obstacle detection, and navigation in the monitoring area corresponding to the detected one or more adverse events;
a Global Positioning System (GPS) sensor (120) operatively positioned inside the autonomous aerial vehicle (100), adapted to allow the autonomous aerial vehicle (100) to determine a precise location and navigate accurately within the monitoring area;
an acoustic sensor (122) operatively positioned near the one or more cameras (110), adapted to capture audio signals within the monitoring area; and
a gas detection sensor (134) operatively positioned beside the smoke detection sensor (112), adapted to determine hazardous gases within the monitoring area.
3. The autonomous aerial vehicle (100) as claimed in claim 2, wherein the smoke detection sensor (112) is selected from a group comprising one of an ionization smoke detector, photoelectric smoke detector, and aspirating smoke detector.
4. The autonomous aerial vehicle (100) as claimed in claim 2, wherein the fire detection sensor (114) is selected from a group comprising one of a flame detector, heat detector, linear heat detector, spark detector, heat-activated smoke detector, and ember detector.
5. The autonomous aerial vehicle (100) as claimed in claim 2, wherein the motion sensor (116) is selected from a group comprising one of an infrared motion sensor, ultrasonic motion sensor, microwave motion sensor, passive infrared (PIR) motion sensor, and video-based motion sensor.
6. The autonomous aerial vehicle (100) as claimed in claim 2, wherein the proximity sensor (118) is selected from a group comprising one of an ultrasonic proximity sensor, infrared proximity sensor, capacitive proximity sensor, magnetic proximity sensor, optical proximity sensor and LIDAR (Light Detection and Ranging) sensor.
7. The autonomous aerial vehicle (100) as claimed in claim 2, wherein the acoustic sensor (122) is selected from a group comprising one of a microphone, sound pressure sensor, spectrum analyser, directional microphone, and audio recognition sensor.
8. The autonomous aerial vehicle (100) as claimed in claim 2, wherein the gas detection sensor (134) is selected from a group comprising one of a semiconductor gas sensor, electrochemical gas sensor, or infrared gas sensor, adapted to detect and measure the concentration of hazardous gases within the monitored area,
the hazardous gases comprise one of a carbon monoxide (CO) gas, nitrogen oxide (NOx) gas, methane (CH4) gas, hydrogen sulphide (H2S) gas, and volatile organic compounds (VOCs) gas.
9. The autonomous aerial vehicle (100) as claimed in claim 1, wherein the control unit (104) is operatively connected to a communication device (124) and a cloud database (126) through a communication network (128),
the control unit (104) is configured to
send alerts to the communication device (124) associated with remote individuals and designated recipients;
store the sensor data from the one or more sensors (102) and the visual information captured by the one or more cameras (110) in the cloud database (126);
communicate with the docking station (108) for engaging and disengaging the autonomous aerial vehicle (100).
10. The autonomous aerial vehicle (100) as claimed in claim 9, wherein the cloud database (126) is configured with an artificial intelligence (AI) engine (136),
the artificial intelligence (AI) engine (136) is adapted to analyse and process the sensor data and visual information stored in the cloud database (126) for identifying a severity of one or more adverse events.
11. The autonomous aerial vehicle (100) as claimed in claim 1, wherein the control unit (104) is further adapted to connect with external sensors operatively positioned in the monitored area for receiving additional sensor data to trigger the autonomous aerial vehicle (100).
12. The autonomous aerial vehicle (100) as claimed in claim 1, wherein the one or more cameras (110) comprises
an Infrared (IR) camera (130) adapted to detect infrared radiation emitted by objects, human beings, and animals in the monitoring area corresponding to the detected one or more adverse events and capture thermal information to identify heat signatures; and
a fish-eye camera (132) adapted to capture visual information from a wide angle in the monitoring area.
13. The autonomous aerial vehicle (100) as claimed in claim 1, wherein the one or more cameras (110) are configured to determine one or more attributes associated with the objects, human beings, and animals,
the one or more attributes comprises at least one of a height, age, gender, temperature, emotion, and physical condition.
14. The autonomous aerial vehicle (100) as claimed in claim 1, wherein the docking unit (106) and the docking station (108) are equipped with electromagnets and a magnetic coupling mechanism,
the magnetic coupling mechanism is operatively connected to the control unit (104), configured to
enable magnetic forces between the docking unit (106) and the docking station (108) to hold the autonomous aerial vehicle (100) against the docking station (108);
disable the magnetic forces and enable the autonomous aerial vehicle (100) to swiftly respond and navigate to the destination corresponding to the detected one or more adverse events; and
provide electric power for charging the autonomous aerial vehicle (100) upon establishing a connection with the docking station (108).
15. The autonomous aerial vehicle (100) as claimed in claim 1, further comprises an electroacoustic transducer (212),
the electroacoustic transducer (212) is adapted to produce audible alerts to notify human beings and animals about the presence of one or more adverse events within the monitoring area.
16. The autonomous aerial vehicle (100) as claimed in claim 1, wherein the shape of the autonomous aerial vehicle (100) is configured in a flying-bee (200) shape,
the flying-bee (200) comprises a first end (202), a first side (204), and a second side (206),
the first end (202) of the flying-bee (200) configured with the one or more cameras (110), and the acoustic sensor (122); and
the first side (204) and the second side (206) of the flying-bee (200) configured with one or more propeller units (208), and the proximity sensor (118).
17. A method for determining one or more adverse events by an autonomous aerial vehicle, comprising:
generating, by one or more sensors (102), sensor data for determining the one or more adverse events within a monitoring area;
processing, by a control unit (104), the sensor data from the one or more sensors (102) for identifying one or more adverse events within the monitoring area;
communicating, by the control unit (104), with a docking station (108) for disconnecting the autonomous aerial vehicle (100) from the docking station (108) upon identifying the one or more adverse events;
navigating, by the control unit (104), the autonomous aerial vehicle (100) to a destination within the monitoring area corresponding to the detected one or more adverse events; and
capturing, by one or more cameras (110), visual information within the monitoring area upon reaching the destination corresponding to the detected one or more adverse events.
18. The method as claimed in claim 17, further comprising sending, by the control unit (104), the visual information and the sensor data to a communication device (124) and a cloud database (126) through a communication network (128).

Dated this 10th day of July, 2023

Vidya Bhaskar Singh Nandiyal
Patent Agent No.: IN/PA-2912
IPExcel Services Private Limited
AGENT FOR APPLICANTS

Documents

Application Documents

# Name Date
1 202341046543-STATEMENT OF UNDERTAKING (FORM 3) [11-07-2023(online)].pdf 2023-07-11
2 202341046543-FORM FOR STARTUP [11-07-2023(online)].pdf 2023-07-11
3 202341046543-FORM FOR SMALL ENTITY(FORM-28) [11-07-2023(online)].pdf 2023-07-11
4 202341046543-FORM 1 [11-07-2023(online)].pdf 2023-07-11
5 202341046543-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [11-07-2023(online)].pdf 2023-07-11
6 202341046543-EVIDENCE FOR REGISTRATION UNDER SSI [11-07-2023(online)].pdf 2023-07-11
7 202341046543-DRAWINGS [11-07-2023(online)].pdf 2023-07-11
8 202341046543-DECLARATION OF INVENTORSHIP (FORM 5) [11-07-2023(online)].pdf 2023-07-11
9 202341046543-COMPLETE SPECIFICATION [11-07-2023(online)].pdf 2023-07-11
10 202341046543-Proof of Right [24-07-2023(online)].pdf 2023-07-24
11 202341046543-FORM-26 [24-07-2023(online)].pdf 2023-07-24
12 202341046543-STARTUP [13-10-2023(online)].pdf 2023-10-13
13 202341046543-FORM28 [13-10-2023(online)].pdf 2023-10-13
14 202341046543-FORM-9 [13-10-2023(online)].pdf 2023-10-13
15 202341046543-FORM 18A [13-10-2023(online)].pdf 2023-10-13
16 202341046543-FER.pdf 2025-02-26
17 202341046543-FORM 3 [28-02-2025(online)].pdf 2025-02-28
18 202341046543-Proof of Right [23-07-2025(online)].pdf 2025-07-23
19 202341046543-RELEVANT DOCUMENTS [08-08-2025(online)].pdf 2025-08-08
20 202341046543-PETITION UNDER RULE 137 [08-08-2025(online)].pdf 2025-08-08
21 202341046543-OTHERS [08-08-2025(online)].pdf 2025-08-08
22 202341046543-FER_SER_REPLY [08-08-2025(online)].pdf 2025-08-08
23 202341046543-CLAIMS [08-08-2025(online)].pdf 2025-08-08
24 202341046543-US(14)-HearingNotice-(HearingDate-24-11-2025).pdf 2025-10-31
25 202341046543-FORM-26 [17-11-2025(online)].pdf 2025-11-17
26 202341046543-Correspondence to notify the Controller [17-11-2025(online)].pdf 2025-11-17

Search Strategy

1 202341046543_SearchStrategyNew_E_SearchHistoryE_13-02-2025.pdf