Abstract: An autonomous emergency response device for forest fire detection and evacuation comprises a mobile housing 101 placed in a forest area, a sensing module 102 paired with an imaging unit 103 for detecting smoke, flames, gas, humidity, and temperature variations, multiple primary motorized omnidirectional wheels 106 that move the housing 101, a fire suppression unit 107 including a multi-section chamber 107a storing water and fire extinguishing foam and connected to an electronic valve 107b arranged with each section via a hose 107c for selectively releasing water or foam, an evacuation module 108 having an inverted L-shaped telescopic rod 108c with a cylindrical expandable body 108a for evacuating humans/wildlife to a safe zone, a bioacoustic module comprising a microphone array 109 for species-specific audio signature recognition, and multiple directional bioacoustic emitters 110 to broadcast species-specific distress or guidance signals for wildlife evacuation.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to an autonomous emergency response device for forest fire detection and evacuation that is capable of detecting and extinguishing fires in a forest area without requiring human intervention while effectively preventing damage to wildlife or civilian settlements in nearby areas.
BACKGROUND OF THE INVENTION
[0002] Forest fires are typically caused by a combination of dry conditions, high temperatures, and the presence of flammable materials such as dry grass, leaves, and wood. They are also ignited by human activities such as unattended campfires and discarded cigarettes, or by natural causes such as lightning strikes. Once started, these fires spread rapidly, fueled by wind and dense vegetation, making them extremely difficult to control. Forest fires pose a severe threat to wildlife, destroying vast areas of habitat, endangering human lives, and causing significant economic losses.
[0003] Traditional methods for extinguishing forest fires typically involve human firefighters using water hoses, fire retardants dropped by planes, and firebreaks created by clearing vegetation to stop fire spread. Evacuation usually relies on manual alerts through sirens, radios, or emergency personnel guiding people and animals to safety. However, these methods have several flaws. They depend heavily on human intervention, which is slow and risky in rapidly spreading fires. Limited accessibility in dense forests hampers quick firefighting and evacuation efforts. Moreover, wildlife evacuation is largely uncoordinated, leading to animal injuries or deaths.
[0004] FR2656802A1 discloses automatic devices for combating forest fires. A device according to the invention comprises a source of pressurized water, for example a water tank, connected by a motorized valve to bottles of compressed gas. It comprises a network of buried pipes which supply a plurality of telescopic columns comprising coaxial sliding tubes carrying a water diffuser. At rest, the sliding tubes are folded into a casing driven into the ground. The upper end of the casings is connected to the pipe network. The outer sliding tube of each column carries at the periphery of its lower end a grooved guide ring. One application is the fire protection of wooded areas such as parks, rest areas or the environment of a residence.
[0005] Conventionally, many devices disclosed in the prior art help in extinguishing fires in forest areas; however, existing devices often fail to detect the presence of a fire in the forest, which leads to failure of timely rescue. Additionally, existing devices are unable to detect humans or wildlife in the forest, which endangers them, and extinguish the fire with water only, which leads to operational failure in the case of dense fires.
[0006] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device capable of extinguishing fires in a forest area without any manual guidance while effectively detecting fire types. Additionally, the developed device should be able to evacuate humans or wildlife from fire zones to a safe zone in order to provide proper safety in fire hazards, and continuously alert rescue operators to the emergency situation.
OBJECTS OF THE INVENTION
[0007] An object of the present invention is to develop a device that is capable of detecting fire hazards and extinguishing the fire in time in order to prevent damage to vegetation, thereby minimizing the risk of fire spreading to surrounding areas.
[0008] Another object of the present invention is to develop a device that efficiently detects the presence of humans or wildlife and evacuates them from fire zones to a safe zone in order to prevent injuries and enhance the safety of humans and animals in forest fire situations.
[0009] Yet another object of the present invention is to develop a device that helps in avoiding any obstacle in its path during the fire extinguishing operation, thereby enhancing the efficiency of the operation and significantly reducing the time to rescue humans and wildlife.
[0010] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0011] The present invention relates to an autonomous emergency response device for forest fire detection and evacuation that is designed to effectively monitor fire hazards in a forest area and helps to extinguish the fire in time while effectively recognizing wildlife species and evacuating them to a safe zone away from the fire.
[0012] According to an aspect of the present invention, an autonomous emergency response device for forest fire detection and evacuation comprises a mobile housing placed within a forest area; a sensing module including a flame sensor, smoke sensor, humidity sensor, gas sensor, and temperature sensor, paired with an artificial intelligence-based imaging unit for detecting smoke, flames, gas, humidity, and temperature variations; a ball and socket joint installed between the imaging unit and a link positioned over the housing for providing multi-directional motion to the imaging unit for appropriately scanning the area and detecting fire type based on different material flames, including but not limited to grass and wood, using image recognition protocols; an inbuilt microcontroller paired with the sensing module and imaging unit for detecting a nearby hazardous fire condition based on the detected presence of smoke, flame, reduced humidity levels, and elevated gas levels and temperature; a plurality of primary motorized omnidirectional wheels positioned underneath the housing for maneuvering and positioning the housing near the area with fire conditions; a pair of robotic arms coupled with a flap installed on the housing for clearing obstructions in the path of the housing detected via the imaging unit and an obstacle detection module; a fire suppression unit installed on the housing, which includes a multi-section chamber storing water and fire extinguishing foam in separate sections, and an electronic valve connected with each of the sections via a hose for selective release of water or foam based on fire type and severity as detected via the imaging unit, with discharge of water and foam controlled based on real-time inputs from the sensing module detecting temperature spikes, flames, or toxic gases to suppress fires; and an evacuation module for safe evacuation of a human/wildlife to a pre-saved safe zone, which comprises a plurality of proximity sensors arranged on the housing to detect the presence of a human/wildlife, and a cylindrical expandable body installed with the housing via an inverted L-shaped telescopic rod that retracts to enclose the human/wildlife.
[0013] The device further comprises a drawer arrangement to expand the body based on dimensions detected via the imaging unit and an IR (infrared) sensor installed on the housing; a pair of extendable bars installed on opposite inner sides of the body, each configured with a plate, extending to join the plates via electromagnets for forming a continuous platform for seating of the human/wildlife; a pair of sliders coupled with the bars for translating the bars to lift the plates on detection of a human/wildlife over the plates via a weight sensor embedded with each of the plates; a plurality of secondary motorized omnidirectional wheels coupled with the body and actuated in sync with the primary wheels to maneuver the body along with the housing for evacuating the human/wildlife to the safe zone; a bioacoustic module comprising a microphone array coupled with the microcontroller for species-specific audio signature recognition; a plurality of directional bioacoustic emitters installed on the housing to broadcast species-specific distress or guidance signals for wildlife evacuation on recognition of an audio signature; a communication module to transmit data and alerts to remote emergency control centers regarding the detected hazardous fire condition along with the location tracked via a GPS module installed on the body, which also helps in navigating the housing along predefined or dynamically optimized routes in forested areas using GPS tracking; a plurality of ultrasonic and PIR sensors integrated with the housing to assess wildlife response and adapt the acoustic signals; and a self-deploying communication relay unit to establish a dynamic communication mesh network in remote or damaged forest regions having reduced signal strengths.
[0014] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of an autonomous emergency response device for forest fire detection and evacuation; and
Figure 2 illustrates an inner view of a cylindrical expandable body associated with the device.
DETAILED DESCRIPTION OF THE INVENTION
[0016] The present invention relates to an autonomous emergency response device for forest fire detection and evacuation that helps in extinguishing fires while recognizing and protecting wildlife species without any human effort, and enhances network connectivity for conveying alerts to rescue operators located at a distance.
[0017] Referring to Figure 1 and Figure 2, an isometric view of an autonomous emergency response device for forest fire detection and evacuation and an inner view of a cylindrical expandable body associated with the device are illustrated, respectively, comprising a mobile housing 101; a sensing module 102 installed on the housing 101 and comprising a flame sensor 102a, smoke sensor 102b, humidity sensor 102c, gas sensor 102d, and temperature sensor 102e; an artificial intelligence-based imaging unit 103 installed over the housing 101 via a link 104 having a ball and socket joint 105; and a plurality of primary motorized omnidirectional wheels 106 installed underneath the housing 101.
[0018] Figures 1 and 2 further illustrate a fire suppression unit 107 installed on the housing 101 and comprising a multi-section chamber 107a connected to an electronic valve 107b by a hose 107c; an evacuation module 108 installed on the housing 101 and comprising a cylindrical expandable body 108a integrated with a drawer arrangement 108b and installed with the housing 101 via an inverted L-shaped telescopic rod 108c, where a pair of extendable bars 201 installed on opposite inner sides of the body 108a are each attached with a plate 202 integrated with electromagnets 203, and a pair of sliders 204 is coupled with the bars; a plurality of secondary motorized omnidirectional wheels 108d installed underneath the body 108a; a microphone array 109 and a plurality of directional bioacoustic emitters 110 installed with the housing 101; an antenna 111 mounted on a telescopic pole 112 arranged on the housing 101; and a pair of robotic arms 113 with a flap 114 installed on the housing 101.
[0019] The device disclosed herein comprises a mobile housing 101 which is placed within a forest area. The mobile housing 101 is constructed using lightweight, fire-retardant composite materials, such as aluminum alloy frames with reinforced carbon fiber panels. These materials provide high structural strength while keeping the unit lightweight for efficient mobility.
[0020] An inbuilt microcontroller is integrated within the housing 101. The microcontroller, used herein, is preferably an Arduino microcontroller. The Arduino microcontroller used herein controls the overall functionality of the linked components. The microcontroller is integrated with multiple machine learning protocols and models for performing complex data analysis.
[0021] A sensing module 102 is integrated with the housing 101 for detecting smoke, flames, gas, humidity, and temperature variations. The sensing module 102 here includes a flame sensor 102a, smoke sensor 102b, humidity sensor 102c, gas sensor 102d, and temperature sensor 102e for effectively detecting fire hazards.
[0022] The flame sensor 102a mentioned above detects specific wavelengths of light emitted by a flame, typically in the ultraviolet (UV) or infrared (IR) spectrum. When a flame is present, the flame sensor's photodiode or phototransistor registers rapid changes in light intensity and produces an analog or digital signal. An internal signal processing circuit filters out false positives, such as sunlight or ambient light, to ensure accurate flame detection. The flame sensor 102a continuously monitors the surrounding area. Upon detection of a valid flame signature, the flame sensor 102a sends the processed signal output to the microcontroller.
[0023] Simultaneously, the smoke sensor 102b detects smoke by using the principle of light scattering. Inside the smoke sensor's chamber, an infrared LED emits a light beam across a dark, open space. Under normal conditions, the light passes straight and does not hit the photodiode placed at an angle. When smoke enters the chamber, the particles scatter the light, causing the scattered light to strike the photodiode. This generates an electric signal proportional to the smoke density. The smoke sensor's internal circuit processes this signal to remove background noise and ensure accurate detection. Once smoke is detected beyond a preset threshold, the smoke sensor 102b sends a digital signal to the microcontroller.
[0024] The humidity sensor 102c, typically a capacitive type, measures the relative humidity (RH) in the surrounding air by detecting changes in electrical capacitance. The humidity sensor 102c consists of a thin hygroscopic polymer film sandwiched between two conductive plates. As moisture in the air is absorbed by the film, its dielectric constant changes, altering the humidity sensor's capacitance. This change is converted into a corresponding analog voltage by an internal signal processing circuit. The humidity sensor 102c continuously monitors environmental humidity. The processed data is then fed to the microcontroller.
[0025] Also, the gas sensor 102d simultaneously detects the gas level in the air by using a metal oxide semiconductor (typically SnO₂) as the sensing element. In clean air, the gas sensor's surface adsorbs oxygen, which traps free electrons, creating a high resistance. When a target gas (like CO, CH₄, or smoke gases) is present, it reacts with the adsorbed oxygen, releasing the trapped electrons back into the gas sensor's material. This reduces the resistance of the gas sensor 102d. The change in resistance is proportional to the gas concentration and is converted into an analog voltage signal by the gas sensor's internal circuit. The gas sensor 102d performs continuous sampling and maintains stable operation through an internal heater element that ensures proper operating temperature. The output signal is sent to the microcontroller's analog input, which interprets the gas data.
[0026] Simultaneously, the temperature sensor 102e detects ambient temperature using a Negative Temperature Coefficient (NTC) thermistor, which is composed of temperature-sensitive ceramic materials. As the surrounding temperature increases, the resistance of the thermistor decreases in a predictable manner. The temperature sensor 102e is integrated into a voltage divider circuit, where this resistance change causes a proportional change in output voltage. This analog voltage is continuously monitored and fed into the analog-to-digital converter (ADC) of the microcontroller. The microcontroller converts this signal into an accurate temperature reading, which is processed in real time.
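For concreteness, a minimal Arduino-style C++ sketch of the sensing-and-fusion behaviour described in paragraphs [0021] to [0026] follows. The pin assignments, thresholds, and NTC divider values are assumptions chosen for illustration, and the Beta-equation temperature conversion stands in for whatever calibration the actual firmware applies; this is a sketch, not the patented firmware.

```cpp
// Illustrative sensing-module read-out and hazard fusion (assumed pins,
// thresholds, and divider topology; not part of the disclosure).
#include <Arduino.h>
#include <math.h>

const int FLAME_PIN = 2;   // digital output of flame sensor 102a
const int SMOKE_PIN = 3;   // digital output of smoke sensor 102b
const int HUM_PIN   = A0;  // analog output of humidity sensor 102c
const int GAS_PIN   = A1;  // analog output of gas sensor 102d
const int NTC_PIN   = A2;  // divider node of temperature sensor 102e

// Assumed divider: 5V -- 10k fixed resistor -- (A2) -- NTC -- GND.
const float R_FIXED = 10000.0, R0 = 10000.0, T0_K = 298.15, BETA = 3950.0;

float readTemperatureC() {
  int adc = analogRead(NTC_PIN);                          // 0..1023
  adc = constrain(adc, 1, 1022);                          // avoid div by zero
  float rNtc = R_FIXED * adc / (1023.0 - adc);            // divider equation
  float tK = 1.0 / (1.0 / T0_K + log(rNtc / R0) / BETA);  // Beta equation
  return tK - 273.15;
}

void setup() {
  pinMode(FLAME_PIN, INPUT);
  pinMode(SMOKE_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  bool flame = digitalRead(FLAME_PIN) == HIGH;
  bool smoke = digitalRead(SMOKE_PIN) == HIGH;
  float humidityPct = analogRead(HUM_PIN) * 100.0 / 1023.0; // assumed linear
  int gasRaw = analogRead(GAS_PIN);                         // raw MOS reading
  float tempC = readTemperatureC();

  // Fusion rule per the description: flame or smoke, corroborated by low
  // humidity, elevated gas levels, or a temperature spike.
  bool hazard = (flame || smoke) &&
                (humidityPct < 25.0 || gasRaw > 600 || tempC > 60.0);
  if (hazard) Serial.println(F("Hazardous fire condition detected"));
  delay(500);
}
```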
[0027] An artificial intelligence-based imaging unit 103 is installed over the housing 101 via a link 104 having a ball and socket joint 105 for providing multi-directional motion to the imaging unit 103 for appropriately scanning the area, and works in sync with the sensing module 102 for detecting fire type based on different material flames. In sync with the sensing module 102, the microcontroller sends a signal to activate the imaging unit 103 and the ball and socket joint 105.
[0028] The ball and socket joint 105 provides 360-degree rotation to the imaging unit 103 to orient it at any desired angle for effective monitoring. The ball and socket joint 105 comprises a spherical ball securely housed within a matching socket, allowing smooth multidirectional movement. The ball, connected to a DC motor, rotates freely within the socket, delivering the required rotational motion to the imaging unit 103. Upon actuation, the motor drives the rotation by applying torque to the ball joint, enabling the imaging unit 103 to move to the desired angle and scan different areas around the housing 101 for precise detection of fire hazards.
[0029] The artificial intelligence-based imaging unit 103 here includes a high-resolution camera. As the ball and socket joint 105 provides movement to the imaging unit 103, the camera captures real-time images of the surroundings of the housing 101 from multiple angles. These images are transmitted to a processor which processes them using image recognition protocols and analyses key flame characteristics such as color, shape, and intensity to detect and classify fire types. The protocols compare the visual data against pre-trained flame profiles and cross-reference these findings with data from the sensing module 102, such as temperature spikes, smoke density, and gas concentrations, to validate fire presence and assess its severity and location accurately. The microcontroller processes combined inputs from the imaging unit 103 and other sensors to estimate the fire's location relative to the housing 101 by analysing spatial data and sensor activation patterns, enabling precise targeting for suppression and evacuation efforts.
[0030] A plurality of primary motorized omnidirectional wheels 106 is installed underneath the housing 101 for maneuvering and positioning the housing 101 near the area with fire conditions. Each primary motorized omnidirectional wheel 106 comprises an electric motor coupled with a precision gear unit. Upon detecting a fire, the microcontroller sends a signal to actuate the primary motorized omnidirectional wheels 106. Upon actuation, the motors transmit torque directly to the wheel hubs, enabling smooth and controlled motion of the housing 101. The motors receive continuous control signals from the microcontroller, which adjusts the power output to ensure accurate movement and position the housing 101 in detected fire zones.
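The disclosure does not specify the wheel geometry or drive mixing; as an illustrative sketch, the standard inverse kinematics for a four-wheel mecanum-style omnidirectional base shows how a commanded body velocity could be converted into the individual wheel speeds the microcontroller sends to the motors (all dimensions hypothetical).

```cpp
// Illustrative velocity mixing for four mecanum-style omnidirectional
// wheels 106 (assumed layout and dimensions; not from the source).
#include <cstdio>

struct WheelSpeeds { double fl, fr, rl, rr; };  // wheel angular velocities

// vx: forward m/s, vy: sideways m/s, wz: rotation rad/s,
// lx/ly: half wheelbase/track in metres, r: wheel radius in metres.
WheelSpeeds mix(double vx, double vy, double wz,
                double lx = 0.20, double ly = 0.15, double r = 0.05) {
  double k = lx + ly;
  return { (vx - vy - k * wz) / r,    // front-left
           (vx + vy + k * wz) / r,    // front-right
           (vx + vy - k * wz) / r,    // rear-left
           (vx - vy + k * wz) / r };  // rear-right
}

int main() {
  WheelSpeeds s = mix(0.5, 0.2, 0.0);  // translate diagonally toward the fire
  std::printf("fl=%.2f fr=%.2f rl=%.2f rr=%.2f rad/s\n", s.fl, s.fr, s.rl, s.rr);
}
```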
[0031] A pair of robotic arms 113 is installed on the housing 101, and each robotic arm 113 is arranged with a flap 114 for clearing obstructions in the path of the housing 101, which are detected via the imaging unit 103 and an obstacle detection module. During the movement of the housing 101 towards the fire area, the microcontroller sends a signal to activate the obstacle detection module, which continuously detects obstacles in the path of the housing 101 in sync with the imaging unit 103.
[0032] The obstacle detection module mentioned above detects obstacles in the path with the help of data from the imaging unit 103. During the movement of the housing 101 towards the fire area, the imaging unit 103 continuously captures real-time images of the housing's path. The images are processed by the processor using obstacle detection protocols, which analyse the visual data to identify objects blocking the path. These protocols detect obstacles by recognizing shapes, edges, and contrasts within the environment, distinguishing potential obstructions from natural surroundings. The microcontroller then calculates the position and size of the detected obstacles relative to the housing 101.
[0033] In an exemplary embodiment of the present invention, the obstacle detection module detects obstacles in the path by utilizing real-time data from the imaging unit 103. As the housing 101 moves toward the fire-affected area, the imaging unit 103 continuously captures live images of the terrain ahead. For instance, when a fallen branch lies across the path, the processor analyses the captured images using obstacle detection protocols that identify the branch by recognizing its unique shape, edges, and contrast against the forest floor. These protocols differentiate the branch from natural elements such as leaves or shadows. The microcontroller then calculates the precise position and size of the branch relative to the housing 101.
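One conventional way to recover an obstacle's distance and size from an image bounding box, consistent with the fallen-branch example above, is the pinhole-camera (similar triangles) model; the focal length and box measurements below are hypothetical values chosen for illustration.

```cpp
// Illustrative pinhole-camera estimate of an obstacle's distance and
// width from its bounding box (all numeric values are assumptions).
#include <cstdio>

// distance = focalPx * realHeightM / boxHeightPx (similar triangles)
double distanceM(double focalPx, double realHeightM, double boxHeightPx) {
  return focalPx * realHeightM / boxHeightPx;
}

// realWidth = boxWidthPx * distance / focalPx (inverse projection)
double widthM(double focalPx, double distM, double boxWidthPx) {
  return boxWidthPx * distM / focalPx;
}

int main() {
  double f = 800.0;                     // assumed focal length in pixels
  double d = distanceM(f, 0.15, 60.0);  // branch ~15 cm thick, 60 px tall
  double w = widthM(f, d, 480.0);       // branch spans 480 px in the frame
  std::printf("branch ~%.1f m ahead, ~%.1f m wide\n", d, w);  // 2.0 m, 1.2 m
}
```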
[0034] Upon detection of the obstacles in the path, the microcontroller sends a signal to actuate the robotic arms 113. The robotic arms 113 are driven by a series of electric motors that actuate interconnected mechanical linkages controlling extension, rotation, and precise positioning of the arms 113. These motors convert electrical signals into rotational motion, which is transmitted through the linkages to maneuver the arms 113 and flaps 114 accurately. This coordinated movement enables the robotic arms 113 to extend outward, position the flaps 114 against the obstacle, and apply controlled force to push, lift, or move debris away from the housing’s trajectory, ensuring smooth navigation and minimizing delays during fire response.
[0035] As the housing 101 is positioned in the fire zone, the microcontroller commands the imaging unit 103 to detect the severity and type of the fire. The imaging unit 103 helps to analyse the fire condition and makes the extinguishing operation easier and more effective.
[0036] In an exemplary embodiment of the present invention, the imaging unit 103 captures real-time images of the surrounding area and transmits them to a processor for analysis. When detecting a grass fire, the image recognition protocols analyse the flame's characteristics such as its lighter color, rapid flickering motion, and scattered flame pattern. These features are compared against pre-trained profiles specific to grass fires. The microcontroller processes the combined data from the imaging unit 103 and the sensors to accurately locate the grass fire relative to the housing 101, enabling appropriate fire suppression measures.
[0037] In another exemplary embodiment of the present invention, the imaging unit 103 captures real-time images of the surrounding area and transmits them to a processor for analysis. The imaging unit 103 identifies flames exhibiting a denser, orange-red hue with slower, sustained burning patterns. The image recognition protocols match these observations with a pre-trained wood fire profile. Corresponding sensor data indicating higher temperature spikes, thick smoke concentration, and specific gas emissions are validated alongside the visual input. The microcontroller integrates these signals to precisely determine the fire's location and intensity.
[0038] A fire suppression unit 107 is installed on the housing 101 to effectively control and extinguish a detected fire. The fire suppression unit 107 comprises a multi-section chamber 107a storing water and fire extinguishing foam in separate sections. An electronic valve 107b is connected with each of the sections via a hose 107c for selective release of water or foam based on fire type and severity as detected via the imaging unit 103. When a grass fire with low-intensity flames is detected by the imaging unit 103, the microcontroller sends a signal to activate the electronic valve 107b connected to the water section of the chamber 107a. Upon actuation, the electric current energizes the solenoid coil within the electronic valve 107b connected to the water section of the chamber 107a.
[0039] This energization creates a magnetic field that actuates the valve 107b, causing the valve 107b to open and allow water to flow through the hose 107c over the fire. Water continues to be dispensed for a duration controlled by the microcontroller, which constantly receives feedback from the imaging unit 103 and the sensors monitoring fire status by detecting temperature spikes, flames, or toxic gases, in order to suppress fires or reduce intensity for safe evacuation paths. Once the imaging unit 103 confirms the fire has diminished or been extinguished, the microcontroller de-energizes the solenoid coil, closing the valve 107b and stopping water flow.
[0040] In case the imaging unit 103 detects a wood fire with high-intensity flames, the microcontroller sends a signal to energize the solenoid coil within the electronic valve 107b connected to the foam section of the chamber 107a. Energizing the coil creates a magnetic field that actuates the valve 107b, opening the valve 107b to allow foam concentrate to flow through the discharge hose 107c. The foam is then mixed with air via an air-aspirating venturi valve to produce a thick, stable foam blanket. Foam is dispensed for a controlled duration while the imaging unit 103 and sensors continuously monitor fire conditions by detecting temperature spikes, flames, or toxic gases, in order to suppress fires or reduce intensity for safe evacuation paths. Once the fire intensity decreases or the fire is extinguished, the microcontroller de-energizes the solenoid coil, closing the valve 107b to stop foam discharge and ensuring efficient and targeted fire suppression.
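A compact sketch of the selective-release control flow of paragraphs [0038] to [0040] follows, assuming hypothetical solenoid pins and fire-type codes; the imaging-unit classification and sensor feedback are stubbed as placeholders rather than the patented firmware.

```cpp
// Illustrative suppressant-selection logic for the fire suppression
// unit 107 (assumed pins and fire-type codes; a sketch only).
#include <Arduino.h>

const int WATER_VALVE_PIN = 7;  // solenoid of valve 107b, water section
const int FOAM_VALVE_PIN  = 8;  // solenoid of valve 107b, foam section

enum FireType { NO_FIRE, GRASS_FIRE, WOOD_FIRE };

void selectSuppressant(FireType type) {
  // Grass fire (low intensity) -> water; wood fire (high intensity) -> foam.
  digitalWrite(WATER_VALVE_PIN, type == GRASS_FIRE ? HIGH : LOW);
  digitalWrite(FOAM_VALVE_PIN,  type == WOOD_FIRE  ? HIGH : LOW);
}

void setup() {
  pinMode(WATER_VALVE_PIN, OUTPUT);
  pinMode(FOAM_VALVE_PIN, OUTPUT);
}

void loop() {
  FireType type = GRASS_FIRE;  // placeholder for the imaging unit's result
  bool stillBurning = true;    // placeholder for sensor/imaging feedback
  if (stillBurning) selectSuppressant(type);
  else selectSuppressant(NO_FIRE);  // de-energize both solenoids
  delay(200);
}
```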
[0041] An evacuation module 108 is installed on the housing 101, configured for safe evacuation of a human or wildlife to a pre-saved safe zone. The evacuation module 108 comprises a plurality of proximity sensors arranged on the housing 101 and coupled with the primary motorized omnidirectional wheels 106 to detect the presence of a human/wildlife in proximity and position the housing 101 near the human/wildlife. While extinguishing the fire, the microcontroller sends a signal to activate the proximity sensors. Upon activation, the proximity sensors emit ultrasonic waves towards the surroundings of the housing 101.
[0042] If a human or wildlife enters the detection zone, the emitted signals reflect off the body of the human or wildlife and return to the proximity sensors, where the time delay or intensity of the returned signal is measured to calculate the distance and presence of the human or wildlife. This data is processed to differentiate between living beings and inanimate objects based on signal patterns and proximity thresholds. The imaging unit 103 continuously provides visual data of the area. The microcontroller then compares the visual and proximity sensor data to detect the presence of a human or wildlife, and guides the primary motorized omnidirectional wheels 106, maneuvering the housing 101 closer to the detected human or wildlife to enable safe evacuation.
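The underlying time-of-flight calculation reduces to distance = (echo time × speed of sound) / 2. A sketch follows, assuming HC-SR04-style trigger/echo wiring, which is an assumption since the source does not name a specific sensor.

```cpp
// Illustrative ultrasonic time-of-flight distance measurement for the
// proximity sensors (assumed trigger/echo wiring and pins).
#include <Arduino.h>

const int TRIG_PIN = 9, ECHO_PIN = 10;

float readDistanceM() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);  // 10 us trigger pulse
  digitalWrite(TRIG_PIN, LOW);
  // Echo pulse width equals the round-trip time of the ultrasonic burst.
  unsigned long us = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout
  if (us == 0) return -1.0;                             // nothing in range
  // distance = (time * speed of sound) / 2; ~343 m/s in air at 20 degC.
  return us * 343.0e-6 / 2.0;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  float d = readDistanceM();
  if (d > 0) Serial.println(d);  // metres to the nearest reflector
  delay(100);
}
```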
[0043] A cylindrical expandable body 108a is installed with the housing 101 via an inverted L-shaped telescopic rod 108c that retracts for enclosing the human/wildlife in the body 108a. Upon positioning of the housing 101, the microcontroller sends a signal to actuate the L-shaped telescopic rod 108c, which is powered by a pneumatic unit. The pneumatic unit includes a pneumatic cylinder containing a piston connected to the rod 108c, with compressed air supplied from an air compressor through directional control valves to either side of the piston inside the cylinder. Upon actuation, compressed air is directed into the rear chamber of the cylinder, pulling back the piston and causing the piston to retract the telescopic rod 108c. This retraction lowers the cylindrical body 108a, enclosing the human or wildlife securely within the expandable body 108a and providing a safe and protective enclosure for evacuation.
[0044] An IR (infrared) sensor is integrated with the housing 101 to detect the body dimensions of the human/wildlife. As the body 108a encloses the human or wildlife, the microcontroller sends a signal to the IR (infrared) sensor. Upon activation, the integrated infrared (IR) sensor emits a focused beam of infrared light towards the detected human or wildlife. This IR light reflects off the body of the human or wildlife and returns to the IR (infrared) sensor, which measures the intensity and time delay of the reflected signals. By scanning across multiple angles or using an array of IR sensors, the microcontroller collects data points to create a three-dimensional profile of the body's shape and size. The microcontroller processes this reflected data to calculate precise body dimensions such as height, width, and overall volume. This information is continuously updated in real time as the human or wildlife moves, ensuring accurate measurement.
[0045] A drawer arrangement 108b is integrated with the cylindrical expandable body 108a to expand it based on the body dimensions of the human or wildlife detected by the IR (infrared) sensor. The drawer arrangement 108b consists of a drawer that slides smoothly along rails fixed inside the body 108a structure. These rails provide a stable and guided path for the expansion and compression of the body 108a. Upon detection of the dimensions of the human or wildlife, the microcontroller sends a signal to the drawer arrangement 108b. Upon actuation, the microcontroller powers a motor whose rotational motion is converted into linear motion through a gear arrangement. As the motor rotates, the segments of the body 108a move outward or inward along the rails, causing the body 108a to expand or contract. This controlled motion allows the cylindrical body 108a to adjust precisely, ensuring a snug and secure fit around the human or wildlife during evacuation.
[0046] A pair of extendable bars 201 is installed on opposite inner sides of the body 108a, each coupled with a plate 202. The bars 201 extend for joining the plates 202 via electromagnets 203, forming a continuous platform for seating of the human/wildlife. Upon securing the human or wildlife, the microcontroller sends a signal to actuate the extendable bars 201, which are powered by hydraulic actuators. Upon actuation, the hydraulic actuators are energized to pump hydraulic fluid into cylinders connected to the extendable bars 201. This fluid pressure drives pistons that smoothly extend the extendable bars 201 outward along a guided track. The controlled extension ensures precise positioning of the extendable bars 201, allowing the attached plates 202 to move toward each other.
[0047] Once the bars are fully extended and the plates 202 are in position, the microcontroller sends an electric signal to actuate the electromagnets 203. When energized, the electromagnets 203 generate a strong magnetic field that causes the plates 202 on the opposing bars to securely join together, creating a unified seating surface. The microcontroller maintains power to the electromagnets 203 during the evacuation process, ensuring the platform remains stable and locked. This precise control flow guarantees safe seating and seamless operation throughout evacuation.
[0048] A weight sensor is integrated with the plates 202 to detect the presence of the human or wildlife over the plates 202. During the evacuation operation, the microcontroller sends a signal to activate the weight sensor. The weight sensor here uses a load cell to detect the presence of the human or wildlife. The load cell consists of strain gauges that deform under the weight of the human or wildlife, producing a change in electrical resistance. This change is converted into an electrical signal proportional to the load. The weight sensor continuously sends this signal to the microcontroller, which confirms the presence of the human or wildlife.
[0049] A pair of sliders 204 is coupled with the bars for translating the bars 201 to lift the plates 202 on detection of a human/wildlife over the plates 202. As the presence of the human or wildlife over the plates 202 is detected, the microcontroller sends a signal to actuate the sliders 204. A small DC motor is connected to each of the sliders 204 and is powered by the microcontroller. As the motors rotate, their rotational motion is converted into linear motion via a lead screw arrangement, causing the sliders 204 to translate vertically along guide rails. As the sliders 204 move upward, they gently lift the plates 202, raising the human or wildlife to a safe height above the surface of the cylindrical body 108a. This controlled lifting maintains a secure distance between the human or wildlife and any moving parts or external surfaces, minimizing the risk of injury during the subsequent evacuation maneuver.
[0050] A plurality of secondary motorized omnidirectional wheels 108d is arranged with the body 108a that are actuated in sync with the primary wheels 106 to maneuver the body 108a along with housing 101 for evacuating the human/wildlife to the safe zone. The secondary motorized omnidirectional wheels 108d operate in the same manner as the primary wheels 106, each powered by an individual electric motor and gear assembly. As the human or wildlife is positioned safely, the secondary motorized omnidirectional wheels 108d move in coordination with the primary motorized omnidirectional wheels 106, allowing safe and stable evacuation of humans or wildlife to the safe zone.
[0051] A communication module is integrated with the microcontroller to transmit data and alerts to remote emergency control centers regarding the detected hazardous fire condition along with the location tracked via a GPS module installed on the body 108a. Upon a hazardous fire condition and initiation of human/wildlife evacuation, the microcontroller compiles critical information, including fire severity, type, sensor readings, and evacuation status, into digital alert messages. Simultaneously, the microcontroller receives real-time GPS coordinates by continuously processing satellite signals through the GPS module, enabling accurate localization of the housing 101 within the forested area. Predefined safe routes or dynamically optimized paths, calculated based on terrain data and fire locations, are stored within the linked database. Using the current GPS position and destination waypoints, the microcontroller autonomously plans and controls the movement of the housing 101 by sending precise actuation commands to the primary motorized omnidirectional wheels 106, adjusting speed and direction to navigate obstacles and reach target locations efficiently.
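As an illustration of the waypoint geometry such navigation relies on, the standard haversine distance and initial bearing between the current GPS fix and a safe-zone waypoint can be computed as below; the route planner itself is not specified by the source, and the coordinates shown are hypothetical.

```cpp
// Illustrative great-circle distance and initial bearing between the
// housing's GPS fix and a safe-zone waypoint (standard haversine formula).
#include <cmath>
#include <cstdio>

const double PI_D = 3.14159265358979323846;
const double R_EARTH_M = 6371000.0;  // mean Earth radius in metres

double rad(double deg) { return deg * PI_D / 180.0; }

double haversineM(double lat1, double lon1, double lat2, double lon2) {
  double dLat = rad(lat2 - lat1), dLon = rad(lon2 - lon1);
  double a = std::sin(dLat / 2) * std::sin(dLat / 2) +
             std::cos(rad(lat1)) * std::cos(rad(lat2)) *
             std::sin(dLon / 2) * std::sin(dLon / 2);
  return 2.0 * R_EARTH_M * std::asin(std::sqrt(a));
}

double bearingDeg(double lat1, double lon1, double lat2, double lon2) {
  double dLon = rad(lon2 - lon1);
  double y = std::sin(dLon) * std::cos(rad(lat2));
  double x = std::cos(rad(lat1)) * std::sin(rad(lat2)) -
             std::sin(rad(lat1)) * std::cos(rad(lat2)) * std::cos(dLon);
  return std::fmod(std::atan2(y, x) * 180.0 / PI_D + 360.0, 360.0);
}

int main() {
  // Hypothetical current fix and safe-zone waypoint.
  double d = haversineM(28.6000, 77.2000, 28.6050, 77.2100);
  double b = bearingDeg(28.6000, 77.2000, 28.6050, 77.2100);
  std::printf("safe zone: %.0f m away, bearing %.0f deg\n", d, b);
}
```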
[0052] The microcontroller integrates data from the sensing module 102 and the live video feed captured by the imaging unit 103, transmitting this information to remote emergency control centers via the communication module to keep the area safer. This continuous exchange allows real-time inspection and monitoring of forest conditions, enabling quick decision-making and coordinated firefighting efforts. The GPS-based navigation ensures the housing 101 arrives accurately at designated fire zones or inspection points, optimizing fire suppression and evacuation operations.
[0053] A bioacoustic module is installed on the housing 101 and comprises a microphone array 109 coupled with the microcontroller for species-specific audio signature recognition. During the fire extinguishing and evacuation operation, the microcontroller sends a signal to activate the microphone array 109. The microphone array 109 includes high-sensitivity directional microphones arranged to provide 360-degree spatial coverage, allowing it to continuously capture ambient forest sounds, including vocalizations and calls of various wildlife species. The raw audio signals from the microphones are digitized and undergo preprocessing steps such as noise filtering to eliminate background noise, echo cancellation to reduce sound reflections, and beamforming protocols to isolate sounds from specific directions.
[0054] The microcontroller extracts distinctive audio features, including Mel-frequency cepstral coefficients (MFCCs), spectrogram patterns, pitch, duration, and harmonic content, which serve as unique identifiers for different species' calls. These features are then analysed by an AI-based classifier running on the microcontroller, which has been pre-trained on a database of species-specific vocalizations relevant to the forest environment. Using machine learning protocols, the classifier compares extracted features with stored species-specific audio signatures to accurately recognize the species or disregard irrelevant sounds. The recognition results are then relayed to the microcontroller for further action.
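As a greatly simplified stand-in for this classification stage, the sketch below matches an extracted feature vector against stored signatures by nearest-neighbour Euclidean distance; real MFCC extraction and the pre-trained model are beyond its scope, and all feature values are hypothetical.

```cpp
// Simplified nearest-neighbour matcher standing in for the species
// classifier (hypothetical features; not the disclosed model).
#include <cmath>
#include <cstdio>
#include <string>
#include <vector>

struct Signature { std::string species; std::vector<double> features; };

double dist(const std::vector<double>& a, const std::vector<double>& b) {
  double s = 0;
  for (size_t i = 0; i < a.size(); ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
  return std::sqrt(s);
}

// Returns the best-matching species, or "" if nothing is close enough.
std::string classify(const std::vector<double>& f,
                     const std::vector<Signature>& lib, double maxDist) {
  std::string best;
  double bestD = maxDist;
  for (const auto& s : lib) {
    double d = dist(f, s.features);
    if (d < bestD) { bestD = d; best = s.species; }
  }
  return best;
}

int main() {
  std::vector<Signature> lib = {
    {"deer", {0.8, 0.1, 0.3}},  // hypothetical MFCC-style summaries
    {"boar", {0.2, 0.7, 0.5}},
  };
  std::string sp = classify({0.75, 0.15, 0.25}, lib, 0.5);
  std::printf("recognized: %s\n", sp.empty() ? "(none)" : sp.c_str());
}
```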
[0055] A plurality of directional bioacoustic emitters 110 is installed on the housing 101 to broadcast species-specific distress or guidance signals for wildlife evacuation on recognition of an audio signature. Once the bioacoustic module detects and identifies a species through its audio signature recognition, the microcontroller retrieves pre-stored distress or guidance calls corresponding to the identified species from the linked database. The directional emitters 110, which focus sound waves precisely in specific directions, are then electronically activated to emit the species-specific distress calls, ensuring that the signals are effectively directed toward the detected wildlife in the vicinity. This focused emission minimizes sound dispersion and environmental noise interference, thereby increasing the likelihood of successful communication. The emitted distress or guidance calls serve to alert, calm, or lead animals away from fire-affected areas toward safer zones, thereby facilitating organized and species-appropriate evacuation during emergency conditions.
[0056] A plurality of ultrasonic and PIR sensors is integrated with the housing 101 to assess wildlife response and adapt the acoustic signals accordingly. While emitting the species-specific distress signals, the microcontroller sends a signal to activate the ultrasonic and PIR sensors. Upon activation, the ultrasonic sensor emits high-frequency sound waves beyond the range of human hearing. These waves travel through the air and reflect off nearby objects, including animals. The ultrasonic sensor then measures the time interval between the emission and reception of these echoes to calculate the distance and detect the movement of wildlife in proximity to the housing 101. This allows the device to precisely monitor whether animals are moving away from the danger zone after distress signals are emitted. If the ultrasonic sensors detect that the wildlife remains within the area, this information is fed back to the microcontroller.
[0057] Simultaneously, the Passive Infrared (PIR) sensor detects changes in infrared radiation, which is emitted as heat by living beings. When wildlife or humans enter the sensor's field of view, the PIR sensor senses the sudden changes in body heat signatures, confirming the presence of warm-blooded animals near the housing 101. This thermal detection complements the ultrasonic data by verifying the continuous presence of wildlife in the hazard area, even if movement is minimal or slow. If the PIR sensor detects persistent heat signatures after the initial distress signals, this indicates the animals have not evacuated. This data is then fed to the microcontroller. In response, the microcontroller compares data from both sensors and commands the bioacoustic emitters 110 to intensify or repeat the distress calls, thereby encouraging effective evacuation.
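A sketch of this feedback policy follows, with the escalation steps an assumption for illustration rather than part of the disclosure.

```cpp
// Illustrative sensor-feedback loop: if ultrasonic and PIR data show
// wildlife still present, the distress call is repeated at a higher
// level (the escalation policy here is an assumed example).
#include <cstdio>

struct WildlifeStatus {
  bool inRange;        // ultrasonic echo still within the hazard zone
  bool heatSignature;  // PIR still reporting a warm body
};

int adaptEmitterLevel(const WildlifeStatus& s, int currentLevel, int maxLevel) {
  bool stillPresent = s.inRange || s.heatSignature;
  if (!stillPresent) return 0;                        // animals left: stop
  return currentLevel < maxLevel ? currentLevel + 1   // intensify the call
                                 : currentLevel;      // hold at the ceiling
}

int main() {
  int level = 1;
  WildlifeStatus s{true, true};                  // both sensors positive
  level = adaptEmitterLevel(s, level, 5);
  std::printf("emitter level now %d\n", level);  // -> 2
}
```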
[0058] A self-deploying communication relay module is integrated with the housing 101 for signal blackout zones and comprises a plurality of signal strength sensors for detecting signal strength. During the fire extinguishing and evacuation operation, the microcontroller sends a signal to activate the signal strength sensors. The signal strength sensors continuously monitor wireless communication signals in the housing's surrounding environment by measuring parameters such as signal amplitude, signal-to-noise ratio (SNR), and bit error rates. The sensors periodically sample the incoming signal frequencies to assess the quality and strength of available communication channels. When they detect that the signal strength drops below a preset threshold, indicating weak or unreliable connectivity, they send a real-time alert to the microcontroller.
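An illustrative threshold check for this monitoring step is shown below; the RSSI, SNR, and bit-error limits are assumptions chosen for the example.

```cpp
// Illustrative weak-connectivity check for the signal strength sensors
// (thresholds are assumed values, not from the disclosure).
#include <cstdio>

bool connectivityWeak(double rssiDbm, double snrDb, double berPct) {
  // Assumed floor: below -90 dBm, under 10 dB SNR, or more than 1% bit
  // errors counts as weak or unreliable connectivity.
  return rssiDbm < -90.0 || snrDb < 10.0 || berPct > 1.0;
}

int main() {
  if (connectivityWeak(-95.0, 8.0, 0.2))
    std::printf("alert: deploy mesh relay unit\n");  // raise telescopic pole 112
}
```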
[0059] A retractable wireless mesh networking relay unit with an antenna 111 is mounted on a telescopic pole 112 to establish a dynamic communication mesh network in remote or damaged forest regions having reduced signal strengths. Upon detection of weak or unreliable connectivity, the microcontroller sends a signal to activate the retractable wireless mesh networking relay unit and the telescopic pole 112. The telescopic pole 112 works in the same manner as the L-shaped telescopic rod 108c to extend vertically from the housing 101. This mechanical extension raises the wireless mesh networking relay unit and the antenna 111 above ground level and surrounding obstacles such as dense vegetation or uneven terrain, positioning the antenna 111 to achieve optimal line-of-sight communication with other network nodes and control centres.
[0060] Once extended, the wireless mesh relay unit activates to dynamically establish and maintain a robust mesh network by connecting with nearby relay nodes. This multi-hop network selects the most reliable communication paths, allowing data packets, including sensor readings, alerts, and live video feeds, to be transmitted efficiently across challenging forested or damaged environments. By elevating the antenna 111, the wireless mesh relay unit significantly improves signal strength and coverage area, reducing interference and packet loss, thus ensuring continuous and stable communication even in remote or signal blackout zones.
[0061] Lastly, a battery is associated with the device to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode. The battery uses oxidation/reduction chemical reactions to do work on charge and produce a voltage between its anode and cathode, thus producing the electrical energy used to do work in the device.
[0062] The present invention works best in the following manner, where the mobile housing 101 is placed in a forested area. The sensing module 102, comprising the flame sensor 102a, smoke sensor 102b, humidity sensor 102c, gas sensor 102d, and temperature sensor 102e, detects fire hazards in real time. Simultaneously, the artificial intelligence-based imaging unit 103, mounted on the ball-and-socket joint 105 by the link 104, captures real-time visuals of the fire area and helps in analysing grass or wood fires. On confirming the fire, the plurality of primary motorized omnidirectional wheels 106 is actuated to move the housing 101 towards the fire zone. The pair of robotic arms 113 with the flaps 114, guided by the obstacle detection module, clears path obstacles. The fire suppression unit 107, containing the multi-section chamber 107a storing water and fire extinguishing foam in separate sections, selectively releases the appropriate suppressant through the electronic valve 107b and the hose 107c based on fire type. The evacuation module 108, comprising the plurality of proximity sensors, detects the human or wildlife presence, on which the inverted L-shaped telescopic rod 108c retracts the cylindrical expandable body 108a over the human or wildlife to provide a safe enclosure. The IR (infrared) sensor detects the dimensions of the human/wildlife, and the drawer arrangement 108b expands the body 108a accordingly. The pair of extendable bars 201 brings the plates 202 close to each other, and the electromagnets 203 create the secure seating platform.
[0063] In continuation, the weight sensor confirms the human or wildlife presence, triggering the pair of sliders 204 to lift the seat. Then the plurality of secondary motorized omnidirectional wheels 108d maneuvers the body 108a with the housing 101 toward the predefined safe zone. The communication module integrated with the GPS module informs the emergency control centers regarding fire severity, evacuation status, and the housing 101 location. The bioacoustic module with the microphone array 109 performs species-specific audio signature recognition, and then the plurality of directional bioacoustic emitters 110 emits species-specific distress or guidance calls to encourage wildlife evacuation. The plurality of ultrasonic and PIR sensors monitors animal movement and presence in order to regulate the acoustic signals accordingly. In signal blackout zones, the plurality of signal strength sensors detects weak communication levels, and then the telescopic pole 112 raises the wireless mesh networking relay unit with the antenna 111 to enhance connectivity and ensure continuous data flow even in obstructed or remote environments.
[0064] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) An autonomous emergency response device for forest fire detection and evacuation, comprising:
i) a mobile housing 101 placed within a forest area and installed with a sensing module 102 paired with an artificial intelligence-based imaging unit 103 for detecting smoke, flames, gas, humidity, and temperature variations;
ii) an inbuilt microcontroller paired with the sensing module 102 and imaging unit 103 for detecting a nearby hazardous fire condition based on the detected presence of smoke, flame, reduced levels of humidity, and elevated gas levels and temperature, and accordingly a plurality of primary motorized omnidirectional wheels 106 configured underneath the housing 101 are actuated for maneuvering and positioning the housing 101 near the area with fire conditions;
iii) a fire suppression unit 107 installed on the housing 101, the suppression unit comprising:
a) a multi-section chamber 107a stored with water and fire extinguishing foam in separate sections; and
b) an electronic valve 107b connected with each of the sections via a hose 107c for selective release of water or foam based on fire type and severity as detected via the imaging unit 103;
iv) an evacuation module 108 installed on the housing 101 configured for safe evacuation of a human or wildlife to a pre-saved safe zone, the evacuation module 108 comprising:
a) a plurality of proximity sensors arranged on the housing 101 and coupled with the primary wheels 106 to detect presence of a human/wildlife in proximity, and positioning the housing 101 near the human/wildlife;
b) a cylindrical expandable body 108a installed with the housing 101 via an inverted L-shaped telescopic rod 108c, the rod configured to retract to enclose the human/wildlife in the body 108a, and the body 108a configured to expand, via a drawer arrangement 108b, based on the body dimensions of the human/wildlife as detected via the imaging unit 103 in sync with an IR (infrared) sensor installed on the housing 101;
c) a pair of extendable bars 201 installed on opposite inner sides of the body 108a and each configured with a plate 202, the bars configured to extend for joining the plates 202 via electromagnets 203, forming a continuous platform for seating of the human/wildlife;
d) a pair of sliders 204 coupled with the bars 201 for translating the bars 201 to lift the plates 202 on detection of a human/wildlife over the plates 202 via a weight sensor embedded with each of the plates 202; and
e) a plurality of secondary motorized omnidirectional wheels 108d that are actuated in sync with the primary wheels 106 to maneuver the body 108a along with housing 101 for evacuating the human/wildlife to the safe zone.
v) a bioacoustic module comprising a microphone array 109 coupled with the microcontroller, installed on the housing 101 for species-specific audio signature recognition; and
vi) a plurality of directional bioacoustic emitters 110 to broadcast species-specific distress or guidance signals for wildlife evacuation on recognition of audio signature.
2) The device as claimed in claim 1, wherein the sensing module 102 includes a flame sensor 102a, smoke sensor 102b, humidity sensor 102c, gas sensor 102d, and temperature sensor 102e.
3) The device as claimed in claim 1, wherein the imaging unit 103 is configured to detect fire type based on different material flames including, but not limited to, grass and wood, using image recognition protocols.
4) The device as claimed in claim 1, wherein the imaging unit 103 is installed over the housing 101 via a link 104 having a ball and socket joint 105 for providing multi-directional motion to the imaging unit 103 for appropriately scanning the area.
5) The device as claimed in claim 1, wherein the valves 107b controlling water and foam discharge are electronically actuated based on real-time inputs from sensing module 102 detecting temperature spikes, flames, or toxic gases to suppress fires or reduce intensity for safe evacuation paths.
6) The device as claimed in claim 1, further comprising a communication module integrated with the microcontroller configured to transmit data and alerts to remote emergency control centers regarding detected hazardous fire condition along with location tracked via a GPS module installed on the body 108a.
7) The device as claimed in claim 1, further comprising a plurality of ultrasonic and PIR sensors to assess wildlife response and adapt the acoustic signals accordingly.
8) The device as claimed in claim 1, further comprising a self-deploying communication relay module for signal blackout zones, the relay module comprising:
a) a plurality of signal strength sensors for detecting signal strength; and
b) a retractable wireless mesh networking relay unit with an antenna 111 mounted on a telescopic pole 112 to establish a dynamic communication mesh network in remote or damaged forest regions having reduced signal strengths.
9) The device as claimed in claim 1, wherein a pair of robotic arms 113, each installed with a flap 114, is installed on the housing 101 for clearing obstructions in the path of the housing 101, detected via the imaging unit 103 and an obstacle detection module.
10) The device as claimed in claim 1, wherein the microcontroller autonomously navigates the housing 101 along predefined or dynamically optimized routes in forested areas using GPS tracking, enabling real-time inspection, monitoring, and remote surveillance by transmitting the sensing module's data and live video feed captured via the imaging unit 103 to the remote emergency control centers.