
Adaptive Wildlife Deterrent System For Agricultural Protection

Abstract: The present disclosure relates to a wildlife deterrent system (102) for protecting designated areas from animal intrusion. The system includes a weatherproof enclosure (202) housing sensors (204) including motion sensor (204-1) and camera module (204-2) connected to a control unit (212). The control unit (212) processes captured images through detection and classification modules to identify species and activate deterrent actuators (206) including multimodal sound unit (206-1), LED light array (206-2), mechanical shaker (206-3), and olfactory deterrent (206-4). The system transmits detection data to user equipment via wireless network module (104) and receives feedback data to update neural network parameters for improved classification accuracy. The system includes designated zones with water reservoirs and audio transducers to guide deterred wildlife toward safe areas.


Patent Information

Filing Date
12 August 2025
Publication Number
36/2025
Publication Type
INA
Invention Field
ELECTRONICS

Applicants

Amrita Vishwa Vidyapeetham
Amrita Vishwa Vidyapeetham, Amritapuri Campus, Amritapuri, Clappana PO, Kollam - 690525, Kerala, India.

Inventors

1. ABED, Niloofar
School of Sustainable Futures, Amrita University, Amritapuri PO, Kollam - 690525, Kerala, India.
2. RAMESH, Maneesha Vinodini
Edamannel 36, Vallikavu, Clappana PO, Kollam - 690525, Kerala, India.
3. DELDARI, Abtin
School of Sustainable Futures, Amrita University, Amritapuri PO, Kollam - 690525, Kerala, India.
4. MURUGAN, Ramu
6/182C, Kaveri Nagar, Pattanam, Coimbatore - 641016, Tamil Nadu, India.
5. SANKARANNAIR, Sabarinath
Gayathri Bhavan, Prayar North, Prayar PO, Oachira - 690547, Kerala, India.

Specification

Description:

TECHNICAL FIELD
[0001] The present disclosure relates to the field of wildlife management and agricultural protection systems. More particularly, the present disclosure relates to a wildlife deterrent system incorporating neural network-based classification for species-specific deterrence and IoT-enabled remote monitoring capabilities.

BACKGROUND
[0002] The following description of the related art is intended to provide background information pertaining to the field of disclosure. This section may include certain aspects of the art that may be related to various features of the present disclosure. However, it should be appreciated that this section is used only to enhance the understanding of the reader with respect to the present disclosure, and not as admissions of the prior art.
[0003] Wildlife intrusion into agricultural areas presents significant challenges for crop protection and food security. Conventional deterrent systems typically employ fixed deterrent patterns without considering species-specific behavioral characteristics that influence deterrent effectiveness. These systems often fail to provide adequate protection against diverse wildlife species, resulting in continued crop damage and economic losses for agricultural communities.
[0004] Existing approaches generally utilize single-mode deterrence methods that do not account for the complex sensory characteristics of different animal species. Such systems lack real-time classification capabilities and adaptive learning mechanisms, limiting their effectiveness in diverse agricultural environments. However, recent developments in sensor technology and processing capabilities have created opportunities for more sophisticated wildlife management solutions.
[0005] Therefore, there exists a need for an improved wildlife deterrent system that can provide species-specific deterrence through intelligent processing capabilities, multimodal actuator systems, and adaptive learning mechanisms to address the limitations of conventional approaches in agricultural protection applications.

OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0007] An object of the present disclosure is to provide a wildlife deterrent system that enables species-specific deterrence for enhanced agricultural protection.
[0008] Another object of the present disclosure is to provide a wildlife deterrent system that incorporates adaptive learning capabilities for continuous improvement in deterrent effectiveness.
[0009] Yet another object of the present disclosure is to provide a wildlife deterrent system that supports humane wildlife management through integrated recovery zone functionality.

SUMMARY
[0010] This section is provided to introduce certain objects and aspects of the present disclosure in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter.
[0011] The present disclosure relates to an adaptive wildlife deterrent system for agricultural environments. The system can use neural network-based classification modules to identify wildlife species in real-time based on motion detection and visual analysis. Multimodal deterrent actuators can generate species-specific responses with dynamic parameter adjustment. IoT connectivity may enable remote monitoring and feedback-based learning.
[0012] The system can include a weatherproof enclosure housing ultrasonic, camera and environmental sensors connected to a control unit. A processing engine can execute multiple modules including detection module for object identification, classification module for species recognition, behavioral module for parameter retrieval, and deterrent module for actuator selection. These modules can generate control signals based on identified species. Multimodal actuators can produce acoustic, visual, mechanical, and olfactory deterrents. Real-time feedback processing may ensure continuous model improvement. A wireless network module can enable cloud-based monitoring and user interaction.
[0013] The method can include detecting wildlife through motion sensors and activating camera modules for image capture. Neural network models can process images to classify species and retrieve behavioral parameters. The system can generate species-specific deterrent combinations through multimodal actuators. Recovery zones with supplemental resources can reduce wildlife stress. The process can incorporate user feedback for neural network updates, ensuring adaptive learning and improved deterrence effectiveness over time.
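The detect-capture-classify-deter loop summarized in paragraph [0013] can be sketched in Python. The species names, deterrent parameters, and function names below are illustrative assumptions; only the high-confidence gating (confidence scores above 0.99, per paragraph [0022]) is taken from the specification.

```python
from dataclasses import dataclass

# Hypothetical species-to-deterrent mapping; the entries and values are
# illustrative, not taken from the specification.
DETERRENT_TABLE = {
    "elephant": {"sound_hz": 15, "led_pattern": "strobe", "shaker": True},
    "wild_boar": {"sound_hz": 30_000, "led_pattern": "blue_flash", "shaker": False},
}

@dataclass
class Detection:
    species: str
    confidence: float

def handle_motion_event(classify, threshold=0.99):
    """Run one detect -> classify -> deter cycle.

    `classify` stands in for camera capture plus neural-network
    inference; it returns a Detection for the triggering motion event.
    """
    detection = classify()
    if detection.confidence < threshold:
        # Low-confidence detections are ignored to limit false positives.
        return None
    params = DETERRENT_TABLE.get(detection.species)
    return {"species": detection.species, "actuators": params}
```

A high-confidence elephant detection would return the elephant actuator set, while a low-confidence one returns `None` and triggers no deterrent.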
[0014] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF DRAWINGS
[0015] The accompanying drawings are included to provide a further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.
[0016] FIG. 1 illustrates a block diagram of a wildlife deterrent system, in accordance with an embodiment of the present disclosure.
[0017] FIG. 2 illustrates a schematic representation of the weatherproof enclosure and its components, in accordance with an embodiment of the present disclosure.
[0018] FIG. 3 illustrates a control unit architecture, in accordance with an embodiment of the present disclosure.
[0019] FIG. 4 illustrates a flow diagram depicting a method for deterring wildlife from designated areas, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[0020] The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth.
[0021] In an embodiment of the present disclosure, the present invention relates to a wildlife deterrent system (102) for protecting designated areas from animal intrusion, the system including at least one weatherproof enclosure (202) positioned to house a plurality of sensors (204) and deterrent actuators (206) controlled by a central control unit (212). The wildlife deterrent system (102) can provide automated, species-specific deterrence responses through intelligent processing of sensor data and activation of appropriate deterrent mechanisms.
[0022] In an embodiment of the present disclosure, the present disclosure relates to a wildlife deterrent system (102) for protecting designated areas from animal intrusion, the system including at least one weatherproof enclosure (202) positioned to house a plurality of sensors (204) including at least one motion sensor and at least one image capture device, a plurality of deterrent actuators (206), and a control unit (212) electrically connected to the plurality of sensors (204) and the plurality of deterrent actuators (206). The wildlife deterrent system (102) can provide automated, species-specific deterrence responses through intelligent processing of sensor data and activation of appropriate deterrent mechanisms. The system can address the technical problem of inadequate wildlife detection accuracy and non-specific deterrence methods in existing agricultural protection systems. The technical solution can involve implementing a neural network-based classification system that can generate species-specific deterrent responses with confidence scores above 0.99, thereby reducing false positives and improving deterrent effectiveness.
[0023] FIG. 1 illustrates a block diagram of a wildlife deterrent system, in accordance with an embodiment of the present disclosure.
[0024] In an embodiment, referring to FIG. 1, the wildlife deterrent system (102) can operate within a networked environment (100) that facilitates communication between the system and multiple user equipment devices (108-1, 108-2, 108-N). The system (102) can communicate through a network (104) with a centralized server (110), enabling remote monitoring and control capabilities through a web dashboard (112) accessible to users (106-1, 106-2, 106-N). The networked environment (100) can provide scalable deployment options for multiple wildlife deterrent systems (102) across different geographical locations.
[0025] In an embodiment of the present disclosure, the network (104) can provide bidirectional communication pathways, allowing the wildlife deterrent system (102) to transmit detection data including the identified category and timestamp to at least one user equipment through a wireless network module (104). The centralized server (110) can aggregate data from multiple wildlife deterrent systems (102) deployed across different locations, enabling comprehensive monitoring of wildlife activity patterns and system performance metrics. The web dashboard (112) can provide real-time visualization of detection events, system status, and statistical analysis of wildlife behavior patterns.
[0026] FIG. 2 illustrates a schematic representation of a weatherproof enclosure and its components, in accordance with an embodiment of the present disclosure.
[0027] In an exemplary embodiment, as illustrated in FIG. 2, the weatherproof enclosure (202) can serve as a protective housing for both input and output components of the wildlife deterrent system (102). The weatherproof enclosure (202) can be specifically engineered to withstand outdoor environmental conditions including moisture, dust, temperature variations, and physical impacts while maintaining optimal operating conditions for internal electronics. The enclosure (202) can incorporate IP65 or higher ingress protection ratings to ensure reliable operation in diverse weather conditions.
[0028] In an embodiment, the weatherproof enclosure (202) can incorporate ventilation systems and thermal management features to prevent condensation and maintain stable operating temperatures for electronic components. The enclosure (202) may include corrosion-resistant materials such as aluminum alloy or stainless steel to ensure long-term durability in outdoor environments. The enclosure (202) can further include electromagnetic interference shielding to protect sensitive electronic components from external electrical interference.
[0029] In an embodiment of the present disclosure, the plurality of sensors (204) housed within the weatherproof enclosure (202) can include at least one motion sensor (204-1), at least one camera module (204-2), and at least one environmental sensor (204-3). Each sensor component can contribute specialized detection capabilities to overall system functionality. The sensor array can provide multi-modal detection capabilities that can enhance accuracy and reduce false positives compared to single-sensor approaches.
[0030] In an exemplary embodiment, at least one motion sensor (204-1) can be positioned to detect motion within a predetermined range of 30 centimeters to 7.5 meters. The motion sensor (204-1) can utilize various detection technologies including ultrasonic waves at a 40 kHz frequency, passive infrared detection, LiDAR, or HC-SR04 based sensors. The motion sensor (204-1) can continuously monitor the detection area and generate detection signals when motion is detected, with adaptive sensitivity adjustment based on environmental conditions.
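For an ultrasonic sensor such as the HC-SR04 named above, the 30 cm to 7.5 m window translates to a round-trip echo-time check. This is a minimal sketch assuming a standard speed of sound; the function names are illustrative.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 °C

def echo_time_to_distance_m(echo_seconds: float) -> float:
    """Convert a round-trip ultrasonic echo time to distance in metres
    (divide by two because the pulse travels out and back)."""
    return echo_seconds * SPEED_OF_SOUND_M_S / 2.0

def in_detection_range(echo_seconds: float,
                       min_m: float = 0.30,
                       max_m: float = 7.5) -> bool:
    """True when the echo falls inside the 30 cm - 7.5 m detection
    window stated in the specification."""
    return min_m <= echo_time_to_distance_m(echo_seconds) <= max_m
```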
[0031] In an embodiment of the present disclosure, at least one camera module (204-2) can be positioned to capture images of detected objects upon receiving an activation signal from the control unit (212). The camera module (204-2) can incorporate high-resolution imaging capabilities with automatic focus and exposure adjustment features to ensure optimal image quality for species identification. The camera module (204-2) may include infrared illumination capabilities for low-light operation and can support various image formats for processing flexibility.
[0032] In an exemplary embodiment, the camera module (204-2) can be positioned at optimal angles within the weatherproof enclosure (202) to ensure comprehensive coverage of a monitored area while minimizing blind spots and occlusion effects. The camera module (204-2) can capture images in various lighting conditions through adaptive exposure controls and infrared illumination capabilities. The camera module (204-2) may include image stabilization features to reduce blur caused by environmental vibrations or movement.
[0033] In an embodiment, the environmental sensors (204-3) can monitor ambient conditions including temperature, humidity, light levels, atmospheric pressure, and wind speed and direction to provide contextual data that enables the wildlife deterrent system (102) to adapt its responses based on current environmental parameters. The sensors (204-3) can continuously collect data on weather conditions, seasonal variations, and daily cycles, allowing the system to adjust detection sensitivity and deterrent strategies. The collected data can also support predictive modules that anticipate wildlife behavior patterns and provide data logging capabilities for long-term behavioral analysis and system optimization.
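One way such contextual adaptation could work is a simple sensitivity adjustment driven by wind and light readings. The scaling factors and thresholds below are assumptions for illustration; the specification does not prescribe them.

```python
def adjust_sensitivity(base_sensitivity: float,
                       wind_speed_ms: float,
                       light_lux: float) -> float:
    """Illustrative contextual tuning: reduce motion-detection
    sensitivity in high wind (vegetation movement causes false
    positives) and raise it in darkness, when wildlife activity is
    expected. All factors are hypothetical.
    """
    s = base_sensitivity
    if wind_speed_ms > 8.0:   # strong wind -> damp sensitivity
        s *= 0.7
    if light_lux < 10.0:      # night-time -> boost sensitivity
        s *= 1.2
    return min(s, 1.0)        # clamp to the valid range
```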
[0034] In an embodiment of the present disclosure, the plurality of deterrent actuators (206) housed within the weatherproof enclosure (202) can include at least one multimodal sound unit (206-1) to generate sound waves at multiple frequency ranges, at least one LED light array (206-2) to produce visual deterrence, at least one mechanical shaker (206-3) to create motion-based deterrence, and at least one olfactory deterrent (206-4) to release deterrent substances. The deterrent actuators (206) can provide comprehensive sensory deterrence targeting multiple animal sensory modalities simultaneously.
[0035] In an exemplary embodiment, at least one multimodal sound unit (206-1) can include at least one piezoelectric transducer receiving PWM signals from the control unit (212) through interfaces (306). The multimodal sound unit (206-1) can generate mechanical vibrations at 20-45 kHz through voltage-induced deformation of a piezoelectric element when a deterrent module (316) selects ultrasonic deterrence for small mammals. The piezoelectric transducer can provide precise frequency control and may generate high-intensity acoustic signals with minimal power consumption.
[0036] In an embodiment, the multimodal sound unit (206-1) can generate pressure waves at 1-20 Hz through controlled oscillation of a transducer assembly when the deterrent module (316) selects infrasonic deterrence for large mammals. The unit (206-1) can produce acoustic emissions at 1-8 kHz through diaphragm displacement when the deterrent module (316) selects audible deterrence, where the control unit (212) may switch between frequency ranges based on behavioral parameters retrieved from a behavioral module (318). The multimodal sound unit (206-1) can incorporate directional acoustic focusing to concentrate sound energy toward detected animals.
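The frequency-range switching in paragraphs [0035]-[0036] can be sketched as a lookup keyed on a coarse size class. The band limits come from the specification; the size-class labels and the fall-through default are illustrative assumptions.

```python
def select_frequency_band(size_class: str):
    """Map a size class to an acoustic band: ultrasonic 20-45 kHz for
    small mammals, infrasonic 1-20 Hz for large mammals, and audible
    1-8 kHz otherwise (per paragraphs [0035]-[0036]).

    Returns (band name, low Hz, high Hz).
    """
    bands = {
        "small_mammal": ("ultrasonic", 20_000, 45_000),
        "large_mammal": ("infrasonic", 1, 20),
    }
    # Unlisted classes fall back to the audible band.
    return bands.get(size_class, ("audible", 1_000, 8_000))
```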
[0037] In an embodiment of the present disclosure, a piezoelectric transducer within the multimodal sound unit (206-1) can respond to PWM signals that vary in frequency and duty cycle based on species classification. The transducer assembly can incorporate multiple resonant chambers and acoustic focusing elements to optimize sound propagation and directional control for species-specific deterrence. The transducer may include protective housing to prevent damage from environmental exposure while maintaining acoustic performance.
[0038] In an exemplary embodiment, at least one LED light array (206-2) can produce visual deterrence through programmable light patterns at predetermined wavelengths. The LED light array (206-2) can incorporate multiple LED elements capable of producing intense blue light at 450-495 nm wavelengths for species that demonstrate sensitivity to this spectrum, and white light at 400-700 nm for broader visual deterrence applications. The LED array can provide precise wavelength control and may generate high-intensity illumination patterns optimized for specific animal visual sensitivities.
[0039] In an embodiment, the LED light array (206-2) can support programmable flash patterns, strobing effects, and continuous illumination modes based on species-specific behavioral responses. The array (206-2) can incorporate thermal management systems to maintain optimal LED performance and may prevent overheating during extended operation periods. The LED light array (206-2) can include diffusion optics to create uniform illumination patterns and may incorporate color-changing capabilities for enhanced deterrent effectiveness.
[0040] In an embodiment of the present disclosure, at least one mechanical shaker (206-3) can include a brushed DC motor drawing 2-5 amperes at 12 volts to rotate a shaft-mounted element through 360-degree cycles at frequencies between 0.5 Hz and 1 Hz. The mechanical shaker (206-3) can create motion-based deterrence through controlled physical movement mechanisms. The motor can provide precise speed control and may generate vibrations that simulate predator movement or territorial marking behaviors.
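The 0.5-1 Hz rotation range corresponds to a rotation period between 1 and 2 seconds. A small helper, with the clamping behaviour as an assumed safety measure, could be:

```python
def shaker_period_s(rotation_hz: float) -> float:
    """Period of one full 360-degree rotation, with the requested rate
    clamped to the 0.5-1 Hz range given in the specification (the
    clamping itself is an assumed safeguard, not a stated feature)."""
    clamped = min(max(rotation_hz, 0.5), 1.0)
    return 1.0 / clamped
```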
[0041] In an exemplary embodiment, the mechanical shaker (206-3) can incorporate variable speed control and directional rotation capabilities, allowing the system to generate different motion patterns based on detected species and behavioral requirements. The shaker (206-3) can include vibration isolation mounting to prevent mechanical stress on the weatherproof enclosure (202). The mechanical shaker (206-3) may incorporate position sensors to provide feedback on actuator movement and ensure proper operation.
[0042] In an embodiment, at least one olfactory deterrent (206-4) can release deterrent substances based on species-specific sensory characteristics and behavioral responses. The olfactory deterrent (206-4) can incorporate controlled release mechanisms for dispensing various chemical substances including natural and synthetic scent deterrents. The olfactory system can provide targeted chemical deterrence that may exploit specific animal aversion responses to predator scents or territorial markers.
[0043] In an embodiment of the present disclosure, the olfactory deterrent (206-4) can include reservoir systems for storing multiple deterrent substances and programmable dispensing controls for releasing appropriate chemical substances based on species identification results. The system (206-4) can incorporate safety features to prevent accidental exposure and may ensure environmental compatibility. The olfactory deterrent (206-4) can include concentration control mechanisms to optimize deterrent effectiveness while minimizing environmental impact.
[0044] FIG. 3 illustrates a control unit architecture, in accordance with an embodiment of the present disclosure.
[0045] In an exemplary embodiment, referring to FIG. 3, the control unit (212) can serve as a central processing hub of the wildlife deterrent system (102), including a processor (302), a memory (304) coupled to the processor (302), and interfaces (306). The control unit (212) can coordinate all system operations including sensor data processing, species identification, and deterrent activation. The control unit (212) may provide real-time processing capabilities that enable rapid response to wildlife detection events.
[0046] In an embodiment, the processor (302) can execute 1.5 billion instructions per second through four processing cores, providing computational capabilities for real-time image processing and neural network inference. The processor (302) can incorporate specialized instruction sets for digital signal processing and machine learning operations, optimizing performance for wildlife detection and classification tasks. The processor (302) may include dedicated hardware acceleration for neural network computations to reduce processing latency.
[0047] In an embodiment of the present disclosure, the processor (302) can support parallel processing architectures that enable simultaneous execution of multiple system functions including sensor monitoring, image analysis, and communication protocols. The processor (302) can incorporate power management features that may optimize energy consumption while maintaining processing performance. The processor (302) can include error correction capabilities to ensure reliable operation in challenging environmental conditions.
[0048] In an exemplary embodiment, the memory (304) can store a trained neural network model including convolutional layers with 11,141,405 parameters executing forward propagation at 5.536 milliseconds per image. The memory (304) can transfer data at 3200 MT/s through a 64-bit bus interface, ensuring high-speed access to neural network parameters and processing data. The memory (304) may provide sufficient storage capacity for multiple neural network models and historical detection data.
[0049] In an embodiment of the present disclosure, interfaces (306) can facilitate communication between the processor (302) and external components including the plurality of sensors (204) and the plurality of deterrent actuators (206). The interfaces (306) can manage PWM signal generation for actuator control and may handle bidirectional data transmission between system components. The interfaces (306) can provide electrical isolation to protect sensitive components from environmental interference.
[0050] In an embodiment, a processing engine (308) can include multiple specialized modules that coordinate to analyze sensor data and generate appropriate deterrent responses. The processing engine (308) can include a detection module (312), classification module (314), behavioral module (318), and deterrent module (316). The processing engine (308) may implement pipeline processing architecture to optimize throughput and reduce response latency.
[0051] In an embodiment of the present disclosure, the detection module (312) can process captured images to identify object presence within a monitored area. The detection module (312) can incorporate edge detection, feature extraction, and object segmentation modules to isolate potential wildlife subjects from background imagery. The detection module (312) may implement adaptive thresholding techniques to optimize detection performance across varying lighting conditions.
[0052] In an exemplary embodiment, the detection module (312) can analyze image data in real-time, identifying motion patterns, shape characteristics, and size parameters that indicate wildlife presence. The module (312) can apply filtering modules to reduce false positives caused by environmental factors such as vegetation movement or weather conditions. The detection module (312) may incorporate temporal analysis to distinguish between transient environmental changes and persistent wildlife presence.
[0053] In an embodiment, the classification module (314) can identify and classify objects detected by the detection module (312) into predetermined categories based on extracted visual features. The classification module (314) can generate confidence scores above 0.99 for detected objects, ensuring high accuracy in species identification. The classification module (314) may implement ensemble learning techniques combining multiple neural network models for improved accuracy.
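A confidence score of this kind is typically the softmax probability of the winning class from the network's output logits; the specification does not name the scoring function, so softmax is an assumption here.

```python
import math

def softmax_confidence(logits):
    """Return (argmax index, softmax probability) for a logit vector,
    the kind of per-detection confidence compared against the 0.99
    threshold. Subtracting the max logit keeps exp() numerically stable.
    """
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = probs.index(max(probs))
    return best, probs[best]
```

A sharply peaked logit vector clears the 0.99 bar; a flat one does not, and the corresponding detection would be discarded.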
[0054] In an embodiment of the present disclosure, the classification module (314) can incorporate deep learning modules trained on species-specific datasets, enabling accurate identification of local wildlife species commonly encountered in agricultural environments. The module (314) can process multiple image frames to improve classification accuracy and may reduce misidentification rates. The classification module (314) can implement transfer learning techniques to adapt to new species or environmental conditions.
[0055] In an exemplary embodiment, the behavioral module (318) can access lookup tables containing deterrent parameters for 15 distinct taxonomic classifications, where each classification may link to frequency values, pulse durations, and actuator sequencing data stored as hexadecimal values. The behavioral module (318) can determine actuator parameters corresponding to an identified category by accessing a stored mapping between categories and deterrent responses. The behavioral module (318) may implement adaptive learning modules that update deterrent parameters based on observed effectiveness.
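The lookup table described in paragraph [0055] can be sketched as a mapping from class identifiers to frequency, pulse-duration, and actuator-sequencing values. The specification mentions hexadecimal storage and 15 taxonomic classifications; the specific entries, bitmask layout, and default below are illustrative assumptions.

```python
# Illustrative behavioral lookup table: each taxonomic class ID maps to
# a drive frequency (Hz), pulse duration (ms), and an actuator-sequence
# bitmask (bit 0 = sound, bit 1 = LED, bit 2 = shaker, bit 3 = olfactory
# -- a hypothetical encoding).
BEHAVIORAL_TABLE = {
    0x01: {"freq_hz": 30_000, "pulse_ms": 250, "sequence": 0b0001},  # small mammal
    0x02: {"freq_hz": 15,     "pulse_ms": 900, "sequence": 0b1111},  # large mammal
}

def get_deterrent_parameters(class_id: int):
    """Retrieve actuator parameters for an identified category; unknown
    classes fall back to a conservative audible default (an assumed
    behaviour, not stated in the specification)."""
    return BEHAVIORAL_TABLE.get(
        class_id, {"freq_hz": 4_000, "pulse_ms": 500, "sequence": 0b0011}
    )
```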
[0056] In an embodiment, the behavioral module (318) can incorporate adaptive modules that adjust deterrent parameters based on observed effectiveness and environmental conditions. The module (318) can maintain historical response data to optimize deterrent strategies over time. The behavioral module (318) may implement machine learning techniques to identify patterns in animal behavior and optimize deterrent timing and intensity.
[0057] In an embodiment of the present disclosure, the deterrent module (316) can generate one or more control signals to actuate selected combinations of the plurality of deterrent actuators (206) based on determined actuator parameters, where the actuated deterrents may produce stimuli including acoustic signals at species-specific frequencies, light patterns at predetermined wavelengths, mechanical movements, and chemical substances. The deterrent module (316) can coordinate simultaneous activation of multiple deterrent modalities for enhanced effectiveness.
[0058] In an exemplary embodiment, the deterrent module (316) can coordinate simultaneous activation of multiple deterrent actuators (206) to create comprehensive deterrent responses. The module (316) can incorporate timing controls and sequencing modules to optimize deterrent effectiveness while minimizing habituation effects. The deterrent module (316) may implement randomization modules to vary deterrent patterns and prevent animal adaptation.
[0059] In an embodiment, a communication module (320) can transmit detection data including an identified category and timestamp to at least one user equipment (108-1, 108-2, 108-N) via a wireless network module (104). The communication module (320) can package detection data including identified categories, confidence scores, and timestamps for transmission to user equipment devices. The communication module (320) may implement data compression techniques to optimize bandwidth utilization in wireless transmission.
[0060] In an embodiment of the present disclosure, the control unit (212) can execute interrupt service routines to write detection events including species classification and sensor data to non-volatile memory addresses with 32-bit timestamps. The system can increment counter registers corresponding to written species classification, where each counter register may be linked to specific memory addresses. The control unit (212) can implement real-time operating system capabilities for deterministic response timing.
[0061] In an exemplary embodiment, when incremented counter registers reach predetermined thresholds, the system can compile detection events from memory addresses into data packets of 1024 bytes. The compiled data packets can be transmitted through TCP port 8080 via the communication module (320) to a centralized server (110) for data aggregation and analysis. The system may implement data integrity checking to ensure reliable transmission of detection data.
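The counter-then-packetize behaviour of paragraphs [0060]-[0061] can be sketched as follows. The 1024-byte packet size is from the specification; the JSON-plus-zero-padding wire format and the class names are assumptions for illustration.

```python
import json

PACKET_SIZE = 1024  # fixed payload size from paragraph [0061]

def compile_packet(events):
    """Pack a batch of detection events into a fixed 1024-byte payload.
    The serialization (JSON padded with zero bytes) is an assumed
    format, not one stated in the specification."""
    payload = json.dumps(events).encode("utf-8")
    if len(payload) > PACKET_SIZE:
        raise ValueError("event batch exceeds packet size")
    return payload.ljust(PACKET_SIZE, b"\x00")

class DetectionLog:
    """Accumulate detection events and emit a packet once the event
    counter reaches a threshold, mirroring the counter-register
    behaviour described in paragraphs [0060]-[0061]."""

    def __init__(self, threshold=4):
        self.threshold = threshold
        self.events = []

    def record(self, species, timestamp):
        self.events.append({"species": species, "ts": timestamp})
        if len(self.events) >= self.threshold:
            packet = compile_packet(self.events)
            self.events = []  # counter resets after compilation
            return packet
        return None  # threshold not yet reached
```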
[0062] In an embodiment, the system can implement a cloud-based retraining process wherein the control unit (212) stores captured field images with detection results in non-volatile memory and transmits them to the centralized server (110). Users can provide feedback through the web dashboard (112) by tagging detections as correct or incorrect. The system can receive this user feedback data through TCP port 8080 and may store it in a feedback counter register. When the feedback counter register reaches 1000 iterations, the centralized server (110) can retrain the deep learning model using the accumulated field data and feedback, accessing stored detection events and corresponding feedback data from non-volatile memory to calculate error gradients and update network weights. The updated neural network parameters can then be transmitted back to the control unit (212), improving detection accuracy and ensuring greater alignment with real-world conditions over time. The system may implement incremental learning techniques to continuously improve classification accuracy.
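The feedback-counter trigger can be sketched as a small accumulator. The 1000-iteration threshold comes from the specification; the class shape and the decision to reset the counter after each retraining cycle are assumptions.

```python
FEEDBACK_THRESHOLD = 1000  # retraining trigger from paragraph [0062]

class FeedbackAggregator:
    """Count user feedback tags and signal when the server should kick
    off a retraining cycle; the retraining itself is outside this
    sketch."""

    def __init__(self):
        self.count = 0

    def add(self, is_correct: bool) -> bool:
        """Record one correct/incorrect tag; return True exactly when
        the accumulated count reaches the retraining threshold."""
        self.count += 1
        if self.count >= FEEDBACK_THRESHOLD:
            self.count = 0  # reset after triggering (assumed behaviour)
            return True
        return False
```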
[0063] In an exemplary embodiment of the cloud-based retraining process, the control unit (212) can maintain a local cache of captured field images along with their initial classification results. The communication module (320) can periodically synchronize this cache with the centralized server (110) during low-traffic periods. The web dashboard (112) can present these field images to authorized users for verification, allowing them to tag detections as correct or incorrect. This crowd-sourced feedback mechanism can enhance the quality of training data and enable the neural network model to adapt to regional wildlife variations and seasonal behavioral changes.
[0064] In an embodiment of the present disclosure, the wildlife deterrent system (102) can further include at least one designated zone positioned 50-100 meters from a protected area boundary. The designated zone can include at least one motion sensor (such as HC-SR04) capable of transmitting detection pulses to detect wildlife presence and communicating detection data to the control unit (212) through a secondary wireless link. The designated zone may provide a humane alternative habitat for deterred wildlife.
[0065] In an exemplary embodiment, the designated zone can include at least one water reservoir with float valve mechanism activated by the control unit (212) when an HC-SR04 sensor detects wildlife entry into the designated zone. The water reservoir can provide controlled water access for deterred animals, supporting humane wildlife management practices. The water reservoir may include filtration systems to maintain water quality and prevent contamination.
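The HC-SR04 readings referenced in the two paragraphs above follow standard ultrasonic time-of-flight arithmetic, sketched below for illustration. The function names are assumptions; the 30 cm to 7.5 m range is the one cited for the motion sensor in claim 2.

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 degrees Celsius

def echo_to_distance_m(echo_pulse_s):
    """Convert an HC-SR04 echo pulse width (seconds) to distance (metres).

    The sensor emits a 40 kHz burst; the echo pin stays high for the
    round-trip time, so the one-way distance is half of time x speed.
    """
    return (echo_pulse_s * SPEED_OF_SOUND_M_S) / 2.0

def in_detection_range(echo_pulse_s, min_m=0.3, max_m=7.5):
    """Check a reading against the 30 cm - 7.5 m range cited in claim 2."""
    return min_m <= echo_to_distance_m(echo_pulse_s) <= max_m
```

A 10 ms echo pulse, for example, corresponds to about 1.7 m, well inside the detection window, and could trigger the float valve activation described above.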
[0066] In an embodiment of the present disclosure, the wildlife deterrent system (102) can be deployed as part of a comprehensive agroforestry-supported implementation for long-term impact. The designated recovery zones can be enhanced through environmental regeneration efforts including sowing or planting native crops that provide sustainable food sources for the wildlife species identified by the classification module (314). The system can utilize data from the environmental sensors (204-3) and historical detection patterns to recommend optimal planting schedules and crop varieties through the web dashboard (112). This comprehensive approach not only strengthens farm protection but also mitigates human-wildlife conflict by promoting peaceful and ecologically sustainable coexistence between agricultural activities and wildlife populations.
[0067] In an embodiment, the designated zone can include at least one audio transducer connected to the control unit (212) and positioned to emit species-specific recorded vocalizations corresponding to species identified by the classification module (314). For example, when an elephant is detected, the audio transducer emits recordings of calm female elephant vocalizations at 10-30 Hz, designed to reduce stress and encourage natural movement toward the designated zone. The control unit (212) can activate the audio transducer simultaneously with deterrent actuators (206) to guide deterred wildlife toward the designated zone, where food and water resources are provided. The audio transducer may incorporate directional sound projection to guide animals toward safe areas.
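The species-specific playback selection described above can be sketched as a lookup table. The frequency bands follow the disclosure (10-30 Hz infrasonic playback for elephants), but the table entries, file names, and field names are illustrative assumptions, not claimed specifics.

```python
# Hypothetical species-to-response table; only the elephant band is
# taken from the disclosure, all other entries are assumptions.
SPECIES_RESPONSES = {
    "elephant":  {"vocalization": "calm_female_group.wav", "band_hz": (10, 30)},
    "wild_boar": {"vocalization": "boar_alarm.wav",        "band_hz": (1000, 8000)},
}

def select_response(species):
    """Return playback parameters for an identified species, falling back
    to a conservative audible-band default for unlisted species."""
    return SPECIES_RESPONSES.get(
        species, {"vocalization": "generic_alert.wav", "band_hz": (1000, 8000)}
    )
```

Keeping the mapping in a table rather than in code lets the centralized server (110) push regional updates alongside retrained model parameters.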
[0068] FIG. 4 illustrates a flow diagram depicting a method for deterring wildlife from designated areas, in accordance with an embodiment of the present disclosure.
[0069] In an embodiment of the present disclosure, referring to FIG. 4, a method for deterring wildlife from designated areas can be performed by a wildlife deterrent system (102) including at least one weatherproof enclosure (202), a plurality of sensors (204), a plurality of deterrent actuators (206), and a control unit (212). The method can provide automated wildlife detection and species-specific deterrent activation for agricultural protection applications.
[0070] In an exemplary embodiment, the method can include detecting (402) motion through a motion sensor (204-1) within a predetermined detection range. The motion sensor (204-1) can continuously monitor a detection area and may generate detection signals when motion patterns indicate potential wildlife presence. The detection process can operate continuously to provide real-time monitoring capabilities.
[0071] In an embodiment, the method can include sending (404) detection signals from the motion sensor (204-1) to the control unit (212) for processing and analysis. The control unit (212) can evaluate detection signals to determine if activation of additional sensors is required. The signal processing can implement noise filtering to reduce false triggers from environmental factors.
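The noise filtering mentioned above can be as simple as requiring several consecutive high samples before reporting a detection. The sketch below is a minimal illustration; the window length of three samples is an assumed tuning parameter, not a disclosed value.

```python
def debounce(signals, required=3):
    """Suppress isolated triggers: report a detection only after
    `required` consecutive high samples from the motion sensor,
    reducing false triggers from wind, rain, or electrical noise."""
    run = 0
    detections = []
    for s in signals:
        run = run + 1 if s else 0
        detections.append(run >= required)
    return detections
```

A single spurious pulse therefore never reaches the camera activation step, while a sustained presence does.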
[0072] In an embodiment of the present disclosure, the method can include activating a camera module (204-2) through the control unit (212) upon receiving detection signals, proceeding to capturing (406) images through an activated camera module (204-2) and transferring to memory (304) for analysis. The image capture process can optimize exposure settings based on ambient lighting conditions to ensure high-quality images for species identification.
[0073] In an exemplary embodiment, the method can include processing (408) images sequentially through a detection module (312) to identify object presence, then through a classification module (314) to classify into species categories based on extracted visual features. The processing pipeline can implement parallel processing techniques to reduce latency and may provide real-time species identification capabilities.
[0074] In an embodiment, the method can include querying (410) a behavioral module (318) with classification results to retrieve species-specific parameters from a database (310). The behavioral module (318) can access stored behavioral data corresponding to identified species. The parameter retrieval process can implement caching mechanisms to optimize response speed for frequently detected species.
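The caching mechanism suggested above for frequently detected species can be sketched as a read-through cache in front of the database (310). The class name, the dictionary-backed store, and the hit counter are illustrative assumptions; only the caching pattern itself reflects the method step.

```python
class BehavioralModule:
    """Sketch of species-parameter lookup with a simple read-through cache."""

    def __init__(self, database):
        self._db = database     # stands in for the database (310)
        self._cache = {}
        self.db_hits = 0        # counts actual database reads, for illustration

    def query(self, species):
        """Return species-specific deterrent parameters, reading the
        database only on the first request for each species."""
        if species not in self._cache:
            self.db_hits += 1
            self._cache[species] = self._db.get(species)
        return self._cache[species]
```

Repeat detections of the same species, the common case at a fixed installation, then cost a dictionary lookup rather than a database read.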
[0075] In an embodiment of the present disclosure, the method can include transferring (412) parameters to a deterrent module (316) to determine actuator combinations and response intensities appropriate for detected species. The deterrent module (316) can generate control signals for simultaneous activation of selected deterrent actuators (206). The parameter transfer process can implement validation checks to ensure appropriate deterrent selection.
[0076] In an exemplary embodiment, the method can include generating (414) control signals through the deterrent module (316) and transmitting via interfaces (306) to simultaneously activate a multimodal sound unit (206-1) for acoustic deterrence, an LED light array (206-2) for visual deterrence, a mechanical shaker (206-3) for motion deterrence, and an olfactory deterrent (206-4) for chemical deterrence. The control signal generation can implement precise timing coordination to optimize deterrent effectiveness.
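The coordinated control-signal generation described above can be sketched as assembling one command per actuator with a shared duration, so all four channels fire together. Field names and most values are illustrative assumptions; the PWM frequency/duty-cycle variation echoes claim 4 and the 0.5-1 Hz shaker rate echoes claim 6.

```python
def build_control_signals(band_hz, duty_cycle=0.5, duration_s=5.0):
    """Assemble one coordinated command per deterrent actuator.

    `band_hz` is the species-specific acoustic band selected by the
    behavioral module; the shared `t` field keeps the four channels
    synchronized, per the simultaneous activation the method describes.
    """
    return {
        "sound":     {"freq_hz": band_hz, "duty": duty_cycle, "t": duration_s},
        "light":     {"pattern": "strobe", "t": duration_s},
        "shaker":    {"rate_hz": 0.75, "t": duration_s},  # within claim 6's 0.5-1 Hz
        "olfactory": {"release": True, "t": duration_s},
    }
```

A dispatcher would then write each command to its interface (306) in one pass, preserving the timing coordination the method calls for.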
[0077] In an embodiment, the method can include packaging (416) detection data in a communication module (320) and transmitting through a wireless network module (104) across a network (104) to user equipment (108-1, 108-2, 108-N), providing real-time notification of wildlife detection events. The data packaging process can implement compression modules to optimize transmission efficiency.
[0078] In an embodiment of the present disclosure, the method can include receiving (418) feedback from user equipment through the same network path back to the communication module (320), and updating (420) neural network parameters in the classification module (314) based on received feedback, thereby improving classification accuracy over time. The feedback processing can implement incremental learning techniques to continuously enhance system performance.
[0079] In an exemplary embodiment, the wildlife deterrent system (102) incorporates advanced power management through a power input (210) supporting multiple power sources, including AC mains, battery systems, and solar panels, with intelligent switching to ensure continuous operation. A status indicator (216) provides visual feedback on system operation status, power levels, and component functionality through LED displays and audible alerts, enabling efficient monitoring and maintenance with remote capabilities for centralized system management. The system's design supports integration with comprehensive agroforestry practices, enabling deployment as part of broader environmental regeneration initiatives that create sustainable ecosystems for long-term human-wildlife coexistence.

ADVANTAGES OF THE PRESENT DISCLOSURE
[0080] An advantage of the present disclosure is that the wildlife deterrent system can achieve species-specific deterrence through neural network-based classification, enabling precise identification of detected objects and activation of appropriate deterrent responses based on identified species characteristics, thereby providing more effective deterrence compared to conventional non-specific methods.
[0081] Another advantage of the present disclosure is that the wildlife deterrent system can maintain continuous protection effectiveness through cloud-based retraining processes with user feedback mechanisms and remote monitoring capabilities, enabling scalable deployment across agricultural areas with real-time notification and automated parameter updates.
Claims:
1. A wildlife deterrent system (102) for protecting designated areas from animal intrusion, the system comprising:
at least one weatherproof enclosure (202) positioned to house a plurality of sensors (204) comprising at least one motion sensor and at least one image capture device;
a wireless network module (104) positioned to establish communication with external networks and electrically connected to the control unit (212);
a communication module (320) positioned to coordinate data transmission between system components and electrically connected to the control unit (212) and the wireless network module (104);
at least one motion sensor (204-1) positioned to detect motion within a predetermined range and electrically connected to the control unit (212) through a first interface (306);
at least one camera module (204-2) positioned to capture images of detected objects upon receiving an activation signal and communicatively connected to the control unit (212);
at least one environmental sensor (204-3) positioned to monitor ambient conditions and communicatively connected to the control unit (212) through a third interface (306);
a plurality of deterrent actuators (206) comprising at least one multimodal sound unit (206-1) to generate sound waves at multiple frequency ranges, at least one LED light array (206-2) to produce visual deterrence, at least one mechanical shaker (206-3) to create motion-based deterrence, and at least one olfactory deterrent (206-4) to release deterrent substances, wherein each deterrent actuator is electrically connected to and controlled by the control unit (212);
the control unit (212) comprising a processor (302), a memory (304) coupled to the processor (302), wherein the memory (304) stores a trained neural network model and instructions which, when executed by the processor (302), cause the control unit (212) to:
receive detection signals from the at least one motion sensor (204-1) through the first interface (306) when motion is detected within the predetermined range;
activate the at least one camera module (204-2) in response to the detection signals;
process captured images through a detection module (312) and a classification module (314) to identify and classify detected objects into predetermined categories based on extracted visual features;
determine actuator parameters through a behavioral module (318) and a deterrent module (316) corresponding to the identified category by accessing a stored mapping between categories and deterrent responses;
generate one or more control signals to actuate selected combinations of the plurality of deterrent actuators (206) based on the determined actuator parameters; and
transmit detection data comprising the identified category and timestamp to at least one user equipment (108-1, 108-2, 108-N) via a wireless network module (104) through a communication module (320).
2. The wildlife deterrent system (102) as claimed in claim 1, wherein the at least one motion sensor (204-1) is configured to detect motion using technologies including ultrasonic wave transmission at 40 kHz frequency with detection range of 30 centimeters to 7.5 meters, or HC-SR04 sensor modules, wherein the control unit (212) processes detection signals from the motion sensor (204-1) through the first interface (306).
3. The wildlife deterrent system (102) as claimed in claim 1, wherein the trained neural network model comprises convolutional layers with 11,141,405 parameters executing forward propagation at 5.536 milliseconds per image, and wherein the classification module (314) generates confidence scores above 0.99 for detected objects.
4. The wildlife deterrent system (102) as claimed in claim 1, wherein the multimodal sound unit (206-1) comprises at least one piezoelectric transducer receiving PWM signals from the control unit (212) through an interface (306), wherein the PWM signals vary in frequency and duty cycle based on the species classification to generate:
mechanical vibrations at 20-45 kHz through voltage-induced deformation of the piezoelectric element when the deterrent module (316) selects ultrasonic deterrence for small mammals;
pressure waves at 1-20 Hz through controlled oscillation of the transducer assembly when the deterrent module (316) selects infrasonic deterrence for large mammals; and
acoustic emissions at 1-8 kHz through diaphragm displacement when the deterrent module (316) selects audible deterrence, wherein the control unit (212) switches between these frequency ranges based on the behavioral parameters retrieved from the behavioral module (318).
5. The wildlife deterrent system (102) as claimed in claim 1, wherein the processor (302) executes 1.5 billion instructions per second through four processing cores, and wherein the memory (304) transfers data at 3200 MT/s through a 64-bit bus interface.
6. The wildlife deterrent system (102) as claimed in claim 1, wherein the mechanical shaker (206-3) comprises a brushed DC motor drawing 2-5 amperes at 12 volts to rotate a shaft-mounted element through 360-degree cycles at frequencies between 0.5 Hz to 1 Hz.
7. The wildlife deterrent system (102) as claimed in claim 1, further comprising at least one designated zone positioned 50-100 meters from the protected area boundary, wherein the zone comprises:
at least one HC-SR04 motion sensor transmitting 40 kHz pulses to detect wildlife presence and communicating detection data to the control unit (212) through a secondary wireless link;
at least one water reservoir with float valve mechanism activated by the control unit (212) when the HC-SR04 sensor detects wildlife entry into the designated zone; and
at least one audio transducer connected to the control unit (212) and positioned to emit species-specific recorded vocalizations corresponding to the species identified by the classification module (314), wherein for elephant detection, the audio transducer emits recordings of calm female elephant vocalizations to reduce stress and encourage movement toward the designated zone, wherein the control unit (212) activates the audio transducer simultaneously with the deterrent actuators (206) to guide deterred wildlife toward the designated zone.
8. The wildlife deterrent system (102) as claimed in claim 1, wherein the behavioral module (318) accesses lookup tables containing deterrent parameters for 15 distinct taxonomic classifications, wherein each classification links to frequency values, pulse durations, and actuator sequencing data stored as hexadecimal values.
9. The wildlife deterrent system (102) as claimed in claim 1, wherein the control unit (212) executes interrupt service routines to:
write detection events comprising species classification and sensor data to non-volatile memory addresses with 32-bit timestamps;
increment counter registers corresponding to the written species classification, wherein each counter register is linked to specific memory addresses;
compile detection events from the memory addresses into data packets of 1024 bytes when the incremented counter registers reach predetermined thresholds;
transmit the compiled data packets through TCP port 8080 via the communication module (320) to a centralized server (110);
receive user feedback data through the same TCP port and store in a feedback counter register;
execute backpropagation modules on the neural network model when the feedback counter register reaches 1000 iterations, wherein the modules access the stored detection events and corresponding feedback data from the non-volatile memory to calculate error gradients and update network weights,
wherein the backpropagation modules are executed as part of a cloud-based retraining process, enabling the centralized server (110) to retrain the deep learning model using accumulated field images and user feedback tags, and subsequently download updated model parameters to improve real-world detection accuracy.
10. A method for deterring wildlife from designated areas, the method performed by a wildlife deterrent system (102) comprising at least one weatherproof enclosure (202), a plurality of sensors (204), a plurality of deterrent actuators (206), and a control unit (212), the method comprising:
detecting (402) motion through the motion sensor (204-1);
sending (404) detection signals from the motion sensor (204-1) to the control unit (212);
activating the camera module (204-2) through the control unit (212) upon receiving the detection signals;
capturing (406) images through the activated camera module (204-2) and transferring to the memory (304);
processing (408) the images sequentially through the detection module (312) to identify object presence, then through the classification module (314) to classify into species categories;
querying (410) the behavioral module (318) with the classification result to retrieve species-specific parameters from the database (310);
transferring (412) the parameters to the deterrent module (316) to determine actuator combinations;
generating (414) control signals through the deterrent module (316) and transmitting via the interface (306) to simultaneously activate the multimodal sound unit (206-1) for acoustic deterrence, the LED light array (206-2) for visual deterrence, the mechanical shaker (206-3) for motion deterrence, and the olfactory deterrent (206-4) for chemical deterrence;
packaging (416) detection data in the communication module (320) and transmitting through the wireless network module (104) across the network (104) to user equipment (108-1, 108-2, 108-N);
receiving (418) feedback from the user equipment through the same network path back to the communication module (320); and
updating (420) neural network parameters in the classification module (314) based on the received feedback.

Documents

Application Documents

# Name Date
1 202541076803-STATEMENT OF UNDERTAKING (FORM 3) [12-08-2025(online)].pdf 2025-08-12
2 202541076803-REQUEST FOR EXAMINATION (FORM-18) [12-08-2025(online)].pdf 2025-08-12
3 202541076803-REQUEST FOR EARLY PUBLICATION(FORM-9) [12-08-2025(online)].pdf 2025-08-12
4 202541076803-FORM-9 [12-08-2025(online)].pdf 2025-08-12
5 202541076803-FORM FOR SMALL ENTITY(FORM-28) [12-08-2025(online)].pdf 2025-08-12
6 202541076803-FORM 18 [12-08-2025(online)].pdf 2025-08-12
7 202541076803-FORM 1 [12-08-2025(online)].pdf 2025-08-12
8 202541076803-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [12-08-2025(online)].pdf 2025-08-12
9 202541076803-EVIDENCE FOR REGISTRATION UNDER SSI [12-08-2025(online)].pdf 2025-08-12
10 202541076803-EDUCATIONAL INSTITUTION(S) [12-08-2025(online)].pdf 2025-08-12
11 202541076803-DRAWINGS [12-08-2025(online)].pdf 2025-08-12
12 202541076803-DECLARATION OF INVENTORSHIP (FORM 5) [12-08-2025(online)].pdf 2025-08-12
13 202541076803-COMPLETE SPECIFICATION [12-08-2025(online)].pdf 2025-08-12
14 202541076803-Proof of Right [10-11-2025(online)].pdf 2025-11-10
15 202541076803-FORM-26 [10-11-2025(online)].pdf 2025-11-10